WO2023231893A1 - Cursor display method and electronic device - Google Patents

Cursor display method and electronic device

Info

Publication number
WO2023231893A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen, electronic device, cursor, user, mouse
Application number
PCT/CN2023/096281
Other languages
English (en)
Chinese (zh)
Inventor
卢跃东
孟德泉
李荣根
周星辰
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2023231893A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop
    • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units

Definitions

  • Embodiments of the present application relate to the field of electronic devices and, in particular, to a cursor display method and an electronic device.
  • Embodiments of the present application provide a cursor display method and an electronic device.
  • The method enables direct, convenient, and fast cursor-based data interaction across different electronic devices, thereby improving the user experience.
  • A cursor display method is provided. The method is applied to a first electronic device, and the first electronic device is connected to a second electronic device via screen projection.
  • The method includes: the first electronic device displays a cursor on a first screen, the first screen being the screen of the first electronic device; in response to receiving a first operation from the user, the first electronic device controls the cursor to move from the first screen to a second screen.
  • The second screen is the screen of the second electronic device, and the first screen and the second screen are different logical screens.
  • the first electronic device and the second electronic device are connected through screen projection.
  • The first electronic device functions much like a host for the second electronic device.
  • The cursor is the on-screen representation of the input device.
  • the first electronic device controls the cursor to move from the first screen to the second screen. It can also be understood that when the cursor is displayed on the second screen, the cursor disappears from the first screen, thereby creating a visual effect of the cursor moving from the first screen to the second screen.
  • the first operation can be used to express the user's intention.
  • the first electronic device determines the target screen (second screen) to which the cursor is to move according to the user's intention, and displays the cursor on the target screen.
  • The first electronic device determines the target screen for cursor movement by sensing and determining the user's intention, and displays the cursor on the target screen, realizing free movement of the cursor across the screens of multiple electronic devices.
  • This method can improve the user's office efficiency in a one-core multi-screen scenario and enhance the user's experience.
  • Before the first electronic device controls the cursor to move from the first screen to the second screen, the method further includes: the first electronic device determines the identifier of the second screen according to the first operation, the identifier being different from that of the first screen; the first electronic device then determines that the second screen is the target screen according to the identifier of the second screen.
  • The first electronic device can thus determine the identifier of the target screen for cursor movement according to the user's intention, and then determine the target screen itself.
  • The target screen (the second screen) and the original screen (the first screen) are different logical screens, which makes it convenient for the first electronic device to control the subsequent operation of moving the cursor to the target screen.
  • The first electronic device controlling the cursor to move from the first screen to the second screen includes: the first electronic device sends an input event to the second screen, where the input event includes at least one of the following: a mouse movement event, a mouse click event, or a drag event.
  • The first electronic device may send the input event to the target screen so that the user can perform subsequent operations on the target screen.
  • The first electronic device sending the input event to the second screen can also be understood as the first electronic device sending the input event to the second electronic device.
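The event forwarding described above can be sketched in Python. The event kinds mirror those named in the claim (mouse movement, mouse click, drag); every class, method, and variable name here is an illustrative assumption, not part of the patent:

```python
from dataclasses import dataclass

# Event kinds named in the claim: mouse movement, mouse click, and drag.
MOUSE_MOVE, MOUSE_CLICK, DRAG = "mouse_move", "mouse_click", "drag"

@dataclass
class InputEvent:
    kind: str
    x: int
    y: int

class ScreenRouter:
    """Forwards input events to whichever logical screen currently owns the cursor."""

    def __init__(self):
        self.queues = {}          # screen id -> events delivered to that screen
        self.active_screen = None

    def register_screen(self, screen_id):
        self.queues[screen_id] = []

    def set_active(self, screen_id):
        # Called once the first operation has identified the target screen.
        self.active_screen = screen_id

    def send(self, event):
        # The first device sends the event to the target (second) screen.
        self.queues[self.active_screen].append(event)
```

A real implementation would forward events over the screen-projection channel rather than appending to in-memory queues; the queue stands in for that channel here.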
  • The first electronic device receiving the user's first operation includes: the first electronic device receives the user moving the cursor along a preset path on the first screen; or the first electronic device receives the user's tap on the first screen or the second screen; or the first electronic device receives the user's input through the system settings of the first electronic device.
  • the user's first operation includes but is not limited to the above implementation.
  • the specific method of the first operation can be preset according to the actual situation.
  • For example, triggering the movement with a gesture makes the cursor movement process more engaging, while input through system settings lets the cursor be moved to the target screen more accurately.
  • For example, the user can tap on the first screen or the second screen with their knuckles; the specific tapping method is not limited in this application.
  • The first electronic device receiving that the user moves the cursor along a preset path on the first screen includes: the first electronic device receives that the user moves the cursor on the first screen to the edge of the first screen and continues to move; or the first electronic device receives that the user shakes the cursor in a preset shape on the first screen.
  • the preset path may be a moving path of the cursor or a shaking path of the cursor.
  • The preset path may be a path with a preset shape such as "C" or "S", which is not limited in this application.
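One illustrative way to recognize a "shake" trigger is to count direction reversals in the sampled cursor path. The function name, the reversal threshold, and the horizontal-only heuristic below are assumptions of this sketch, not values from the patent:

```python
def is_shake(path, min_reversals=3):
    """Heuristically detect a cursor 'shake': the horizontal direction of
    motion reverses at least `min_reversals` times along the sampled path.
    `path` is a list of (x, y) cursor samples."""
    reversals, last_sign = 0, 0
    for (x0, _), (x1, _) in zip(path, path[1:]):
        dx = x1 - x0
        if dx == 0:
            continue  # ignore purely vertical or stationary samples
        sign = 1 if dx > 0 else -1
        if last_sign and sign != last_sign:
            reversals += 1
        last_sign = sign
    return reversals >= min_reversals
```

Matching a preset shape such as "C" or "S" would need a richer gesture recognizer; reversal counting only covers the shake case.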
  • The first electronic device receiving that the user moves the cursor on the first screen to an edge of the first screen and continues to move includes: the first electronic device determines the positional relationship between the first electronic device and the second electronic device; the first electronic device then responds, according to the positional relationship, to receiving that the user moves the cursor to the edge of the first screen and keeps moving.
  • When the user's input is moving the cursor to the edge of the first screen and continuing to move, before the cursor is displayed on the screen of the second electronic device, the first electronic device also needs to determine the positional relationship between the first electronic device and the second electronic device, that is, the positional relationship between the first screen and the second screen.
  • other electronic devices in the same cursor system as the first electronic device can also determine the positional relationship between the electronic devices.
  • the second electronic device can also determine the positional relationship between the first electronic device and the second electronic device.
  • When the second screen is located on the right side of the first screen, the first electronic device, in response to receiving that the user moves the cursor on the first screen to the right edge of the first screen and continues moving to the right, displays the cursor on the second screen; when the second screen is located on the left side of the first screen, the first electronic device, in response to receiving that the user moves the cursor on the first screen to the left edge of the first screen and continues moving to the left, displays the cursor on the second screen.
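The edge-crossing decision above can be sketched as follows, where `layout` encodes the determined positional relationship between the screens. The function name, the `layout` dictionary, and the one-pixel edge threshold are illustrative assumptions:

```python
def target_screen_on_edge(cursor_x, screen_width, dx, layout):
    """Decide whether an edge-crossing move hands the cursor to a neighbour.
    `layout` maps 'left'/'right' to a neighbouring screen id (if any);
    `dx` is the latest horizontal movement delta."""
    if cursor_x >= screen_width - 1 and dx > 0:
        return layout.get("right")   # crossed the right edge, still moving right
    if cursor_x <= 0 and dx < 0:
        return layout.get("left")    # crossed the left edge, still moving left
    return None                      # cursor stays on the current screen
```

Returning `None` corresponds to the cursor remaining on the first screen; a non-`None` screen id is the target screen on which the cursor should next be displayed.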
  • The first electronic device determining the positional relationship between the first electronic device and the second electronic device includes: the first electronic device determines the positional relationship in response to receiving a second operation from the user; or the first electronic device senses the position of the second electronic device and determines the positional relationship between the two devices.
  • the second operation may be a user's setting operation.
  • the user's setting operation can be performed in the system setting interface of the first electronic device, or it can be the factory setting of the first electronic device. This application does not limit the specific setting operation of the positional relationship between the first electronic device and the second electronic device.
  • The first electronic device can sense the location of the second electronic device by itself, or, after it has sensed the location, the user can further adjust the positional relationship.
  • the method further includes: the first electronic device determining the position of the cursor on the second screen in response to receiving a first operation from the user.
  • Before displaying the cursor on the screen of the second electronic device, the first electronic device also needs to determine the specific position of the cursor on the second screen.
  • The first electronic device determining the position of the cursor on the second screen in response to receiving the user's first operation includes: the first electronic device, in response to receiving the user's first operation, determines that the position of the cursor on the second screen corresponds to its position on the first screen; or the first electronic device, in response to receiving the user's first operation, determines that the cursor is at a default position on the second screen.
  • That is, the first electronic device determines either that the position of the cursor on the second screen corresponds to its position on the first screen, or that the cursor is at a default position on the second screen. In this way, when the cursor moves to the corresponding position or the default position on the second screen, the user can directly operate on objects such as icons, pictures, or text at that position without first moving the cursor there, which is more convenient for the user.
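The two placement policies (corresponding position versus default position) can be sketched as a coordinate mapping. Proportional scaling is one plausible reading of "corresponding position" when the two screens have different resolutions; that reading, and all names below, are assumptions of this sketch:

```python
def map_cursor(pos, src_size, dst_size, default=None):
    """Place the cursor on the target screen.

    If `default` is given, use that fixed position (the 'default position'
    policy); otherwise scale `pos` proportionally from the source screen's
    resolution to the destination screen's (the 'corresponding position'
    policy)."""
    if default is not None:
        return default
    x, y = pos
    src_w, src_h = src_size
    dst_w, dst_h = dst_size
    return (round(x * dst_w / src_w), round(y * dst_h / src_h))
```

For example, the centre of a 1920x1080 first screen maps to the centre of a 3840x2160 second screen under the proportional policy.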
  • A drag object is displayed on the first screen, and the method further includes: the first electronic device, in response to receiving the user's third operation on the drag object, sends the input event to the second screen, so that the drag object dragged from the first screen is displayed on the second screen.
  • the third operation is the user's drag operation on the drag object.
  • While the cursor of the first electronic device moves, the drag object can be moved along with it. This method realizes the movement of drag objects between multiple screens, further improves the efficiency of using multiple electronic devices collaboratively, and improves the user experience.
  • The method further includes: the first electronic device, in response to receiving the user's fourth operation on the drag object on the second screen, sends the input event to the second screen, so that the movement effect of the drag object is displayed on the second screen.
  • the fourth operation is the user's movement operation on the drag object.
  • the drag object includes at least one of the following: a file, an application icon, a text, a picture, or a window.
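The drag hand-off described above can be sketched as a session whose payload travels with the cursor across logical screens, so the object is not tied to the screen where the drag began. The class, its fields, and the sample payload are illustrative assumptions:

```python
class DragSession:
    """Minimal sketch of a drag that survives a screen hand-off: the payload
    (a file, application icon, text, picture, or window reference) follows
    the cursor rather than staying bound to one logical screen."""

    def __init__(self, payload):
        self.payload = payload
        self.screen = None   # logical screen currently showing the object
        self.pos = None      # last cursor position on that screen

    def update(self, screen_id, pos):
        # Each forwarded drag event re-anchors the object on whichever
        # logical screen currently owns the cursor.
        self.screen = screen_id
        self.pos = pos

    def drop(self):
        # Releasing the drag yields where and what was dropped.
        return (self.screen, self.pos, self.payload)
```

A drag that starts on the first screen and ends on the second thus drops its payload on the second screen, matching the cross-screen drag behaviour described above.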
  • The first electronic device and the second electronic device being connected via screen projection includes: the first electronic device and the second electronic device establish the screen-projection connection through at least one of the following: Bluetooth, a wired connection, or the wireless communication technology Wi-Fi.
  • A cursor display system is provided, characterized in that it includes a first electronic device and a second electronic device, the first electronic device being connected to the second electronic device via screen projection. The first electronic device is configured to display a cursor on a first screen, the first screen being the screen of the first electronic device, and, in response to receiving a first operation from the user, to control the cursor to move from the first screen to a second screen, the second screen being the screen of the second electronic device; the first screen and the second screen are different logical screens. The second electronic device is configured to display the cursor on the second screen.
  • Before the first electronic device controls the cursor to move from the first screen to the second screen, the first electronic device is further configured to determine, according to the first operation, the identifier of the second screen, which is different from the identifier of the first screen, and to determine the second screen as the target screen according to the identifier of the second screen.
  • The first electronic device controlling the cursor to move from the first screen to the second screen includes: the first electronic device sends an input event to the second screen, where the input event includes at least one of the following: a mouse movement event, a mouse click event, or a drag event.
  • The first electronic device receiving the user's first operation includes: the first electronic device receives the user moving the cursor along a preset path on the first screen; or the first electronic device receives the user's tap on the first screen or the second screen; or the first electronic device receives the user's input through the system settings of the first electronic device.
  • The first electronic device receiving that the user moves the cursor along a preset path on the first screen includes: the first electronic device receives that the user moves the cursor on the first screen to the edge of the first screen and continues to move; or the first electronic device receives that the user shakes the cursor in a preset shape on the first screen.
  • When the first electronic device receives that the user moves the cursor on the first screen to an edge of the first screen and continues to move, the first electronic device and/or the second electronic device is further configured to determine the positional relationship between the first electronic device and the second electronic device; the first electronic device then responds, according to the positional relationship, to receiving that the user moves the cursor to the edge of the first screen and continues moving.
  • The first electronic device is further configured to determine the position of the cursor on the second screen in response to receiving the user's first operation.
  • The first electronic device determining the position of the cursor on the second screen in response to receiving the user's first operation includes: the first electronic device is further configured to determine, in response to receiving the user's first operation, that the position of the cursor on the second screen corresponds to its position on the first screen; or the first electronic device is further configured to determine, in response to receiving the user's first operation, that the cursor is at a default position on the second screen.
  • A drag object is displayed on the first screen; the first electronic device is further configured to send the input event to the second screen in response to receiving the user's third operation on the drag object; the second electronic device is further configured to display, on the second screen, the drag object dragged from the first screen.
  • The first electronic device is further configured to send the input event to the second screen in response to receiving the user's fourth operation on the drag object; the second electronic device is further configured to display the movement effect of the drag object on the second screen.
  • the drag object includes at least one of the following: a file, an application icon, a text, a picture, or a window.
  • The first electronic device and the second electronic device being connected via screen projection includes: the first electronic device and the second electronic device establish the screen-projection connection through at least one of the following: Bluetooth, a wired connection, or the wireless communication technology Wi-Fi.
  • In a third aspect, an electronic device is provided, including modules/units that perform the method of the first aspect or any possible design of the first aspect; these modules/units can be implemented by hardware, or by hardware executing corresponding software.
  • In a fourth aspect, a computer-readable storage medium is provided, characterized in that it includes a computer program or instructions which, when run on a computer, cause the method of the first aspect and any possible implementation thereof to be executed.
  • In a fifth aspect, a computer program product is provided, characterized in that it includes a computer program or instructions which, when run on a computer, cause the method of the first aspect and any possible implementation thereof to be executed.
  • In a sixth aspect, a computer program is provided which, when run on a computer, causes the method of the first aspect and any possible implementation thereof to be executed.
  • Figure 1 is a schematic structural diagram of an electronic device.
  • Figure 2 is a software structure block diagram of an electronic device provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of a connection scenario between a tablet computer and a display provided by an embodiment of the present application.
  • FIG. 4 is a system architecture diagram of a first electronic device provided by an embodiment of the present application.
  • Figure 5 is a set of GUIs provided by an embodiment of the present application.
  • Figure 6 is another set of GUIs provided by an embodiment of the present application.
  • Figure 7 is another set of GUIs provided by an embodiment of the present application.
  • Figure 8 is another set of GUIs provided by an embodiment of the present application.
  • Figure 9 is another set of GUIs provided by an embodiment of the present application.
  • Figure 10 is another set of GUIs provided by an embodiment of the present application.
  • Figure 11 is another set of GUIs provided by an embodiment of the present application.
  • Figure 12 is another set of GUIs provided by an embodiment of the present application.
  • Figure 13 is another set of GUIs provided by an embodiment of the present application.
  • Figure 14 is another set of GUIs provided by an embodiment of the present application.
  • Figure 15 is another set of GUIs provided by an embodiment of the present application.
  • Figure 16 is another set of GUIs provided by an embodiment of the present application.
  • Figure 17 is another set of GUIs provided by an embodiment of the present application.
  • Figure 18 is a schematic flow chart of a cursor display method provided by an embodiment of the present application.
  • Figure 19 is another set of GUIs provided by an embodiment of the present application.
  • Figure 20 is another set of GUIs provided by an embodiment of the present application.
  • Figure 21 is a schematic flow chart of a cursor display method provided by an embodiment of the present application.
  • Figure 22 is a schematic flow chart of a cursor display method provided by an embodiment of the present application.
  • The electronic device may be a portable electronic device that also includes other functions such as a personal digital assistant function and/or a music player function, for example a mobile phone, a tablet computer, or a wearable electronic device with wireless communication functions (such as a smart watch).
  • Portable electronic devices include, but are not limited to, portable electronic devices carrying various operating systems.
  • The above-mentioned portable electronic device may also be another portable electronic device, such as a laptop computer. It should also be understood that in some other embodiments, the above-mentioned electronic device may not be a portable electronic device but a desktop computer.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • The memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • processor 110 may include one or more interfaces.
  • Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing and is connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLed, a MicroLed, a Micro-oLed, a quantum dot light emitting diode (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • light from the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to be converted into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (NN) computing processor.
  • Intelligent cognitive applications of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
  • the internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system and at least one application program required for a function (such as a sound playback function, an image playback function, etc.).
  • the storage data area may store data created during use of the electronic device 100 (such as audio data, phone book, etc.).
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • Speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to hands-free calls.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 100 answers a call or plays a voice message, the voice can be heard by bringing the receiver 170B close to the human ear.
  • Microphone 170C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which in addition to collecting sound signals, may also implement a noise reduction function. In other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • there are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc.
  • a capacitive pressure sensor may include at least two parallel plates of conductive material.
  • the electronic device 100 determines the intensity of the pressure based on the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the alarm clock application icon, an instruction to create an alarm clock is executed.
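The threshold-based dispatch described above can be illustrated with a minimal sketch. This is not part of the patent's disclosure: the threshold value, function name, and instruction names are invented for the example.

```python
# Illustrative sketch: the same touch location triggers different
# instructions depending on touch pressure. The threshold is an
# invented example value (normalized pressure, 0.0-1.0).
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical value

def dispatch_touch(target: str, pressure: float) -> str:
    """Return the instruction to execute for a touch on `target`."""
    if target == "alarm_clock_icon":
        if pressure >= FIRST_PRESSURE_THRESHOLD:
            return "create_alarm"    # hard press: execute shortcut instruction
        return "open_alarm_app"      # light press: default instruction
    return "ignore"
```

A usage example: a firm press on the alarm clock icon yields the "create an alarm" instruction, while a light tap opens the application instead.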
  • Fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to achieve fingerprint unlocking, access to application locks, fingerprint photography, fingerprint answering of incoming calls, etc. For example, when the mobile phone detects the user's touch operation on the lock screen interface, the mobile phone can collect the user's fingerprint information through the fingerprint sensor 180H, and match the collected fingerprint information with the fingerprint information preset in the mobile phone. If the match is successful, the phone can enter the non-lock screen interface from the lock screen interface.
  • Touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194.
  • the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near the touch sensor 180K.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194 .
  • FIG. 2 is a software structure block diagram of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android runtime and system libraries, and kernel layer.
  • the application layer can include a series of application packages.
  • the application layer can include cameras, settings, skin modules, user interface (UI), third-party applications, etc.
  • third-party applications can include gallery, calendar, calls, maps, navigation, WLAN, Bluetooth, music, video, short messages, etc.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer can include some predefined functions.
  • the application framework layer can include a window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications. The data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, etc., such as indication information for prompting a virtual shutter key in the embodiment of the present application.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 .
  • for example, call status management (including connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also present notifications that appear in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications for applications running in the background, or notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a beep sounds, the electronic device vibrates, the indicator light flashes, etc.
  • Android runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one part is the function libraries that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (media libraries), 3D graphics processing libraries (for example: OpenGL ES), 2D graphics engines (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the system library can also include status monitoring service modules, such as a physical status recognition module, which is used to analyze and recognize user gestures, and a sensor service module, which is used to monitor sensor data uploaded by various sensors in the hardware layer and determine the physical state of the electronic device 100.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the hardware layer may include various types of sensors, such as the various types of sensors introduced in Figure 1, acceleration sensors, gyroscope sensors, touch sensors, etc. involved in the embodiment of this application.
  • the physical devices involved in the electronic device 100 mainly include hardware components such as sensors, decision support system (DSS) display chips, touch screens, and fingerprint recognition modules; kernel software layers such as the screen management module, display driver, fingerprint driver, and anti-accidental-touch module; application framework layer functions such as anti-accidental-touch input, screen control, screen-off display (always on display, AOD) service, and power management; and application layer services such as specially adapted applications (camera), third-party applications, system hibernation, and AOD.
  • FIG. 3 a schematic diagram of a screen projection connection scenario between a tablet computer and a monitor is shown.
  • the tablet computer in the figure can also be an electronic device such as a mobile phone or a personal computer.
  • the tablet computer pictured is for illustrative purposes only. The tablet computer and the monitor are in a one-core, multi-screen scenario.
  • the tablet computer is equipped with the Android system; the display is equipped with systems such as Harmony OS.
  • the screen of the tablet computer and the screen of the monitor have different identifiers (display ID).
  • the screens are independent of each other, have different characteristics, and can be considered as different logical screens. This results in objects such as applications, icons, or windows being displayed on separate logical screens.
  • Input events will also be distributed to different logical screens respectively, and the mouse cursor cannot move freely between multiple logical screens.
  • the tablet computer and the monitor establish a screen projection connection
  • the tablet computer is similar to the host computer.
  • the monitor can be used as the second screen of the tablet computer.
  • the content on the tablet computer is displayed through the monitor, and the user can use the tablet computer like a computer.
  • the applicable scenario of this application is different from the scenario where a computer equipped with Windows system is connected to an extended screen.
  • the screen of the computer and the screen of the extended display form one logical screen, but each of them is a different physical screen.
  • Objects such as applications, icons, or windows are displayed on the logical screen and can be displayed on different physical screens at the same time. It can be understood that different physical screens are cut from one logical screen. In this scenario, there is no problem that the mouse cannot move between multiple logical screens.
  • Embodiments of the present application provide a cursor display method and an electronic device.
  • the method can be applied to usage scenarios of multiple electronic devices with one core and multiple screens, and provides a solution for dynamic keyboard-and-mouse movement and input event reporting in one-core multi-screen scenarios.
  • This method can dynamically change the screen on which the mouse cursor is located (the screen of the tablet or the screen of the monitor) by sensing and determining the user's intention, and send input events to the specified screen. In this way, the cursor can be moved freely between different screens, and input events can be transmitted between different screens. Users can freely move the cursor between multiple screens with the mouse, browse content on multiple screens at the same time, and drag and drop content between multiple screens. This method can improve the efficiency of collaborative use of multiple electronic devices, thereby improving the user experience.
  • the software system of the first electronic device 400 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture or a cloud architecture. Take the layered architecture as an example.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • FIG. 4 a schematic system architecture diagram of the first electronic device 400 is shown.
  • the structure of the first electronic device 400 may be similar to the electronic device 100 above.
  • the screen management module 411, the drag management module 412 and the keyboard and mouse management module 413 are in the application layer 410 (which can also be understood as the application layer shown in Figure 2), and the view module 421, the window management service module 422, the input management service module 423 and the underlying input module 424 are within the framework layer 420 (which can also be understood as the application framework layer shown in Figure 2).
  • in other embodiments, the screen management module 411, the drag management module 412, the keyboard and mouse management module 413, the view module 421, the window management service module 422, the input management service module 423 and the underlying input module 424 may all be within the framework layer 420.
  • the framework layer 420 is mainly responsible for providing application programming interfaces and programming frameworks for the application layer.
  • the first electronic device 400 may also include other layers, such as a kernel layer (the kernel layer shown in FIG. 2) and so on.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can at least include display drivers, camera drivers, audio drivers, sensor drivers, etc.
  • the screen management module 411 can be used to determine the target screen where the mouse cursor moves or the target screen where the input event is sent.
  • the screen management module 411 stores the identification (Identity, ID) of each screen.
  • the drag management module 412 is used to manage drag events.
  • the keyboard and mouse management module 413 is used to manage mouse events.
  • View module 421 is used to manage controls (views).
  • the input event includes a mouse event or a drag event
  • the mouse event includes a mouse move event or a mouse click event.
  • mouse movement can be understood as the movement of the mouse cursor. Taking mouse movement events as an example, the keyboard and mouse management module 413 can register mouse event monitoring with the window management service module 422.
  • the window management service module 422 calls back the mouse movement event to the keyboard and mouse management module 413.
  • when the keyboard and mouse management module 413 determines the mouse movement event, it sends the mouse movement event to the input management service module 423.
  • the input management service module 423 sends the mouse movement event to the underlying input module 424.
  • the underlying input module 424 changes the mouse display position and the reporting direction of mouse movement events.
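The module chain described above (window management service → keyboard and mouse management → input management service → underlying input module) can be sketched roughly as follows. All class and method names are invented for illustration; they do not correspond to actual Android framework APIs or to the patent's implementation.

```python
# Minimal sketch of the mouse-move event pipeline described above.
# Names are hypothetical stand-ins for the numbered modules.

class UnderlyingInput:
    """Final stage (module 424): updates the cursor position and the
    screen to which subsequent input events are reported."""
    def __init__(self):
        self.cursor_pos = (0, 0)
        self.report_screen_id = 0

    def handle_move(self, event):
        self.cursor_pos = event["pos"]
        self.report_screen_id = event["screen_id"]

class InputManagementService:
    """Module 423: forwards mouse movement events downward."""
    def __init__(self, underlying):
        self.underlying = underlying
    def send(self, event):
        self.underlying.handle_move(event)

class KeyboardMouseManager:
    """Module 413: receives callbacks from the window management service."""
    def __init__(self, input_service):
        self.input_service = input_service
    def on_mouse_move(self, event):
        self.input_service.send(event)

class WindowManagementService:
    """Module 422: lets other modules register mouse event listeners."""
    def __init__(self):
        self.listeners = []
    def register_mouse_listener(self, callback):
        self.listeners.append(callback)
    def inject(self, event):  # a mouse move arrives from hardware
        for callback in self.listeners:
            callback(event)

# Wire the chain and push one event through it.
underlying = UnderlyingInput()
wms = WindowManagementService()
kmm = KeyboardMouseManager(InputManagementService(underlying))
wms.register_mouse_listener(kmm.on_mouse_move)
wms.inject({"pos": (120, 45), "screen_id": 1})
```

After the event passes through the chain, the underlying input stage holds the new cursor position and the screen ID to which events are now reported.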
  • the drag management module 412 can learn whether a drag event occurs through the view module 421 or the window management service module 422.
  • the first electronic device 400 may be a mobile phone, a tablet computer, a personal computer, a cellular phone, a wearable device, or other electronic device.
  • the second electronic device 500 may be an electronic device such as a tablet computer, a personal computer, a monitor, or a smart screen.
  • the input device 600 may be a mouse, a keyboard, a touch pad, a touch screen, etc., which is not specifically limited in this application. It can be understood that the screen of the first electronic device 400 is the first screen, and the screen of the second electronic device 500 is the second screen.
  • either the first screen or the second screen can be used as the target screen.
  • the target screen can also be called the enable screen, and the enable screen can be understood as the screen that displays mouse operations.
  • the mouse appears as a cursor on the enable screen.
  • the first electronic device 400 is a mobile phone
  • the second electronic device 500 is a monitor
  • the input device 600 is a mouse.
  • the first electronic device 400 and the second electronic device 500 may be connected in a wired or wireless manner. Based on the established connection, the first electronic device 400 and the second electronic device 500 can be used together.
  • the wireless communication protocol used may be a wireless fidelity (Wi-Fi) protocol, a Bluetooth (BT) protocol, a ZigBee protocol, a near field communication (NFC) protocol, etc., or various cellular network protocols.
  • the input device 600 may be a peripheral of the first electronic device 400 or a peripheral of the second electronic device 500, which is not limited in this application.
  • the user connects the mobile phone to the display through wired or wireless connections and uses the display as the second screen of the mobile phone.
  • the mobile phone and the monitor can display different GUIs, so that users can handle different work tasks through the mobile phone and the monitor at the same time.
  • the user connects the mouse to the phone.
  • the display is on the right side of the phone.
  • the mouse cursor can be displayed at the last position it appeared. For example, the mouse cursor is displayed on the screen of the mobile phone. The user moves the mouse from left to right so that the mouse cursor displayed on the phone reaches the right edge of the phone screen.
  • in one implementation, when the mouse continues to move to the right, the mouse cursor can be displayed on the left edge of the display, as shown in Figure 5(b). In another implementation, when the mouse continues to move to the right, the mouse cursor can be directly displayed at a default position on the monitor; for example, the default position can be the center of the monitor. This process realizes the movement of the mouse cursor from the mobile phone to the monitor. Of course, the mouse cursor can also move from the monitor to the phone. The user can then move the mouse on the monitor screen or perform any operation task.
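The edge-crossing behavior just described can be illustrated with a small sketch. The screen sizes, coordinate conventions, and the default-position policy are example assumptions, not values taken from the patent.

```python
# Illustrative sketch of edge crossing: when the cursor passes the phone's
# right edge, it hands over to the monitor, appearing either at the
# monitor's left edge or at a default position (the monitor's center).
# Screen dimensions are invented example values.

PHONE = {"id": 0, "w": 1080, "h": 2340}
MONITOR = {"id": 1, "w": 1920, "h": 1080}

def move_cursor(screen, x, y, dx, dy, use_default_pos=False):
    """Apply a mouse delta; return the resulting (screen_id, x, y)."""
    nx, ny = x + dx, y + dy
    if screen["id"] == PHONE["id"] and nx >= PHONE["w"]:
        # Crossed the phone's right edge: move the cursor to the monitor.
        if use_default_pos:
            return (MONITOR["id"], MONITOR["w"] // 2, MONITOR["h"] // 2)
        return (MONITOR["id"], 0, ny)  # appear at the monitor's left edge
    # Otherwise stay on the same screen, clamped to its bounds.
    nx = max(0, min(nx, screen["w"] - 1))
    ny = max(0, min(ny, screen["h"] - 1))
    return (screen["id"], nx, ny)
```

For example, a cursor at the phone's right edge that keeps moving right lands on the monitor's left edge (or its center, under the default-position policy), while movement away from the edge stays on the phone screen.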
  • the display shown in Figure 5 is located on the right side of the mobile phone, which is a preset direction.
  • This direction can be a preset direction set by the user on the mobile phone or monitor interface; it can also be a preset direction determined by the mobile phone or the monitor by sensing their relative positions; it can also be a preset direction adjusted by the user after the mobile phone or the monitor senses their relative positions.
  • the preset direction can be set to any direction of the display relative to the mobile phone, which is not limited in this application.
  • the keyboard and mouse management module in the mobile phone determines based on the user's intention that the user wants to move the mouse from the mobile phone to the display.
  • the screen management module moves the screen where the mouse is located (for example, moves the mouse from the screen of the mobile phone to the screen of the monitor), that is to say, it modifies the screen ID reported by input events (mouse movement events).
  • This allows input events to be sent to the selected screen (eg, the screen of the monitor).
  • the keyboard and mouse management module can also determine the position of the mouse cursor displayed on the display (for example, the position of the left edge of the display screen as shown in Figure 5(b)) according to the user's intention.
  • the position of the mouse cursor displayed on the display may have a corresponding relationship with the original position on the mobile phone.
  • the corresponding relationship may be that the left edge position of the mouse cursor displayed on the display corresponds to the original right edge position on the mobile phone.
  • the default position of the mouse cursor displayed on the monitor corresponds to its original right edge position on the phone.
  • the default position may be the center position of the display. This application does not limit the specific correspondence between the position of the mouse cursor displayed on the display and its original position on the mobile phone.
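One possible position correspondence, mapping the cursor's exit point on the phone's right edge to an entry point on the monitor's left edge with the vertical coordinate scaled proportionally, can be sketched as follows. Proportional scaling is an assumption made for this illustration; as stated above, the patent does not limit the specific correspondence.

```python
# Sketch of one possible edge-position correspondence: the exit y coordinate
# on the phone's right edge maps proportionally onto the monitor's left edge.
# Proportional scaling is an illustrative assumption, not the patent's rule.

def map_edge_position(y_phone: int, phone_h: int, monitor_h: int) -> int:
    """Map a y coordinate on the phone's right edge (0..phone_h-1)
    to a y coordinate on the monitor's left edge (0..monitor_h-1)."""
    return round(y_phone * (monitor_h - 1) / (phone_h - 1))
```

For example, a cursor leaving at the top of a 2340-pixel-tall phone screen enters at the top of a 1080-pixel-tall monitor screen, and one leaving at the bottom enters at the bottom.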
  • the cursor display method obtains the user's intention (for example, the user moves the mouse to the edge of the mobile phone screen and continues to move outward), selects the target screen for mouse movement according to the user's intention, and modifies the target screen for input event reporting.
  • This allows subsequent input events to be sent to the selected target screen, for example, the subsequent input event for a mouse move event can be a mouse click event.
  • you can set the corresponding position or default position of the mouse cursor displayed on the target screen.
  • This method enables the cursor to move freely between different screens in a multi-screen scenario, allowing multiple screens to share a cursor to perform operations, which helps improve the user experience when using multiple screens.
  • FIG. 6 another set of GUIs provided by the embodiment of the present application is shown.
  • the following description takes the first electronic device 400 as a mobile phone and the second electronic device 500 as a display as an example.
  • the input device 600 is a mouse. As shown in FIG. 6(a) , the input device 600 may be a peripheral of the first electronic device 400 or a peripheral of the second electronic device 500, which is not limited in this application.
  • the user connects the mobile phone to the display through wired or wireless connections.
  • the screen of the mobile phone is the first screen and the screen of the display is the second screen.
  • the user connects the mouse to the phone.
  • the display is located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
  • the mouse cursor is displayed in the center of the phone screen.
  • This process can realize the movement of the mouse from the mobile phone to the monitor.
  • the mouse can also move from the monitor to the mobile phone, as shown in Figure 14 for details.
  • the user can then move the mouse on the monitor screen or perform any operation task.
  • the user can operate the picture in the center of the monitor screen by clicking the mouse.
  • the position of the mouse cursor displayed on the monitor screen corresponds to the position displayed on the mobile phone screen.
  • the mouse cursor can move from the center of the mobile phone screen to the center of the monitor screen.
  • the mouse cursor can move from the left side of the mobile phone screen to the left side of the monitor screen.
  • the position of the mouse cursor displayed on the monitor screen can be the set default position. At this time, the user shakes the mouse anywhere on the mobile phone screen, and the mouse cursor will move from the mobile phone screen to the monitor screen.
  • the default position is, for example, the center of the monitor screen.
  • the input device 800 is a touch panel. As shown in FIG. 6(c) , the input device 800 may be a peripheral of the first electronic device 400 or a peripheral of the second electronic device 500, which is not limited in this application. Users can draw a preset shape "C" on the touch screen with their fingers to move the mouse cursor from the phone screen to the monitor.
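One way a preset shape such as "C" or "S" could be recognized from a touch stroke is by counting inflections: a "C" curves in one direction throughout, while an "S" reverses its turn direction once. The sketch below is a hypothetical simplification (the patent does not specify a recognition algorithm); it classifies a sequence of touch points by the number of sign changes in the turn direction along the stroke.

```python
def cross(o, a, b):
    # z-component of the cross product: turn direction at point a
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def classify_stroke(points):
    """Classify a stroke as "C", "S", or "unknown" by counting inflections."""
    turns = [cross(points[i - 1], points[i], points[i + 1])
             for i in range(1, len(points) - 1)]
    signs = [t for t in turns if abs(t) > 1e-9]   # drop near-straight segments
    flips = sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)
    if flips == 0:
        return "C"      # one consistent curve direction
    if flips == 1:
        return "S"      # exactly one reversal of curvature
    return "unknown"
```

A production recognizer would also normalize scale and orientation and tolerate noise, but the inflection count already separates the two shapes used in this example.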
• FIG. 7 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as a monitor, and the input device 600 as a mouse as an example.
  • the input device 600 may be a peripheral of the first electronic device 400 or a peripheral of the second electronic device 500, which is not limited in this application.
  • the user connects the mobile phone to the display through wired or wireless connections.
  • the screen of the mobile phone is the first screen and the screen of the display is the second screen.
  • the user connects the mouse to the phone.
  • the display is located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
  • the mouse cursor is displayed in the center of the phone screen.
  • the mobile phone can not only determine the user's intention based on the user shaking the mouse in a preset shape, but also based on the gestures made or drawn by the user's knuckles.
  • the position of the mouse cursor displayed on the monitor screen corresponds to the position displayed on the mobile phone screen. For example, if the user clicks or taps on the center of the mobile phone screen, the mouse cursor can move from the center of the mobile phone screen to the center of the monitor screen.
  • the mouse cursor can move from the left side of the mobile phone screen to the left side of the monitor screen.
  • the position of the mouse displayed on the monitor screen can be the set default position. At this time, when the user clicks or taps the screen anywhere on the mobile phone screen, the mouse cursor will move from the mobile phone screen to the monitor screen.
• the default position can be, for example, the center of the monitor screen.
  • the keyboard and mouse management module in the mobile phone determines that the user wants to move the mouse from the mobile phone to the display based on the user's intention.
• the screen management module moves the screen where the mouse is located (for example, moves the mouse from the screen of the mobile phone to the screen of the monitor), that is, modifies the screen ID reported by the input event (mouse movement event). This allows input events to be sent to the selected screen (e.g., the screen of the monitor).
  • the keyboard and mouse management module can also determine the position of the mouse cursor displayed on the display (for example, the center position of the display screen as shown in Figure 6(b) or Figure 7(b)) according to the user's intention.
  • the position of the mouse cursor displayed on the display may have a corresponding relationship with the original position on the mobile phone.
• the corresponding relationship can be that the mouse cursor displayed at the center of the display screen corresponds to its original center position on the mobile phone; or the mouse cursor displayed at the left side of the display screen corresponds to its original left-side position on the mobile phone screen; or the mouse cursor displayed at a default position on the monitor screen corresponds to any original position on the mobile phone, where the default position can be the center of the monitor screen.
  • This application does not limit the specific correspondence between the position of the mouse cursor displayed on the display and its original position on the mobile phone.
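The two placement options described above can be expressed as a small coordinate mapping. This is a minimal sketch under assumed screen sizes and names; "corresponding" preserves the cursor's relative position across screens of different resolutions, and "default" snaps to a preset position such as the target screen's center.

```python
def map_cursor(pos, src_size, dst_size, mode="corresponding"):
    """Map a cursor position from the source screen to the target screen.

    pos      -- (x, y) on the source screen
    src_size -- (width, height) of the source screen
    dst_size -- (width, height) of the target screen
    mode     -- "corresponding" keeps the relative position;
                "default" uses a preset position (here: screen center)
    """
    if mode == "default":
        return (dst_size[0] / 2, dst_size[1] / 2)
    # scale the relative position from the source screen to the target screen
    rx, ry = pos[0] / src_size[0], pos[1] / src_size[1]
    return (rx * dst_size[0], ry * dst_size[1])

phone, monitor = (1080, 2340), (2560, 1440)          # illustrative sizes
assert map_cursor((540, 1170), phone, monitor) == (1280.0, 720.0)  # center -> center
assert map_cursor((0, 1170), phone, monitor)[0] == 0.0             # left edge -> left edge
assert map_cursor((33, 44), phone, monitor, "default") == (1280.0, 720.0)
```

With "corresponding" mode, a cursor at the phone's center lands at the monitor's center, and a cursor at the left edge stays at the left edge, matching the correspondences listed above.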
• the cursor display method obtains the user's intention (for example, the user shakes the mouse in a preset shape at the center of the mobile phone screen, or the user double-taps the screen at the center of the mobile phone screen with a knuckle), selects the target screen for cursor movement according to the user's intention, and modifies the target screen for input event reporting. This causes subsequent input events to be sent to the selected target screen.
  • This method enables the cursor to move freely between different screens, and allows the cursor to be directly displayed at the corresponding position or default position of the target screen, making it convenient for the user to perform the next operation. It can further improve the user experience when using multiple screens.
• FIG. 8 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone and the second electronic device 500 as a display as an example.
  • the user connects the mobile phone to the display through wired or wireless connections.
  • the screen of the mobile phone is the first screen, and the screen of the display is the second screen.
  • the user connects the mouse to the phone.
  • the display is on the right side of the phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
• the user selects, in the mobile phone's system settings, the screen on which the mouse cursor is displayed. By setting the mouse cursor to display on the external screen (the monitor screen), the mouse cursor can be moved from the mobile phone to the monitor.
  • the user can then move the mouse on the monitor screen or perform any operation task.
  • the user uses system settings to move the mouse cursor from the mobile phone screen to the center of the display screen as shown in Figure 8(b).
  • the display position of the mouse cursor on the target screen may be the center position of the target screen by default.
  • the user can operate the image in the center of the monitor screen by clicking the mouse.
  • the keyboard and mouse management module in the mobile phone can determine that the user wants to move the mouse from the mobile phone to the display according to the user's intention.
• the screen management module moves the screen where the mouse cursor is located (for example, moves the mouse cursor from the screen of the mobile phone to the screen of the monitor), that is, modifies the screen ID reported by the input event (mouse movement event). This allows input events to be sent to the selected screen (e.g., the screen of the monitor).
  • the keyboard and mouse management module can also determine the position of the mouse cursor displayed on the monitor according to the user's intention (for example, the default central position of the monitor screen as shown in Figure 8(b)).
  • the position of the mouse cursor displayed on the display can be any preset position, which is not limited in the embodiments of the present application.
• the cursor display method obtains the user's intention (for example, the user selects, in the mobile phone's system settings, the screen on which the mouse cursor is displayed), selects the target screen for cursor movement according to the user's intention, and modifies the target screen for input event reporting. This causes subsequent input events to be sent to the selected target screen. At the same time, after obtaining the user's intention, the corresponding position or a default position can be set for the mouse cursor displayed on the target screen. This method enables the cursor to move freely between different screens, allowing multiple screens to share one cursor, which helps improve the user experience when using multiple screens.
  • the first electronic device 400 is often wired or wirelessly connected to multiple second electronic devices. Taking a first electronic device 400 that is connected to a second electronic device 500 and a third electronic device 700 as an example below, the cursor display method in the embodiment of the present application will be described.
• FIG. 9 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as the first display, the third electronic device 700 as the second display, and the input device 600 as a mouse as an example.
• the input device 600 may be a peripheral of the first electronic device 400, a peripheral of the second electronic device 500, or a peripheral of the third electronic device 700, which is not limited in this application.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the mobile phone screen is the first screen
  • the first display is used as the second screen of the mobile phone
  • the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone.
  • both the first display and the second display are located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the position where it last appeared. For example, the mouse cursor is displayed on the screen of the mobile phone.
  • the user moves the mouse from left to right so that the mouse cursor displayed on the phone reaches the right edge of the phone screen.
  • the mouse cursor is first displayed on the left edge of the first monitor.
  • the mouse cursor will move to the right edge of the first monitor.
  • the mouse cursor may be displayed on the left edge of the second display as shown in Figure 9(b).
  • This process enables the mouse cursor to move between the mobile phone and multiple displays.
  • the user can move the mouse cursor on the screen of the second display screen or perform any operation task. It can be understood that by moving the mouse from right to left, the user can also move the mouse cursor from the second display to the first display to the mobile phone. This process is opposite to the above process and will not be described again here.
• the first display and the second display shown in Figure 9 are both located on the right side of the mobile phone.
  • This direction can be a preset direction set by the user on the mobile phone or display interface; it can also be a preset direction determined by the mobile phone or display by sensing its relative position; or it can be further adjusted by the user after the mobile phone or display senses its relative position.
  • the preset direction can be set to any direction of the display relative to the mobile phone, which is not limited in this application.
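The edge-crossing behavior above (phone, first display, and second display arranged left to right, with the cursor carried over the edge into the adjacent screen) can be sketched as a walk along an ordered list of screens. Screen widths and function names here are illustrative assumptions, not values from the application.

```python
def advance(screen_index, x, widths):
    """Return (new_screen_index, new_x) after a horizontal move to x.

    Screens are ordered left-to-right; overshooting the right edge of one
    screen re-enters the next screen at its left edge, and vice versa.
    """
    while x > widths[screen_index] and screen_index < len(widths) - 1:
        x -= widths[screen_index]      # carry the overshoot to the next screen
        screen_index += 1
    while x < 0 and screen_index > 0:
        screen_index -= 1
        x += widths[screen_index]      # re-enter from the right edge
    # on the outermost screens the cursor stops at the edge
    return screen_index, min(max(x, 0), widths[screen_index])

widths = [1080, 2560, 2560]            # phone, first display, second display
assert advance(0, 1080 + 5, widths) == (1, 5)      # phone -> first display
assert advance(1, 2560 + 30, widths) == (2, 30)    # first -> second display
assert advance(1, -10, widths) == (0, 1070)        # back to the phone
```

Moving right to left simply runs the same logic in reverse, which is why the return path in the description mirrors the outbound one.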
  • the keyboard and mouse management module in the mobile phone can determine based on the user's intention that the user wants to move the mouse cursor from the mobile phone to the display.
  • the screen management module moves the screen where the mouse cursor is located (for example, moves the mouse cursor from the screen of the mobile phone to the screen of the second monitor), that is, modifies the screen ID reported by the input event (mouse movement event).
• This allows input events to be sent to the selected screen (e.g., the screen of the second display).
  • the keyboard and mouse management module can also determine the position of the mouse cursor displayed on the second display (for example, the position of the left edge of the second display screen as shown in Figure 9(b)) according to the user's intention.
  • the position of the mouse cursor displayed on the second display may have a corresponding relationship with the original position on the mobile phone screen.
• the corresponding relationship may be that the left-edge position of the mouse cursor displayed on the second display corresponds to its original right-edge position on the mobile phone; or the default position of the mouse cursor displayed on the second display corresponds to its original right-edge position on the mobile phone.
  • the default position may be the center position of the second display. This application does not limit the specific correspondence between the position of the mouse cursor displayed on the display and its original position on the mobile phone.
• the cursor display method obtains the user's intention (for example, the user moves the mouse to the edge of the mobile phone screen and continues moving outward), selects the target screen for cursor movement according to the user's intention, and modifies the target screen for input event reporting. This causes subsequent input events to be sent to the selected target screen. At the same time, after obtaining the user's intention, the corresponding position or a default position can be set for the mouse cursor displayed on the target screen. This method enables the cursor to move freely between different screens, allowing multiple screens to share one cursor, which helps improve the user experience when using multiple screens.
• FIG. 10 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as a first display, and the third electronic device 700 as a second display as an example.
• the input device 600 is a mouse. As shown in FIG. 10(a), the input device 600 may be a peripheral of the first electronic device 400, a peripheral of the second electronic device 500, or a peripheral of the third electronic device 700, which is not limited in this application.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the mobile phone screen is the first screen, the first display is used as the second screen of the mobile phone, and the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone. For example, both the first display and the second display are located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
• the mouse cursor appears in the center of the phone screen.
  • the user shakes or rotates the mouse in a preset shape in the center of the mobile phone screen, which can trigger the mouse cursor to move to the center of the corresponding monitor screen.
• the user shakes the mouse in the preset shape "C" in the center of the mobile phone screen, and the mouse cursor can be moved directly from the mobile phone to the first display; as shown in Figure 10(b), the user shakes the mouse in the preset shape "S" in the center of the phone screen, and the mouse cursor can be moved directly from the phone to the second monitor.
  • the user can move the mouse cursor on the screen of the first display or the second display or perform any operation task.
  • the user can operate the picture at the center of the screen of the first display or the second display by clicking the mouse.
  • the position of the mouse cursor displayed on the first display and the second display screen corresponds to the position displayed on the mobile phone screen.
  • the user shakes the mouse in the center of the mobile phone screen, and the mouse cursor moves from the center of the mobile phone screen to the center of the first display screen or the second display screen.
  • the mouse cursor can move from the left side of the mobile phone screen to the left side of the first display screen or the second display screen.
• the position at which the mouse cursor is displayed on the first display or the second display screen can be a set default position. In this case, the user shakes the mouse anywhere on the mobile phone screen, and the mouse cursor moves from the mobile phone screen to the default position on the first display screen or the second display screen, for example, the center of the screen.
  • the input device 800 is a touch panel.
  • the input device 800 may be a peripheral device of the first electronic device 400, a peripheral device of the second electronic device 500, or a peripheral device of the third electronic device 700, which is not limited in this application.
  • the user can draw a preset shape "C" on the touch screen with his finger to move the mouse cursor from the mobile phone screen to the first display.
  • the user can draw a preset shape "S" on the touch screen with his finger to move the mouse cursor from the mobile phone screen to the second display.
• FIG. 11 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as the first display, the third electronic device 700 as the second display, and the input device 600 as a mouse as an example.
• the input device 600 may be a peripheral of the first electronic device 400, a peripheral of the second electronic device 500, or a peripheral of the third electronic device 700, which is not limited in this application.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the mobile phone screen is the first screen
  • the first display is used as the second screen of the mobile phone
  • the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone.
  • both the first display and the second display are located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
• the user clicks twice on the center of the phone screen with a knuckle, and the mouse cursor can move directly from the phone to the first display; as shown in Figure 11(b), the user clicks the screen three times with a knuckle in the center of the mobile phone screen, and the mouse cursor can move directly from the mobile phone to the second monitor. Subsequently, the user can move the mouse cursor on the screen of the first display or the second display or perform any operation task. For example, the user can operate the picture at the center of the screen of the first display or the second display by clicking the mouse.
  • the position of the mouse cursor displayed on the first display and the second display screen corresponds to the position displayed on the mobile phone screen.
  • the user clicks on the center of the mobile phone screen with his knuckle and the mouse cursor moves from the center of the mobile phone screen to the center of the first display screen or the second display screen.
  • the mouse cursor can move from the left side of the mobile phone screen to the left side of the first display screen or the second display screen.
  • the position of the mouse cursor displayed on the first display and the second display screen can be the set default position.
• the mouse cursor will move from the mobile phone screen to a default position on the first display screen or the second display screen.
  • the default position may be the center of the screen.
  • the keyboard and mouse management module in the mobile phone determines that the user wants to move the mouse from the mobile phone to the display based on the user's intention.
  • the screen management module moves the screen where the mouse is located (for example, moves the mouse from the screen of the mobile phone to the screen of the monitor).
• that is, it modifies the screen ID reported by the input event (mouse movement event).
• This allows input events to be sent to the selected screen (e.g., the screen of the first display or the second display).
  • the keyboard and mouse management module can also determine the position of the mouse cursor displayed on the monitor according to the user's intention. For example, the position of the mouse cursor on the first display or the second display has a corresponding relationship with the original position on the mobile phone screen.
• the corresponding relationship may be that the mouse cursor displayed at the center of the first display or the second display screen corresponds to its original center position on the mobile phone screen; or the mouse cursor displayed at the left side of the first display or the second display screen corresponds to its original left-side position on the mobile phone screen; or the mouse cursor displayed at a default position on the first display or the second display corresponds to any original position on the mobile phone screen, where the default position may be the center of the first display or the second display.
• the cursor display method obtains the user's intention (for example, the user shakes the mouse in a preset shape at the center of the mobile phone screen, or the user clicks the screen at the center of the mobile phone screen with a knuckle), selects the target screen for cursor movement according to the user's intention, and modifies the target screen for input event reporting. This causes subsequent input events to be sent to the selected target screen.
  • This method enables the cursor to move freely between different screens, and allows the cursor to be directly displayed at the corresponding position or default position of the target screen, making it convenient for the user to perform the next operation. It can further improve the user experience when using multiple screens.
  • the following shows a method in which the user moves the mouse cursor continuously between different screens by shaking the mouse or by clicking the screen.
• FIG. 12 shows how, in the scenario of Figure 10, the user moves the mouse cursor from the first display to the second display by shaking the mouse.
  • the input device 600 may be a peripheral of the first electronic device 400, a peripheral of the second electronic device 500, or a peripheral of the third electronic device 700.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the mobile phone screen is the first screen
  • the first display is used as the second screen of the mobile phone
  • the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone.
  • both the first display and the second display are located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
• the mouse cursor appears in the center of the phone screen.
  • the user shakes the mouse in the preset shape "C" in the center of the mobile phone screen, and the mouse cursor can first move from the mobile phone to the first display;
• as shown in Figure 12(b), the user shakes the mouse in the preset shape "C" again at the center of the screen of the first display, and the mouse cursor can move from the first display to the second display.
• when "C" is a preset shape, each time the user shakes the mouse in the preset shape "C", the mouse cursor moves once to the electronic device on the right side.
  • the mouse cursor will move to the target screen.
• the user shakes the mouse in the preset shape "C", and the mouse cursor can move from the mobile phone screen to the screen of the first display. That is to say, the target screen corresponding to the preset shape "C" is the screen of the first display.
  • the user shakes the mouse in the preset shape "S”, and the mouse cursor can move from the screen of the first display to the screen of the second display. That is to say, the target screen corresponding to the preset shape "S” is the screen of the second display. Users can continuously shake different preset shapes to move the mouse cursor to the corresponding target screen.
• FIG. 13 shows another set of GUIs provided by an embodiment of the present application.
• the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as the first display, the third electronic device 700 as the second display, and the input device 600 as a mouse as an example.
• the input device 600 may be a peripheral of the first electronic device 400, a peripheral of the second electronic device 500, or a peripheral of the third electronic device 700, which is not limited in this application.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the mobile phone screen is the first screen
  • the first display is used as the second screen of the mobile phone
  • the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone.
  • both the first display and the second display are located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
• the user clicks twice on the center of the mobile phone screen with a knuckle, and the mouse cursor can first move from the mobile phone to the first display; as shown in Figure 13(b), the user clicks the screen twice again at the center of the screen of the first display, and the mouse cursor can move from the first display to the second display.
• when clicking the screen twice is a preset operation, each time the user clicks the screen twice, the mouse cursor moves once to the electronic device on the right side.
  • the mouse cursor will move to the target screen.
  • the mouse cursor can move from the mobile phone screen to the screen of the first display. That is to say, the target screen corresponding to clicking the screen twice is the screen of the first display.
• the user can click the screen a different number of times to move the mouse cursor to the corresponding target screen.
• the screen that the user clicks on may be the target screen to which the mouse cursor moves. For example, if the user clicks twice on the first display screen, the mouse cursor can move to the screen of the first display; if the user clicks twice on the mobile phone screen, the mouse cursor can move to the mobile phone screen. The user can click the screen of a different device to move the mouse cursor to the clicked target screen.
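The click-count variant works the same way as the shape variant: the number of consecutive knuckle clicks indexes a target screen. This is a sketch with assumed names, following the example counts used above (two clicks for the first display, three for the second).

```python
# Hypothetical mapping from knuckle-click counts to target screens.
CLICKS_TO_SCREEN = {
    2: "first_display",
    3: "second_display",
}

def target_for_clicks(clicks, current_screen):
    # unmapped counts leave the cursor on the current screen
    return CLICKS_TO_SCREEN.get(clicks, current_screen)

assert target_for_clicks(2, "phone") == "first_display"
assert target_for_clicks(3, "phone") == "second_display"
assert target_for_clicks(1, "phone") == "phone"
```

In the alternative described above, where the clicked screen itself becomes the target, the dispatch key would simply be the identifier of the screen that received the clicks rather than the click count.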
  • Figures 12 and 13 show two ways for the user to move the mouse cursor: shaking the mouse and clicking the screen.
• the mouse cursor can be moved from the mobile phone screen to the screen of the first display, and then to the screen of the second display.
  • the method of continuously moving the mouse cursor is not limited to the above method.
• the user can first move the mouse cursor to the screen of the first monitor by shaking the mouse, and then move it to the screen of the second monitor by clicking the screen; or the user can first move the mouse cursor to the edge of the mobile phone screen and continue moving outward so that it reaches the screen of the first monitor, and then move it to the screen of the second monitor by shaking the mouse or clicking the screen.
  • the following shows a method in which the user moves the mouse cursor from the target screen to the original screen by shaking the mouse or clicking the screen.
• FIG. 14 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as the first display, the third electronic device 700 as the second display, and the input device 600 as a mouse as an example.
• the input device 600 may be a peripheral of the first electronic device 400, a peripheral of the second electronic device 500, or a peripheral of the third electronic device 700, which is not limited in this application.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the mobile phone screen is the first screen, the first display is used as the second screen of the mobile phone, and the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone.
  • both the first display and the second display are located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
  • the mouse appears in the center of the phone screen.
• the user shakes the mouse in a preset shape "C" in the center of the mobile phone screen, and the mouse cursor can move from the mobile phone to the first display; as shown in Figure 14(b), the user shakes the mouse in a preset shape "V" in the center of the screen of the first display, and the mouse cursor can move from the first display back to the screen of the mobile phone.
  • the user shakes the mouse in the preset shape "S" in the center of the mobile phone screen, and the mouse cursor can move from the mobile phone to the second display;
• as shown in Figure 14(d), the user shakes the mouse in a preset shape "V" in the center of the second display screen, and the mouse cursor can move from the second display to the mobile phone screen.
  • the user shakes the mouse in a preset shape "V" to indicate moving the mouse cursor to the screen where the mouse was last displayed. For example, when the mouse cursor was last displayed on the first display, the user shook the mouse in a preset shape "V" in the center of the screen of the second display, and the mouse cursor moved from the second display to the first display.
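The "V" gesture's "return to the previous screen" behavior can be modeled with a one-step history of the cursor's screen. The class and method names below are assumptions for illustration; the point is that "V" swaps the current screen with the one the cursor last came from.

```python
class CursorHistory:
    """Tracks the cursor's current screen and the screen it came from."""

    def __init__(self, start_screen):
        self.current = start_screen
        self.previous = None

    def move_to(self, screen):
        # any cursor move (edge crossing, "C"/"S" gesture, clicks) records
        # the screen it left as the previous screen
        self.previous, self.current = self.current, screen

    def back(self):
        # the "V" gesture: return to the screen the cursor last came from
        if self.previous is not None:
            self.move_to(self.previous)
        return self.current

h = CursorHistory("phone")
h.move_to("first_display")           # e.g. the user shakes "C"
h.move_to("second_display")          # e.g. the user shakes "S"
assert h.back() == "first_display"   # "V" on the second display
assert h.back() == "second_display"  # "V" again toggles back
```

This matches the example above: with the cursor last displayed on the first display, a "V" shaken on the second display returns the cursor to the first display.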
• FIG. 15 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as the first display, the third electronic device 700 as the second display, and the input device 600 as a mouse as an example.
• the input device 600 may be a peripheral of the first electronic device 400, a peripheral of the second electronic device 500, or a peripheral of the third electronic device 700, which is not limited in this application.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the mobile phone screen is the first screen
  • the first display is used as the second screen of the mobile phone
  • the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone.
  • both the first display and the second display are located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
  • the target screen corresponding to the screen clicked twice is the screen of the first display
  • the target screen corresponding to the screen clicked three times is the screen of the second display
  • the target screen corresponding to the screen clicked four times is the screen of the mobile phone.
  • the user clicks twice on the center of the phone screen and the mouse cursor can move from the phone to the first display
  • the user clicks four times on the center of the first display's screen, and the mouse cursor can move from the first display to the mobile phone screen.
  • the user clicks three times on the center of the mobile phone screen, and the mouse cursor can move from the mobile phone to the second display; as shown in Figure 15(d), the user clicks four times on the center of the second display's screen, and the mouse cursor can move from the second display to the mobile phone screen.
  • it can also be preset to click the screen four times (or other operations) to mean moving the mouse cursor to the screen last displayed by the mouse. For example, when the mouse cursor was last displayed on the first display, the user clicked four times on the center of the screen of the second display, and the mouse cursor moved from the second display to the first display. Alternatively, the user can click on the screen of a different device to move the mouse cursor to the clicked target screen.
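The knuckle-tap convention above (two taps for the first display, three for the second, four for the phone, with an optional "last screen" variant) could be modeled as follows; the counts and screen identifiers are assumptions for illustration:

```python
# Hypothetical tap-count -> target-screen table for the three-screen scenario.
TAPS_TO_TARGET = {2: "display1", 3: "display2", 4: "phone"}

def target_for_taps(count, last_screen=None, four_means_last=False):
    """Resolve the target screen for a knuckle-tap count.

    When four_means_last is set, four taps move the cursor back to the
    screen it was last displayed on instead of the phone screen.
    """
    if four_means_last and count == 4 and last_screen is not None:
        return last_screen
    return TAPS_TO_TARGET.get(count)  # None for unrecognized counts
```

The `four_means_last` flag corresponds to the preset variant in which four taps return the cursor to the screen where it was last displayed.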
  • FIG. 16 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as a first display, and the third electronic device 700 as a second display as an example.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the screen of the mobile phone is the first screen
  • the first display is used as the second screen of the mobile phone
  • the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone.
  • the first display and the second display are located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the location where it last appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
  • the user configures, in the system settings of the mobile phone, the screen on which the mouse cursor is displayed.
  • through the system settings, the mouse cursor can be moved from the mobile phone to the first display or the second display.
  • the user clicks the peripheral screen setting control 1601 in the system settings with the mouse, and the mobile phone displays the peripheral screen setting GUI.
  • the user clicks the external screen 2 control 1602 in the peripheral screen setting GUI with the mouse to move the mouse cursor from the mobile phone to the second display. Subsequently, the user can move the mouse cursor on the screen of the second display or perform any operation task.
  • the user uses system settings to move the mouse cursor from the mobile phone screen to the center of the second display screen as shown in Figure 16(b). It can be understood that when moving the mouse through system settings, the display position of the mouse cursor on the target screen can be the center position of the target screen by default. At this time, the user can operate the image in the center of the monitor screen by clicking the mouse.
  • the keyboard and mouse management module in the mobile phone determines that the user wants to move the mouse from the mobile phone to the display based on the user's intention.
  • the screen management module moves the screen where the mouse is located (for example, moves the mouse from the screen of the mobile phone to the screen of the second display).
  • In other words, modify the screen ID reported by input events (mouse movement events).
  • This allows input events to be sent to the selected screen (eg, the screen of the second display).
  • the keyboard and mouse management module can also determine the position of the mouse cursor displayed on the display according to the user's intention (for example, the default center position of the second display screen as shown in Figure 12(b)).
  • the position of the mouse cursor displayed on the display can be any preset position, which is not limited in the embodiments of the present application.
  • the cursor display method obtains the user's intention (for example, the user sets, in the mobile phone's system settings, the screen on which the mouse cursor is displayed), selects the target screen for cursor movement according to the user's intention, and modifies the target screen for input event reporting. This causes subsequent input events to be sent to the selected target screen. At the same time, after obtaining the user's intention, the mouse cursor can be set to the corresponding position on the target screen.
  • This method enables the cursor to move freely between different screens, and can be moved directly to the target screen in a multi-screen scenario. This allows multiple screens to share a cursor to perform operations, which helps improve the user experience when using multiple screens.
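One way to picture the "modify the screen ID reported by input events" step is a small router that stamps every event with the currently selected screen's ID before delivery. The class and field names here are assumptions, not the actual module interfaces of the embodiment:

```python
# Simplified sketch: selecting a target screen and routing input events to it.
class Screen:
    def __init__(self, screen_id, width, height):
        self.id, self.width, self.height = screen_id, width, height
        self.cursor_pos = None   # where the cursor lands on this screen
        self.events = []         # input events delivered to this screen

    def handle(self, event):
        self.events.append(event)

class InputRouter:
    def __init__(self, screens, current_screen_id):
        self.screens = {s.id: s for s in screens}
        self.current = current_screen_id  # screen that receives input events

    def move_cursor_to(self, target_id, position=None):
        """Select a new target screen; default to its center, as when the
        cursor is moved through system settings."""
        screen = self.screens[target_id]
        screen.cursor_pos = position or (screen.width // 2, screen.height // 2)
        self.current = target_id

    def dispatch(self, event):
        """Stamp the event with the current screen ID and deliver it there."""
        event["screen_id"] = self.current
        self.screens[self.current].handle(event)
```

After `move_cursor_to("display2")`, every dispatched event carries the second display's ID, mirroring how subsequent input events are sent to the selected screen.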
  • FIG. 17 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as the first display, the third electronic device 700 as the second display, and the input device 600 as a mouse as an example.
  • the input device 600 may be a peripheral of the first electronic device 400, a peripheral of the second electronic device 500, or a peripheral of the third electronic device 700.
  • This application does not limit this.
  • the user connects the mobile phone to the first display and the second display through wired or wireless connections.
  • the mobile phone screen is the first screen
  • the first display is used as the second screen of the mobile phone
  • the second display is used as the third screen of the mobile phone.
  • the user connects the mouse to the phone.
  • the first display is located on the left side of the mobile phone
  • the second display is located on the right side of the mobile phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
  • the user moves the mouse from left to right so that the mouse cursor displayed on the mobile phone reaches the right edge of the mobile phone screen.
  • the mouse cursor may be displayed on the left edge of the second display as shown in Figure 17(a).
  • the user moves the mouse from right to left so that the mouse cursor displayed on the phone reaches the left edge of the phone screen.
  • the mouse cursor may be displayed on the right edge of the first display as shown in Figure 17(b).
  • the first display shown in Figure 17 is located on the left side of the mobile phone, and the second display is located on the right side of the mobile phone.
  • This direction can be a preset direction set by the user on the mobile phone or display interface; it can also be a preset direction determined by the mobile phone or display by sensing its relative position; or it can be further adjusted by the user after the mobile phone or display senses its relative position.
  • the preset direction can be set to any direction of the display relative to the mobile phone, which is not limited in this application.
  • the keyboard and mouse management module in the mobile phone determines based on the user's intention that the user wants to move the mouse from the mobile phone to the display.
  • the screen management module moves the screen where the mouse is located (for example, moves the mouse from the screen of the mobile phone to the screen of the first display or the second display). In other words, modify the screen ID reported by input events (mouse movement events). This allows input events to be sent to the selected screen (eg, the screen of the first display or the second display).
  • the keyboard and mouse management module can also determine, according to the user's intention, the position at which the mouse cursor is displayed on the first display or the second display (for example, the left-edge position of the second display's screen as shown in Figure 17(a), or the right-edge position of the first display's screen as shown in Figure 17(b)).
  • the position displayed on the second display or the first display where the mouse cursor is located may have a corresponding relationship with the original position on the mobile phone screen.
  • the corresponding relationship may be that the left-edge position at which the mouse cursor is displayed on the second display corresponds to its original right-edge position on the mobile phone, and the right-edge position at which the mouse cursor is displayed on the first display corresponds to its original left-edge position on the mobile phone.
  • alternatively, the default position at which the mouse cursor is displayed on the second display corresponds to its original right-edge position on the mobile phone, and the default position at which the mouse cursor is displayed on the first display corresponds to its original left-edge position on the mobile phone.
  • the default position may be the center position of the first display or the second display. This application does not limit the specific correspondence between the position of the mouse cursor displayed on the display and its original position on the mobile phone.
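The edge-to-edge correspondence could, under the assumption of proportional vertical mapping, be computed as below; the screen resolutions and the `mode` parameter are illustrative assumptions:

```python
# Map the cursor's exit position on the source screen's right edge to an
# entry position on the target screen ("edge": left edge, keeping the
# vertical position proportionally; "default": center of the target screen).
def map_edge_position(y, src_height, dst_width, dst_height, mode="edge"):
    if mode == "edge":
        return (0, round(y * dst_height / src_height))
    return (dst_width // 2, dst_height // 2)  # default: target-screen center
```

A cursor leaving a 1080-pixel-tall phone screen at mid-height would thus enter a 1440-pixel-tall display at mid-height on its left edge.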
  • the cursor display method obtains the user's intention (for example, the user moves the mouse cursor to the edge of the mobile phone screen and continues to move outward), selects the target screen for cursor movement according to the user's intention, and modifies the target screen for input event reporting. This causes subsequent input events to be sent to the selected target screen. At the same time, after obtaining the user's intention, the mouse cursor can be set to the corresponding position on the target screen. This method enables the cursor to move freely between different screens, allowing multiple screens to share one cursor to perform operations, which helps improve the user's experience when using multiple screens.
  • the user intention mentioned above can be that the user moves the mouse to the edge of the first screen and continues to move outward, as shown in Figure 5, Figure 9, or Figure 17; it can be that the user shakes the mouse in a preset shape, as shown in Figure 6, Figure 10, Figure 12, or Figure 14; it can be that the user taps the screen with knuckles, as shown in Figure 7, Figure 11, Figure 13, or Figure 15; or it can be that the user changes system settings, as shown in Figure 8 or Figure 16.
  • the ways to reflect the user's intention in this application include but are not limited to the above.
  • the specific implementation can be set according to actual needs. This application does not limit the specific implementation of the user's intention.
  • Figure 18 shows a schematic flow chart of a cursor display method. This method can be applied to the scenarios shown in Figures 5 to 17, and can be applied to the system architecture of the first electronic device 400 shown in Figure 4. This method 1800 is described in detail below.
  • the window management service module senses mouse movement.
  • the bottom input module controls the mouse to move on the first screen.
  • the underlying input module controls the distribution of input events to the first screen.
  • the underlying input module passes the mouse movement event to the keyboard and mouse management module.
  • the mouse can also be called a peripheral.
  • the mouse cursor can move freely on the first screen following the user's operations.
  • Other input events accompanying mouse movement events will also be distributed to the first screen.
  • Other input events may be mouse click events, for example.
  • the first screen may be any one of multiple screens. It can be understood that the first screen is the original screen where the mouse moves.
  • the second screen is the target screen for mouse movement. For example, when the mouse cursor moves from the mobile phone screen to the computer screen, the mobile phone screen is the first screen and the computer screen is the second screen.
  • the mouse cursor moves from the computer screen to the mobile phone screen, the computer screen is the first screen and the mobile phone screen is the second screen.
  • step S1801 can be executed first, and then steps S1802, S1803; or steps S1802, S1803 can be executed first, and then step S1801; or both at the same time.
  • the keyboard and mouse management module determines the user's intention.
  • the window management service module senses mouse movement and passes the mouse movement event to the keyboard and mouse management module.
  • the keyboard and mouse management module determines the user's intention based on the user's operation of input devices such as the mouse.
  • the user's operation on the mouse may include at least one of the following: the user moves the mouse to the edge of the screen and continues to move outward, the user shakes the mouse in a preset shape, the user draws a specified gesture with the mouse, or the user taps the screen with knuckles.
  • the keyboard and mouse management module sends the user's intention to the screen management module.
  • the screen management module selects the second screen.
  • the screen management module sends the information of the second screen to the keyboard and mouse management module.
  • the keyboard and mouse management module modifies the second screen for reporting input events.
  • the keyboard and mouse management module sends the information of the second screen to the bottom input module.
  • the screen management module can select the second screen according to the user's intention.
  • the second screen can be understood as the screen of the monitor or the target screen for mouse movement.
  • the screen management module determines the ID and other information of the second screen, and sends the information to the keyboard and mouse management module.
  • the keyboard and mouse management module modifies the screen ID for input event reporting based on the user's intention. That is, move the mouse to the second screen and cause the input event to be sent to the second screen.
  • the keyboard and mouse management module can also set the position of the mouse displayed on the second screen according to the user's intention. The position of the mouse cursor on the second screen corresponds to the original position of the first screen.
  • the corresponding relationship may be that the center position of the mouse cursor on the second screen corresponds to the center position of the first screen where the mouse cursor was originally located; or, the left (or right) edge position of the mouse cursor on the second screen corresponds to the right (or left) edge position of the first screen where the mouse cursor was originally located; or, the default position of the mouse cursor on the second screen corresponds to any position on the original first screen, where the default position may be the center position.
  • the screen management module can also select a third screen (for example, the screen of a second display) according to the user's intention, and set the position at which the mouse cursor is displayed on the third screen according to the user's intention.
  • the bottom input module controls the mouse to move on the second screen.
  • the underlying input module controls the distribution of input events to the second screen.
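The flow of method 1800 can be condensed into a single event-handling sketch. Here `detect_intent`, `select_screen`, and the `state` dictionary are assumed names standing in for the window-management, keyboard/mouse-management, and screen-management modules; the edge-cross rule and landing position are illustrative:

```python
def detect_intent(event, state):
    """Hypothetical intent detection: cursor pushed past the screen's right edge."""
    if event["x"] >= state["screen_width"] and event["dx"] > 0:
        return "edge-cross"
    return None

def select_screen(intent, state):
    """Stand-in for the screen management module selecting the second screen."""
    return state["screens"]["second"]

def landing_position(intent, screen):
    """Enter at the target screen's left edge, vertically centered."""
    return (0, screen["height"] // 2)

def handle_mouse_event(event, state):
    intent = detect_intent(event, state)            # determine the user's intention
    if intent is not None:
        second = select_screen(intent, state)       # select the second screen
        state["report_screen_id"] = second["id"]    # modify screen ID for reporting
        state["cursor_pos"] = landing_position(intent, second)
    event["screen_id"] = state["report_screen_id"]  # input events follow the cursor
    return event
```

Events that carry no recognizable intention keep the current reporting screen, so the cursor continues to move on the first screen until an intention is detected.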
  • FIG. 19 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as a display, and the drag object as a picture as an example.
  • the user connects the mobile phone to the display through wired or wireless connections, and uses the display as the second screen of the mobile phone.
  • the user connects the mouse to the phone.
  • the display is on the right side of the phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
  • Mobile phones and monitors can display different GUIs.
  • the mobile phone displays the interface of the photo album application (application, App), and the display displays the interface of the social networking App.
  • the mouse cursor and the dragged image will be displayed on the left edge of the monitor.
  • the picture can move to the bottom of the social App interface as shown in Figure 19(a), and the user can move the mouse on the monitor screen or perform any operation task.
  • the dragging object that follows the movement of the mouse will show a dragging display effect on the screens of the mobile phone and the monitor.
  • the picture can be input into the input box as shown in Figure 19(b), or directly sent to the chat interface as shown in Figure 19(b).
  • the mouse can be used to drag pictures from the mobile phone to the monitor.
  • the mouse can also drag pictures from the monitor to the phone.
  • the keyboard and mouse management module in the mobile phone determines based on the user's intention that the user wants to move from the mobile phone to the display through the mouse.
  • the screen management module moves the screen where the mouse is located (for example, moves the mouse cursor from the screen of the mobile phone to the screen of the monitor).
  • In other words, modify the screen ID reported by mouse events. This causes mouse events to be sent to the selected screen (for example, the screen of the monitor).
  • the keyboard and mouse management module can also determine the position of the display where the mouse cursor is located according to the user's intention (for example, the position of the left edge of the display screen as shown in Figure 19(a)).
  • the drag management module in the mobile phone determines that a drag event is occurring, and modifies the screen ID reported by the input event (drag event) so that the drag event is sent to the selected target screen (for example, the monitor screen).
  • the drag management module can also modify the screen ID reported by the drag effect so that the drag effect is displayed on the monitor screen.
  • the cursor display method obtains the user's intention (for example, the user moves the mouse to the edge of the phone screen and continues to move outward while dragging an object with the mouse), selects the target screen for cursor movement according to the user's intention, and modifies the target screen for input event reporting. This causes subsequent input events to be sent to the selected target screen. At the same time, after obtaining the user's intention, the mouse cursor can be set to the corresponding position on the target screen. This method allows the cursor to move freely between different screens, so that multiple screens can share one cursor to perform operations. At the same time, the dragged object can be moved between multiple screens more conveniently, which helps improve the user's experience when using multiple screens.
  • FIG. 20 shows another set of GUIs provided by an embodiment of the present application.
  • the following description takes the first electronic device 400 as a mobile phone, the second electronic device 500 as a display, and the drag object as a window as an example.
  • the user connects the mobile phone to the display through wired or wireless connections, and uses the display as the second screen of the mobile phone.
  • the user connects the mouse to the phone.
  • the display is on the right side of the phone.
  • the mouse cursor can be displayed at the last position it appeared.
  • the mouse cursor is displayed on the screen of the mobile phone.
  • Mobile phones and monitors can display different GUIs.
  • the mobile phone displays the interface of the calculator App, and the monitor displays the interface of the social networking App.
  • the display effect of the calculator window 2001 is that part of the window is displayed on the mobile phone screen and part of it is displayed on the monitor screen.
  • if the user keeps the left mouse button pressed and continues dragging, the calculator window 2001 can be displayed on the monitor screen as shown in Figure 20(b).
  • the calculator window 2001 can be displayed on the monitor screen in full-screen or non-full-screen mode.
  • the mobile phone screen can be displayed as the main interface.
  • the mouse can be used to drag the calculator window 2001 from the mobile phone to the monitor. The user can then move the mouse on the monitor screen or perform any operation task. Of course, the mouse can also be used to drag the calculator window 2001 from the monitor to the mobile phone.
  • the keyboard and mouse management module in the mobile phone determines based on the user's intention that the user wants to move from the mobile phone to the display through the mouse.
  • the screen management module moves the screen where the mouse is located (for example, moves the mouse cursor from the screen of the mobile phone to the screen of the monitor).
  • the keyboard and mouse management module can also determine the position of the monitor where the mouse cursor is based on the user's intention.
  • the drag management module in the mobile phone determines that a drag event is occurring, and modifies the screen ID reported by the input event (drag event) so that the drag event is sent to the selected screen (for example, the screen of the monitor).
  • the drag management module can also modify the screen ID reported by the drag effect so that the drag effect is displayed on the monitor screen.
  • the cursor display method obtains the user's intention (for example, the user moves the mouse to the edge of the phone screen and continues to move outward while dragging an object with the mouse), selects the target screen for cursor movement according to the user's intention, and modifies the target screen for input event reporting. This causes subsequent input events to be sent to the selected target screen. At the same time, after obtaining the user's intention, the mouse cursor can be set to the corresponding position on the target screen.
  • This method enables the cursor to move freely between different screens, allowing multiple screens to share one cursor to perform operations. It also allows dragged objects to be moved between multiple screens more conveniently, which helps improve the user's experience when using multiple screens.
  • the drag object may be a picture as shown in Figure 19, a window as shown in Figure 20, an application, data, a file, etc.
  • the application does not limit the specific form or content of the drag object.
  • the input events mentioned above include mouse events and drag events.
  • the user can select the drag object, move it to the edge of the screen, and continue to move outward, as shown in Figures 19 and 20, thereby moving the drag object to the target screen; the user can also select the drag object and shake the mouse in a preset shape, so that the drag object is displayed as moving together with the mouse cursor.
  • the display effects in this application that reflect the movement of the mouse and the movement of the dragged object include but are not limited to the above types, and this application does not specifically limit this.
  • Figure 21 shows a schematic flow chart of a cursor display method. This method can be applied to the scenarios shown in FIGS. 19 and 20 , and can be applied to the system architecture of the first electronic device 400 shown in FIG. 4 .
  • the method 2100 is described in detail below.
  • the window management service module senses the drag event.
  • the underlying input module controls dragging to be executed on the first screen.
  • the underlying input module controls the distribution of input events to the first screen.
  • Step S2101 can be executed first, and then steps S2102, S2103; or steps S2102, S2103 can be executed first, and then step S2101; or both at the same time.
  • the underlying input module passes the drag event to the keyboard and mouse management module.
  • the keyboard and mouse management module processes mouse data.
  • the keyboard and mouse management module can process the mouse data and determine that a mouse movement event is currently occurring.
  • the keyboard and mouse management module sends a request for second screen information to the screen management module.
  • the screen management module selects the second screen.
  • the screen management module sends the information of the second screen to the drag management module.
  • the keyboard and mouse management module sends the mouse movement information to the drag and drop management module.
  • the drag management module modifies the drag event information.
  • the drag management module sends the drag event display position information to the window management service module.
  • the drag management module sends the drag event distribution screen information to the underlying input module.
  • When a drag event occurs, a mouse movement event occurs simultaneously.
  • the drag management module detects whether a drag event currently occurs.
  • the drag management module modifies the drag event information. That is to say, the drag management module modifies the target screen ID reported by the drag event so that the drag event is sent to the target screen.
  • the target screen may be the second screen, or one of multiple screens when multiple screens are involved (for example, one of the screens of the first display and the second display).
  • the underlying input module controls dragging to be executed on the second screen.
  • the underlying input module distributes input events to the second screen.
  • When a drag event occurs, the drag effect of the drag object will be displayed on the target screen. After that, input events such as drag events will also be distributed to the target screen.
  • the keyboard, as an input device, can follow the mouse cursor to a certain position on the target screen and perform an input operation at that position.
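The drag-handling portion of method 2100 amounts to rewriting the screen ID on both the drag event and its display effect whenever a drag accompanies the mouse movement. The function and field names below are assumptions for illustration:

```python
# Route a drag event (if any) and its display effect to the second screen.
def route_drag(drag_event, mouse_event, second_screen_id):
    if drag_event is not None:  # a drag is currently in progress
        drag_event["screen_id"] = second_screen_id         # drag event distribution
        drag_event["effect_screen_id"] = second_screen_id  # drag display effect
    mouse_event["screen_id"] = second_screen_id            # cursor follows as well
    return drag_event, mouse_event
```

When no drag is in progress, only the mouse movement event is redirected, which reduces to the plain cursor-movement case of method 1800.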
  • Figure 22 shows a cursor display method 2200 provided by this application. This method can be applied to the system architecture of the first electronic device as shown in Figure 4. The detailed steps are as follows:
  • the first electronic device displays the cursor on the first screen.
  • the connection method between the first electronic device and the second electronic device may be Bluetooth, Wi-Fi, or a wired connection.
  • the first electronic device projects the social App interface to the second electronic device.
  • the second electronic device displays the social App interface of the first electronic device, and the first electronic device can close the current social App interface. That is to say, when the first electronic device and the second electronic device are connected via screen projection, the interfaces displayed by the two electronic devices may be the same or different, which is not limited in this application.
  • the first screen is the screen of the first electronic device
  • the cursor is the display mode of the input device of the first electronic device on the screen.
  • S2202 In response to receiving the user's first operation, the first electronic device controls the cursor to move from the first screen to the second screen.
  • the second screen is the screen of the second electronic device.
  • the first screen and the second screen are different logical screens.
  • the first electronic device and the second electronic device are within a cursor display system.
  • the first electronic device controls the cursor to move from the first screen to the second screen. It can also be understood that when the cursor is displayed on the second screen, the cursor disappears from the first screen, thereby producing a visual effect of the cursor moving from the first screen to the second screen.
  • the first electronic device determines the identity of the second screen according to the first operation, and the identity of the second screen is different from the identity of the first screen.
  • the first electronic device determines the second screen as the target screen according to the identification of the second screen.
  • the first electronic device sends the input event to the second screen.
  • Input events include at least one of the following: a mouse movement event, a mouse click event, or a drag event.
  • the first electronic device sends the input event to the second screen, which can also be understood as the first electronic device sends the input event to the second electronic device.
  • the first electronic device responding to receiving the user's first operation includes: the first electronic device responding to receiving that the user moves the cursor in a preset path on the first screen; or the first electronic device responding to receiving the user's tap on the first screen or the second screen; or the first electronic device responding to receiving the user's system-setting operation on the first electronic device.
  • the first operation is used to express the user's intention.
  • the first electronic device responding to receiving that the user moves the cursor in a preset path on the first screen includes: the first electronic device responding to receiving that the user moves the cursor on the first screen to the edge of the first screen and continues to move; or the first electronic device responding to receiving that the user shakes the cursor in a preset shape on the first screen.
  • the first electronic device responding to receiving that the user moves the cursor on the first screen to the edge of the first screen and continues moving may be as shown in Figure 5; the first electronic device responding to receiving that the user shakes the cursor in a preset shape on the first screen may be as shown in Figure 6.
  • the first electronic device responding to receiving the user's tap on the first screen may be as shown in Figure 7.
  • the first electronic device responding to receiving the user's system-setting operation on the first electronic device may be as shown in Figure 8.
  • the first electronic device determines the positional relationship between the first electronic device and the second electronic device, and the first electronic device responds, according to the positional relationship, to receiving that the user moves the cursor to the edge of the first screen and continues moving. It should be understood that the positional relationship between the first electronic device and the second electronic device is also the positional relationship between the first screen and the second screen. In the cursor display system, electronic devices other than the first electronic device can also determine the positional relationship between the electronic devices. For example, the second electronic device determines the positional relationship between the first electronic device and the second electronic device.
  • the first electronic device determines the positional relationship between the first electronic device and the second electronic device in response to receiving the user's second operation.
  • the user's second operation is the user's setting operation.
  • the first electronic device senses the location of the second electronic device and determines the positional relationship between the first electronic device and the second electronic device.
  • the second screen may be located on the left or right side of the first screen, for example, as shown in Figure 9 or Figure 17.
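The role of the positional relationship can be sketched as follows: it selects which edge of the first screen hands the cursor over. The relation strings and helper names are illustrative assumptions.

```python
def transfer_edge(second_screen_side):
    """Map the second screen's side ('left' or 'right' of the first screen)
    to the edge of the first screen that triggers the hand-over."""
    if second_screen_side not in ("left", "right"):
        raise ValueError("unsupported positional relationship")
    # Hand over at the edge that faces the second screen.
    return second_screen_side

def should_hand_over(edge_hit, second_screen_side):
    """True when the cursor crossed the edge facing the second screen.

    edge_hit is 'left', 'right', or None (no edge crossed this event).
    """
    return edge_hit is not None and edge_hit == transfer_edge(second_screen_side)
```

Under this sketch, pushing the cursor past the left edge does nothing when the second screen sits on the right, which matches the behavior described above.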
  • the first electronic device determines the position of the cursor on the second screen in response to receiving the first operation from the user.
  • the first electronic device, in response to receiving the user's first operation, determines that the position of the cursor on the second screen corresponds to its position on the first screen; or, in response to receiving the user's first operation, the first electronic device determines a default position of the cursor on the second screen. It should be understood that the default position may be the center of the second screen.
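The two entry-position rules above (corresponding position versus default center) can be sketched as below. The proportional vertical mapping and the parameter names are assumptions for illustration; the patent only states that the position "corresponds" or defaults to the center.

```python
def entry_position(exit_y, first_height, second_width, second_height,
                   entered_from="left", use_default=False):
    """Return the (x, y) at which the cursor appears on the second screen.

    use_default=True  -> default position: the center of the second screen.
    use_default=False -> corresponding position: the exit height on the
                         first screen is scaled to the second screen, and
                         the cursor enters at the facing edge.
    """
    if use_default:
        return (second_width // 2, second_height // 2)
    y = round(exit_y * second_height / first_height)  # proportional mapping (assumed rule)
    x = 0 if entered_from == "left" else second_width - 1
    return (x, y)
```

For example, a cursor leaving halfway down a 1080-pixel-tall first screen enters halfway down a 1440-pixel-tall second screen.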
  • a drag object is displayed on the first screen, and in response to receiving the user's third operation on the drag object, the first electronic device sends an input event to the second electronic device, so that the second electronic device displays the effect of the drag object being dragged out from the first screen.
  • the user's third operation may be the user's drag operation, and the input event may be a drag event.
  • the first electronic device, in response to receiving the user's fourth operation on the drag object on the second screen, sends an input event to the second electronic device, so that the movement effect of the drag object is displayed on the second screen.
  • the user's fourth operation is the user's movement operation, and the input event may be a drag event.
  • the drag object includes at least one of the following: a file, an application icon, a text, a picture, or a window.
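The drag-event forwarding described above can be sketched as follows. The event fields, the `SecondDeviceStub`, and its `send()` method are illustrative assumptions standing in for the devices' screen-projection channel.

```python
from dataclasses import dataclass, field

@dataclass
class DragEvent:
    action: str   # "drag_start", "drag_move", or "drop" (assumed names)
    x: int        # cursor position on the target screen
    y: int
    payload: str = ""  # e.g. a file path, text, picture reference, or window id

@dataclass
class SecondDeviceStub:
    received: list = field(default_factory=list)

    def send(self, event):
        # Stand-in for transmitting the input event over the projection link.
        self.received.append(event)

def forward_drag(device, obj, path):
    """Replay a drag of `obj` along `path` (a list of (x, y) points)
    by sending one input event per point to the second device."""
    device.send(DragEvent("drag_start", *path[0], payload=obj))
    for x, y in path[1:-1]:
        device.send(DragEvent("drag_move", x, y, payload=obj))
    device.send(DragEvent("drop", *path[-1], payload=obj))
```

The second device, on receiving these events, would render the drag object moving across its screen, which is the effect the embodiments above describe.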
  • the screen on which the cursor is displayed can be changed dynamically, and input events can be sent to the device whose screen currently displays the cursor. Thus, the cursor can be moved freely between different screens, and input events can be transmitted between different screens. Users can freely move the cursor between multiple screens with a mouse, browse content on multiple screens at the same time, and drag content between multiple screens. This method can improve the efficiency of collaborative use of multiple electronic devices, thereby improving the user experience.
  • Embodiments of the present application provide a computer program product.
  • when the computer program product is run on a first electronic device, it causes the first electronic device to execute the technical solutions in the above embodiments.
  • the implementation principles and technical effects are similar to the above-mentioned method-related embodiments, and will not be described again here.
  • Embodiments of the present application provide a readable storage medium.
  • the readable storage medium contains instructions.
  • when the instructions are run on a first electronic device, the first electronic device executes the technical solutions of the above embodiments.
  • the implementation principles and technical effects are similar and will not be described again here.
  • Embodiments of the present application provide a chip.
  • the chip is used to execute instructions; when the chip runs, the technical solutions in the above embodiments are executed.
  • the implementation principles and technical effects are similar and will not be described again here.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling, direct coupling, or communication connection between components shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit.
  • if the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include: a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application relate to a cursor display method and an electronic device. The method is applied to a first electronic device, which is connected to a second electronic device in a screen-projection mode. The method comprises the following steps: the first electronic device displays a cursor on a first screen; and in response to receiving a first operation from a user, the first electronic device controls the cursor to move from the first screen to a second screen, the first screen and the second screen being different logical screens. According to the method, the first electronic device can move the cursor according to the user's intention, allowing the cursor to switch freely between the screens of different electronic devices, which helps the user handle tasks collaboratively across multiple devices and improves the user experience.
PCT/CN2023/096281 2022-05-31 2023-05-25 Cursor display method and electronic device WO2023231893A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210607324.6A CN117193583A (zh) 2022-05-31 2022-05-31 Cursor display method and electronic device
CN202210607324.6 2022-05-31

Publications (1)

Publication Number Publication Date
WO2023231893A1 true WO2023231893A1 (fr) 2023-12-07

Family

ID=89003938

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/096281 WO2023231893A1 (fr) 2022-05-31 2023-05-25 Procédé d'affichage de curseur et dispositif électronique

Country Status (2)

Country Link
CN (1) CN117193583A (fr)
WO (1) WO2023231893A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104049779A (zh) * 2014-06-25 2014-09-17 华东理工大学 Fast switching of a mouse pointer between multiple displays
CN105512086A (zh) * 2016-02-16 2016-04-20 联想(北京)有限公司 Information processing device and information processing method
CN112527174A (zh) * 2019-09-19 2021-03-19 华为技术有限公司 Information processing method and electronic device
CN114089901A (zh) * 2020-07-29 2022-02-25 华为技术有限公司 Cross-device object dragging method and device


Also Published As

Publication number Publication date
CN117193583A (zh) 2023-12-08


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23815076

Country of ref document: EP

Kind code of ref document: A1