WO2020244500A1 - Touch control method in a screen sharing scenario, and electronic apparatus

Touch control method in a screen sharing scenario, and electronic apparatus

Info

Publication number
WO2020244500A1
WO2020244500A1 (PCT/CN2020/093908)
Authority
WO
WIPO (PCT)
Prior art keywords
touch event
touch
source device
interface
display
Prior art date
Application number
PCT/CN2020/093908
Other languages
English (en)
Chinese (zh)
Inventor
魏曦
曹原
范振华
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2020244500A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • This application relates to the field of terminal technology, and in particular to a touch control method and an electronic device in a screen projection scenario.
  • By means of screen projection, an electronic device can switch and display multimedia data among multiple devices.
  • For example, when a user uses a video application in a mobile phone to watch a video, the mobile phone can be set as the source device, and the display interface of the source device can be sent to other destination devices that support the screen projection function for display.
  • If the user needs to operate the current display interface of the video application, he still needs to perform the corresponding operations on the mobile phone (i.e., the source device) to update the display data of the mobile phone, after which the mobile phone projects the updated display data to the destination device for display.
  • When the source device is not near the user, or it is inconvenient for the user to operate the source device, the user cannot control the display interface that is being projected, resulting in a poor user experience during screen projection.
  • To this end, the present application provides a touch control method and an electronic device for a screen projection scenario.
  • With this method, the destination device can receive and respond to control operations performed by the user on the screen projection interface, thereby improving the user's touch experience in the screen projection scenario.
  • In a first aspect, this application provides a touch control method in a screen projection scenario, including: a source device displays a first display interface; in response to a screen projection instruction input by a user, the source device projects N (N is an integer greater than 0) controls in the first display interface to a projection interface displayed by a first destination device; subsequently, if the source device receives a first touch event sent by the first destination device, the source device can execute an operation instruction corresponding to the first touch event.
  • In this way, the destination device can generate a touch event in response to a touch operation input by the user, and send the touch event to the source device to implement the corresponding function, realizing reverse control of the source device and thereby improving the user's touch experience in the screen projection scenario.
  • The aforementioned first touch event may include the coordinates of the touch point and the type of the touch event (for example, an event type such as a single click, a double tap, or a slide).
  • In a possible implementation, the method further includes: the source device determines a target control corresponding to the first touch event, where the target control is one of the aforementioned N controls. In this case, the operation instruction executed by the source device is the operation instruction corresponding to the target control being triggered on the source device.
  • For example, if the target control corresponding to the first touch event is a play button, the operation instruction corresponding to the first touch event is the operation instruction executed when the play button is triggered.
  • In a possible implementation, the aforementioned first touch event may be a touch event generated by the first destination device when the user inputs a first touch operation on the aforementioned projection interface. That is, the first touch operation is the touch operation actually input by the user during screen projection.
  • Alternatively, the above-mentioned first touch event may be obtained as follows: after the first destination device generates a touch event (for example, a fifth touch event) in response to a touch operation input by the user on the projection interface, the first destination device maps the fifth touch event to a touch event in the first display interface and sends the mapped touch event to the source device as the first touch event.
  • In a possible implementation, the source device may store a configuration file corresponding to the above-mentioned first display interface.
  • The configuration file records the display positions of the above-mentioned N controls in the first display interface and their display positions in the projection interface. In this case, the source device determining the target control corresponding to the first touch event specifically includes: the source device determines, according to the display positions of the N controls in the projection interface recorded in the configuration file, the target control corresponding to the first touch event.
  • For example, if the coordinates of the touch point in the first touch event fall within the display position of a first control in the projection interface, the source device may determine the first control as the target control.
  • If the touch point falls within the display positions of several overlapping controls, the source device can determine the first control located at the top layer as the target control.
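  • As an illustration only, this hit test might be implemented as in the following sketch, assuming the configuration file has been parsed into per-control records that hold the projected display positions (all class and field names here are hypothetical, not taken from the application):

```java
import java.util.List;

// Hypothetical record of one control parsed from the configuration file.
class ControlRecord {
    String id;
    // Display position of the control in the projection interface.
    int destLeft, destTop, destWidth, destHeight;
}

class TargetControlResolver {
    /**
     * Returns the top-most control whose projected area contains the touch point
     * (x, y), or null if the point hits none of the projected controls.
     * The list is assumed to be ordered from the bottom layer to the top layer.
     */
    static ControlRecord findTargetControl(List<ControlRecord> controlsBottomToTop, int x, int y) {
        ControlRecord target = null;
        for (ControlRecord c : controlsBottomToTop) {
            boolean inside = x >= c.destLeft && x < c.destLeft + c.destWidth
                          && y >= c.destTop  && y < c.destTop  + c.destHeight;
            if (inside) {
                target = c; // later entries sit on higher layers, so keep the last hit
            }
        }
        return target;
    }
}
```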
  • In a possible implementation, the method further includes: the source device maps the first touch event to a second touch event according to the above configuration file, where the second touch event is the touch event that the source device would generate if the user input a second touch operation on the target control in the first display interface. It should be noted that the user does not actually input the second touch operation in the first display interface; rather, the source device maps the first touch event to the second touch event corresponding to that second touch operation.
  • In this case, the source device executing the operation instruction corresponding to the first touch event means: the source device reports the mapped second touch event to the first application (the first display interface being displayed by the source device is an interface of the first application), so that the first application executes the operation instruction corresponding to the second touch event.
  • For example, if the coordinate of the touch point in the first touch event is A and the coordinate of the touch point in the mapped second touch event is B, then when the source device executes the operation instruction corresponding to the first touch event, what actually happens is that the first application responds to the second touch event whose coordinate is B.
  • In this way, the user's touch operation on the projection interface can reversely control the related application in the source device to implement the corresponding function.
  • In a possible implementation, the source device mapping the first touch event to the second touch event according to the above configuration file includes: the source device converts the first coordinate in the first touch event into a second coordinate according to the correspondence, recorded in the configuration file, between the first display position of the target control in the first display interface and its second display position in the projection interface, thereby obtaining the second touch event.
  • For example, the source device can reversely calculate the second coordinate in the first display interface corresponding to the above-mentioned first coordinate according to the translation, scaling, or rotation of the target control before and after screen projection recorded in the configuration file.
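  • As a concrete illustration of this reverse calculation, the sketch below maps a touch point from the projection interface back into the source device's display interface using the source and projected positions of one target control; rotation is omitted for brevity, and all names are hypothetical:

```java
// Minimal sketch: converts the first coordinate (in the projection interface) into the
// second coordinate (in the first display interface) for one target control.
class CoordinateMapper {
    // Position of the target control in the first display interface ("src"-style values).
    int srcLeft, srcTop, srcWidth, srcHeight;
    // Position of the same control in the projection interface ("dest"-style values).
    int destLeft, destTop, destWidth, destHeight;

    /** Maps a point inside the control's projected area back into source coordinates. */
    int[] mapToSource(int x, int y) {
        float scaleX = (float) srcWidth  / destWidth;  // inverse of the projection zoom on x
        float scaleY = (float) srcHeight / destHeight; // inverse of the projection zoom on y
        int srcX = srcLeft + Math.round((x - destLeft) * scaleX);
        int srcY = srcTop  + Math.round((y - destTop)  * scaleY);
        return new int[] { srcX, srcY };
    }
}
```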
  • In a possible implementation, after the source device determines the target control corresponding to the first touch event, it can also report the identifier of the target control and the event type of the first touch event to the first application, so that the first application executes a first function, that is, the function executed by the first application when the target control is triggered by the operation indicated by the event type. This also realizes the function of the user reversely controlling the related application in the source device from the projection interface.
  • In a possible implementation, the method further includes: the source device generates a third touch event according to the first display position of the target control in the first display interface recorded in the configuration file; the event type of the third touch event is the same as that of the first touch event, and the third coordinate in the third touch event is located within the first display position. In this case, the source device executing the operation instruction corresponding to the first touch event includes: the source device reports the third touch event to the first application. That is to say, the source device converts the user's first touch event on the projection interface into a third touch event on the first display interface, so the process of the source device responding to the third touch event is actually the process of the source device responding to the first touch event.
  • In the above implementations, the source device determines the target control of the first touch event and maps the first touch event to the second touch event. It is understandable that, after the destination device generates the first touch event, it can also determine the target control of the first touch event according to the above method and map the first touch event to the second touch event. Furthermore, the destination device can send the mapped second touch event to the source device, and the source device reports the second touch event to the first application, so that the first application executes the operation instruction corresponding to the second touch event, that is, the operation instruction corresponding to the first touch event.
  • In a possible implementation, after the source device displays the first display interface, the method further includes: in response to a second screen projection instruction input by the user, the source device projects M (M is an integer greater than 0) controls in the first display interface to a second destination device for display; the source device receives a fourth touch event sent by the second destination device; and the source device executes an operation instruction corresponding to the fourth touch event.
  • In a second aspect, this application provides a touch control method in a screen projection scenario, including: a destination device receives a first message sent by a first source device, where the first message includes a drawing instruction of a first target control, and the first target control is one or more controls in a first display interface displayed by the first source device; the destination device invokes the drawing instruction of the first target control to draw a projection interface, where the projection interface includes the first target control; in response to a first touch operation input by the user to the first target control on the projection interface, the destination device generates a first touch event; and the destination device instructs the first source device to execute an operation instruction corresponding to the first touch event.
  • In a possible implementation, the destination device instructing the first source device to execute the operation instruction corresponding to the first touch event includes: the destination device sends the first touch event to the first source device, so that the first source device executes the operation instruction corresponding to the first touch event. After receiving the first touch event, the first source device can execute the operation instruction corresponding to the first touch event according to the method in the first aspect.
  • In a possible implementation, after the destination device generates the first touch event, the method further includes: the destination device maps the first touch event to a second touch event, where the second touch event is the touch event that the first source device would generate if the user input a second touch operation on the first target control in the first display interface.
  • the method for the destination device to map the first touch event to the second touch event is the same as the method for the source device to map the first touch event to the second touch event in the first aspect.
  • In this case, the destination device instructing the first source device to execute the operation instruction corresponding to the first touch event includes: the destination device sends the mapped second touch event to the first source device, so that the first source device executes the operation instruction corresponding to the second touch event.
  • In a possible implementation, the above method further includes: the destination device receives a second message sent by a second source device, where the second message includes a drawing instruction of a second target control, and the second target control is one or more controls in a second display interface displayed by the second source device; the destination device calls the drawing instruction of the second target control to draw the second target control in the projection interface; in response to a third touch operation input by the user to the second target control in the projection interface, the destination device generates a third touch event; and the destination device instructs the second source device to execute an operation instruction corresponding to the third touch event.
  • In this way, the user can input touch operations on the controls projected from different source devices into the destination device, so as to control the corresponding source device to implement the function corresponding to the touch operation, thereby improving the user's touch experience in the screen projection scenario.
  • In a third aspect, the present application provides an electronic device, including: a touch screen, one or more processors, one or more memories, and one or more computer programs; the processor is coupled with the touch screen and the memory, the one or more computer programs are stored in the memory, and when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device executes the touch control method described in any one of the above aspects.
  • In a fourth aspect, the present application provides a computer storage medium including computer instructions, which, when run on an electronic device, cause the electronic device to execute the touch control method described in any one of the first aspect.
  • In a fifth aspect, this application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the touch control method described in any one of the first aspect.
  • In a sixth aspect, the present application provides a touch control system, which may include at least one source device and at least one destination device; the source device may be used to perform the touch control method in a projection scenario described in any one of the first aspect, and the destination device may be used to perform the touch control method in a projection scenario described in any one of the second aspect.
  • It can be understood that the electronic device described in the third aspect, the computer storage medium described in the fourth aspect, the computer program product described in the fifth aspect, and the touch control system described in the sixth aspect provided above are all used to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods, which will not be repeated here.
  • FIG. 1 is a first architecture diagram of a communication system provided by an embodiment of this application;
  • FIG. 2 is a second architecture diagram of a communication system provided by an embodiment of this application;
  • FIG. 3 is a first structural diagram of an electronic device provided by an embodiment of this application;
  • FIG. 4 is a structural diagram of an operating system in an electronic device provided by an embodiment of this application;
  • FIG. 5 is a first scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 6 is a second scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 7 is a third scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 8 is a fourth scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 9 is a fifth scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 10 is a sixth scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 11 is a seventh scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 12 is an eighth scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 13 is a ninth scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 14 is a tenth scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 15 is an eleventh scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 16 is a twelfth scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 17 is a thirteenth scene schematic diagram of a touch method in a projection scenario provided by an embodiment of this application;
  • FIG. 18 is a second structural diagram of an electronic device provided by an embodiment of this application.
  • A touch control method in a projection scenario provided by an embodiment of this application can be applied to a communication system 100, and the communication system 100 may include N (N>1) electronic devices.
  • the communication system 100 may include an electronic device 101 and an electronic device 102.
  • the electronic device 101 may be connected to the electronic device 102 through one or more communication networks 104.
  • The communication network 104 may be a wired network or a wireless network.
  • the aforementioned communication network 104 may be a local area network (local area networks, LAN), or a wide area network (wide area networks, WAN), such as the Internet.
  • the communication network 104 can be implemented using any known network communication protocol.
  • The above-mentioned network communication protocol can be any of various wired or wireless communication protocols, such as Ethernet, universal serial bus (USB), Firewire (FIREWIRE), Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time-Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Bluetooth, wireless fidelity (Wi-Fi), NFC, voice over Internet protocol (VoIP), a communication protocol supporting a network slicing architecture, or any other suitable communication protocol.
  • the electronic device 101 may establish a Wi-Fi connection with the electronic device 102 through a Wi-Fi protocol.
  • the electronic device 101 may be used as a source device
  • the electronic device 102 may be used as a destination device
  • the electronic device 101 may project the display content in its display interface to the electronic device 102 (ie, the destination device) for display.
  • the electronic device 102 can also be used as a source device, and the electronic device 102 projects the display content in its display interface to the electronic device 101 (that is, the destination device) for display.
  • the communication system 100 described above may further include one or more other electronic devices such as an electronic device 103.
  • the electronic device 103 may be a wearable device.
  • the electronic device 103 may also be used as a source device or a destination device for projection display.
  • both the electronic device 102 and the electronic device 103 can be used as the destination device of the electronic device 101.
  • the electronic device 101 can project the display content in its display interface to the electronic device 102 and the electronic device 103 for display at the same time.
  • one source device can simultaneously project to multiple destination devices.
  • both the electronic device 102 and the electronic device 103 can be used as the source device of the electronic device 101.
  • the electronic device 102 and the electronic device 103 can simultaneously project the display content in their display interfaces to the electronic device 101 for display.
  • a destination device can simultaneously receive and display the display content sent by multiple source devices.
  • any electronic device in the aforementioned communication system 100 can be used as a source device or a destination device, which is not limited in the embodiment of the present application.
  • the specific structures of the electronic device 101, the electronic device 102, and the electronic device 103 may be the same or different.
  • Each of the above electronic devices may specifically be a mobile phone, a tablet computer, a smart TV, a wearable electronic device, an in-vehicle device, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a virtual reality device, or the like; the embodiments of the present application do not impose any restrictions on this.
  • FIG. 3 shows a schematic structural diagram of the electronic device 101.
  • The electronic device 101 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 193, a display screen 194, and the like.
  • It can be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the electronic device 101.
  • the electronic device 101 may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided, the waiting time of the processor 110 is reduced, and the efficiency of the system is improved.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 101. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 101 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 101 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 101.
  • the mobile communication module 150 may include one or more filters, switches, power amplifiers, low noise amplifiers (LNA), etc.
  • The mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and then transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 101, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating one or more communication processing modules.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic wave radiation via the antenna 2.
  • the antenna 1 of the electronic device 101 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 101 can communicate with the network and other devices through wireless communication technology.
  • For example, the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 101 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 101 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 101 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • The ISP is used to process the data fed back from the camera 193. For example, when taking a photo, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the electronic device 101 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 101 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 101 may support one or more video codecs. In this way, the electronic device 101 can play or record videos in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 101.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store one or more computer programs, and the one or more computer programs include instructions.
  • the processor 110 can run the above-mentioned instructions stored in the internal memory 121 to enable the electronic device 101 to execute the methods provided in some embodiments of the present application, as well as various functional applications and data processing.
  • the internal memory 121 may include a storage program area and a storage data area. Among them, the storage program area can store the operating system; the storage program area can also store one or more application programs (such as a gallery, contacts, etc.) and so on.
  • the data storage area can store data (such as photos, contacts, etc.) created during the use of the electronic device 101.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, universal flash storage (UFS), etc.
  • the processor 110 executes the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor to cause the electronic device 101 to execute the method provided in the embodiments of the present application. And various functional applications and data processing.
  • the electronic device 101 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A also called a “speaker” is used to convert audio electrical signals into sound signals.
  • the electronic device 101 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the electronic device 101 answers a call or voice message, it can receive the voice by bringing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
  • The user can make a sound by moving the mouth close to the microphone 170C, so as to input the sound signal into the microphone 170C.
  • the electronic device 101 may be provided with one or more microphones 170C. In other embodiments, the electronic device 101 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In some other embodiments, the electronic device 101 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
  • the above electronic equipment may also include one or more components such as buttons, motors, indicators, and SIM card interfaces, which are not limited in the embodiment of the present application.
  • the above-mentioned software system of the electronic device 101 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes a layered Android system as an example to illustrate the software structure of the electronic device 101.
  • FIG. 4 is a block diagram of the software structure of the electronic device 101 according to an embodiment of the present application.
  • The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of applications.
  • The above-mentioned applications may include APPs (applications) such as call, contacts, camera, gallery, calendar, map, navigation, Bluetooth, music, video, and short message.
  • the application framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a view system, a notification manager, an activity manager, a window manager, a content provider, a resource manager, an input manager, and so on.
  • the view system can be used to construct the display interface of the application.
  • Each display interface can consist of one or more controls.
  • controls can include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • Multiple controls in the display interface can be organized hierarchically according to a tree structure to form a complete ViewTree (view tree).
  • the view system can draw the display interface according to the ViewTree of the display interface, and each control in the display interface corresponds to a set of drawing instructions, such as DrawLine, DrawPoint, DrawBitmap, etc.
  • FIG. 5 shows the chat interface 401 of the WeChat APP.
  • the bottommost control in the chat interface 401 is the root node.
  • A base map 402 control is provided on the root node.
  • On the base map 402, the following controls are further included: a title bar 403, a chat background 404, and an input bar 405.
  • the title bar 403 further includes a return button 406 and a title 407
  • the chat background 404 further includes an avatar 408 and a bubble 409
  • the input bar 405 further includes a voice input button icon 410, an input box 411, and a send button 412.
  • the above controls are layered in order to form a view tree A as shown in Figure 5(b).
  • the base map 402 is a child node of the root node
  • the title bar 403, the chat background 404, and the input field 405 are all child nodes of the base map 402.
  • Both the return button 406 and the title 407 are child nodes of the title bar 403.
  • Both the avatar 408 and the bubble 409 are child nodes of the chat background 404.
  • the voice input button icon 410, the input box 411, and the send button 412 are all child nodes of the input field 405.
  • the view system can call the drawing instructions of the corresponding controls layer by layer to draw each control according to the layer relationship between the controls in the view tree A, and finally form the chat interface 401.
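  • The layer-by-layer drawing can be pictured with the following simplified sketch; the ViewNode type is a stand-in used only for illustration, not the real Android view classes, and the draw instructions are reduced to plain callbacks:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for one control in a view tree.
class ViewNode {
    String name;                              // e.g. "base map 402", "title bar 403"
    Runnable drawInstruction;                 // stands in for DrawLine/DrawPoint/DrawBitmap calls
    List<ViewNode> children = new ArrayList<>();

    ViewNode(String name, Runnable drawInstruction) {
        this.name = name;
        this.drawInstruction = drawInstruction;
    }

    /** Draws this control first and then its children, so parents end up on lower layers. */
    void draw() {
        drawInstruction.run();
        for (ViewNode child : children) {
            child.draw();
        }
    }
}
```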
  • In the embodiment of this application, the view system can split, delete, or reorganize the controls in the view tree of the current display interface, so as to determine the one or more target controls that need to be projected to the destination device for display this time. Furthermore, the electronic device 101 can project the determined target controls to the destination device to form a projection interface, thereby adapting to device characteristics such as the display size of the destination device and improving the display effect and user experience on the destination device in the projection scenario.
  • After the electronic device 101 projects the target control in its display interface to the projection interface of the destination device (for example, the electronic device 102), the user can input a corresponding touch operation on the target control displayed in the projection interface of the electronic device 102, so as to control the electronic device 101 to implement the function corresponding to the touch operation.
  • Generally, the process by which an APP running in the application layer obtains a touch operation input by the user on the touch screen is a process of distributing messages from the bottom layer to the top layer.
  • After the user inputs a touch operation on the touch screen, the touch screen can obtain the relevant information of the touch operation (for example, the coordinates of the touch point), and then report the original touch event generated by the touch operation to the kernel layer in the form of an interrupt through the corresponding driver.
  • After receiving the original touch event, the kernel layer can encapsulate it into an advanced touch event that can be read by the upper layers (for example, an action down event, an action move event, or an action up event), and send the advanced touch event to the Framework layer.
  • the Framework layer can report the above-mentioned advanced touch events to the application process of the running application A in the application layer.
  • Furthermore, the application process of application A calls the corresponding library functions to determine the specific control acted on by the advanced touch event and the event type of the advanced touch event.
  • The event type may include single click, double tap, sliding, and the like. Take the user clicking a play button as an example: after determining that the control acted on is the play button and that the event type is a click, the process of application A can call the callback function corresponding to the touch event of clicking the play button, so as to realize the application function corresponding to this touch operation.
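  • For illustration, this final callback step might look like the following sketch in an ordinary Android application; the layout resource, control id, and player object are hypothetical and only show where the function bound to the click event is executed:

```java
import android.app.Activity;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.widget.ImageButton;

public class PlayerActivity extends Activity {
    private final MediaPlayer player = new MediaPlayer();

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player);          // hypothetical layout resource
        ImageButton playButton = findViewById(R.id.play);  // hypothetical control id
        // The framework resolves the high-level touch event (down + up on this control)
        // into a click and then invokes this callback in the application process.
        playButton.setOnClickListener(v -> {
            if (player.isPlaying()) {
                player.pause();
            } else {
                player.start();
            }
        });
    }
}
```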
  • a coordinate conversion module may be set in the application framework layer of the target device.
  • the original touch event reported by the touch screen of the target device to the kernel layer includes the coordinates (x, y) of the touch point, and the coordinates (x, y) are the user's touch position in the projection interface after the projection.
  • the touch point (x, y) in the advanced touch event reported by the kernel layer to the Framework layer is also the user's touch position in the projection interface.
  • After the Framework layer of the destination device receives the advanced touch event, its coordinate conversion module can map the coordinates (x, y) to the corresponding coordinates (x', y') in the display interface of the source device.
  • the destination device can send an advanced touch event carrying coordinates (x', y') to the source device, and the Framework layer of the source device reports the advanced touch event to the screen-projecting application.
  • After the application receives an advanced touch event whose touch point is (x', y'), it is equivalent to receiving a touch event generated by the user at the coordinates (x', y') in the source device; the application can then respond to the touch event carrying the coordinates (x', y') to realize the corresponding application function.
  • In other words, after the user inputs a touch operation on the projection interface, the destination device can generate a first touch event that carries the coordinates (x, y). Furthermore, the destination device may map the first touch event to a second touch event whose touch point is (x', y') in the display interface of the source device. In this way, after the source device receives the second touch event sent by the destination device, it can execute the corresponding application function in response to the second touch event, so as to realize the destination device's reverse control of the source device's display interface after screen projection.
  • the aforementioned coordinate conversion module can also be set in the Framework layer of the source device.
  • In this case, the destination device can send the first touch event whose touch point is (x, y) to the source device, and the coordinate conversion module of the source device maps the first touch event to the second touch event whose touch point is (x', y').
  • the embodiment of the present application does not impose any limitation on this.
  • The above description takes the conversion of the coordinates of the touch point in one touch event detected and generated by the touch screen as an example. It is understandable that when the user inputs a single-click, long-press, or sliding touch operation on the touch screen, the touch screen may detect a series of touch events. For each touch event, the destination device (or the source device) can convert the coordinates of the touch point in the touch event according to the above-mentioned method; the embodiment of the present application does not impose any limitation on this.
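  • Putting the destination-device steps together, a minimal sketch of intercepting the touch event on the projection view, converting the coordinates (reusing the CoordinateMapper sketched earlier), and handing the result to whatever channel carries it to the source device could look like this; the ProjectionChannel interface is hypothetical:

```java
import android.view.MotionEvent;
import android.view.View;

class ReverseControlBinder {
    /** Hypothetical transport that delivers touch events to the source device. */
    interface ProjectionChannel {
        void sendTouchEvent(int action, int srcX, int srcY);
    }

    static void bind(View projectionView, CoordinateMapper mapper, ProjectionChannel channel) {
        projectionView.setOnTouchListener((v, event) -> {
            // Map the touch point from projection-interface coordinates (x, y)
            // to source-interface coordinates (x', y') before forwarding.
            int[] src = mapper.mapToSource((int) event.getX(), (int) event.getY());
            channel.sendTouchEvent(event.getActionMasked(), src[0], src[1]);
            return true; // consume the event locally on the destination device
        });
    }
}
```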
  • the aforementioned activity manager can be used to manage the life cycle of each application.
  • Applications usually run in the operating system in the form of activity.
  • the activity manager can schedule the activity process of the application to manage the life cycle of each application.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include video, image, audio, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, etc.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, etc., which are not limited in the embodiment of the present application.
  • the mobile phone may project one or more controls in its display interface to the smart watch for display.
  • the playback interface 600 includes the following controls: a base map 601, a status bar 602, a title bar 603, an album cover 604, lyrics 605, and a control bar 606.
  • the status bar 602 includes controls such as time, signal strength, and battery capacity.
  • the title bar 603 includes controls such as the song name 6031 and the singer 6032.
  • the control bar 606 includes controls such as a progress bar 6061, a pause button 6062, a previous button 6063, and a next button 6064.
  • the mobile phone can obtain the corresponding view tree when the view system draws the aforementioned playback interface 600, and the drawing instructions and drawing resources of each control in the view tree.
  • FIG. 7 it is the view tree 701 of the above-mentioned playing interface 600.
  • the view tree 701 records the layer relationship between the various controls in the aforementioned playback interface 600.
  • The root node of the playback interface 600 has the base map 601 as a child node.
  • the status bar 602, the title bar 603, the album cover 604, the lyrics 605, and the control bar 606 are all sub-nodes of the base map 601.
  • The song name 6031 and the singer 6032 are child nodes of the title bar 603.
  • the progress bar 6061, the pause button 6062, the previous button 6063, and the next button 6064 are child nodes of the control bar 606.
  • Furthermore, the mobile phone can determine one or more controls (i.e., target controls) in the playback interface 600 that need to be projected to the smart watch for display.
  • a configuration file corresponding to the aforementioned playback interface 600 may be preset in the mobile phone.
  • the mobile phone may obtain the configuration file corresponding to the playing interface 600 from the server.
  • the configuration file records one or more controls (ie, target controls) that need to be projected onto the smart watch in the playback interface 600 of the mobile phone.
  • the above configuration file may be stored in a mobile phone or server in a format such as JSON (JavaScript Object Notation) format, XML (Extensible Markup Language) format, or text format, and the embodiment of the present application does not impose any limitation on this.
  • For example, the configuration file 1 corresponding to the playing interface 600 may be organized around "src" entries as described below; an illustrative sketch is given after the field descriptions.
  • The configuration file 1 contains multiple "src" fields (for example, a "src1" field and a "src2" field).
  • Each "src” field records the specific position of a control in the playback interface 600.
  • The position of each control can be uniquely determined by the values of four parameters: left, top, width, and height, where left is the x-axis coordinate of the control's top left corner, top is the y-axis coordinate of the control's top left corner, width is the width of the control, and height is the height of the control.
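  • Based on the four parameters just described, a hypothetical "src" entry of configuration file 1 might look like the following in the JSON form mentioned above; the control name and the numeric values are illustrative only and are not taken from the application:

```json
{
  "src1": {
    "control": "pause button 6062",
    "left": 400,
    "top": 1600,
    "width": 120,
    "height": 120
  }
}
```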
  • the one or more controls recorded in the configuration file 1 are the target controls that the mobile phone needs to project to the smart watch.
  • the mobile phone can identify, based on the view tree 701, the target controls in the playback interface 600 that need to be projected to the smart watch 500.
  • the target controls include: the song name 6031 and the singer 6032 in the title bar 603; the pause button 6062, the previous button 6063, and the next button 6064 in the control bar 606; and the album cover 604.
  • the configuration file 1 may also record the specific display position of the target control on the screen projection interface after the screen is projected.
  • a "dest1" field corresponding to the "src1" field can be set in the above configuration file 1, and the "dest1" field is used to indicate the display position of the control 1 in the target device.
  • the "dest1" field is as follows:
  • the mobile phone can determine the display position of each target control in the playback interface 600 on the screen projection interface of the smart watch according to the respective "dest” fields in the configuration file 1.
  • the playback interface 600 of the mobile phone (i.e., the source device) is located in a first coordinate system; before the screen is projected, the control 1 recorded in the "src1" field is located in area 801 of the first coordinate system.
  • the screen projection interface of the smart watch (i.e., the destination device) is located in a second coordinate system; after the projection, the control 1 recorded in the "dest1" field is located in area 802 of the second coordinate system.
  • any position in the area 801 uniquely corresponds to a position in the area 802.
  • the configuration file 1 can also record the change of the display position of the target control before and after the screen is projected.
  • for example, the following fields may also be set for control 1 in configuration file 1 (an illustrative sketch is given after the field descriptions below).
  • the "translationx" field and the "translationy" field are used to indicate the translation distances of control 1 on the x-axis and the y-axis after projection; the "scalex" field and the "scaley" field are respectively used to indicate the zoom ratios of control 1 on the x-axis and the y-axis after projection; the "rotatedegree" field is used to indicate the rotation angle of control 1 after projection; and the "order" field is used to indicate the layer position of control 1 after projection (for example, whether it is in the bottom layer or the top layer).
  • based on the change in the display position of control 1 before and after the screen projection recorded in the above fields, the mobile phone can also determine the display position of control 1 on the screen projection interface of the smart watch after the screen is projected; that is, the correspondence between the position of control 1 in the first coordinate system and its position in the second coordinate system is determined.
  • the view tree 701 of the playback interface 600 can be split, cropped, and reorganized to generate the view tree used for the screen projection interface (a sketch of such a reorganization follows below).
  • the mobile phone deletes the nodes in the view tree 701 that are not target controls, such as the above-mentioned base map 601, the status bar 602 and the controls in the status bar 602, and the progress bar 6061 in the control bar 606.
  • the mobile phone can then set the song name 6031 and the singer 6032 in the title bar 603 as child nodes of the album cover 604, and also set the pause button 6062, the previous button 6063, and the next button 6064 in the control bar 606 as child nodes of the album cover 604.
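The splitting and reorganization described above can be sketched in code. The following Kotlin fragment is not the patent's implementation; the ViewNode class, the reorganize function, and the node ids used in main are illustrative assumptions showing how non-target nodes could be pruned and the remaining controls re-parented under the album cover.

```kotlin
// Minimal view-tree model: each node has a control id and a list of child nodes.
data class ViewNode(val id: String, val children: MutableList<ViewNode> = mutableListOf())

// Keep only the nodes whose ids are in targetIds and re-parent them under a new root node.
fun reorganize(root: ViewNode, targetIds: Set<String>, newRootId: String): ViewNode {
    val kept = mutableListOf<ViewNode>()
    fun collect(node: ViewNode) {
        if (node.id in targetIds && node.id != newRootId) kept.add(ViewNode(node.id)) // copy without old children
        node.children.forEach { collect(it) }                                          // walk the whole tree
    }
    collect(root)
    return ViewNode(newRootId, kept)   // e.g. hang the remaining controls under the album cover 604
}

fun main() {
    // A simplified version of view tree 701 from the description above.
    val playbackInterface = ViewNode("600", mutableListOf(
        ViewNode("601", mutableListOf(
            ViewNode("602"),
            ViewNode("603", mutableListOf(ViewNode("6031"), ViewNode("6032"))),
            ViewNode("604"), ViewNode("605"),
            ViewNode("606", mutableListOf(ViewNode("6061"), ViewNode("6062"), ViewNode("6063"), ViewNode("6064")))
        ))
    ))
    val targets = setOf("604", "6031", "6032", "6062", "6063", "6064")
    println(reorganize(playbackInterface, targets, newRootId = "604"))
}
```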
  • subsequently, the mobile phone (that is, the source device) can send a UI message corresponding to the playback interface 600 to the smart watch, where the UI message includes the view tree 901 as well as the drawing instructions and drawing resources related to each control in the view tree 901.
  • after the smart watch receives the UI message corresponding to the above-mentioned playback interface 600, it can call the drawing instructions of each target control in turn according to the level and order in the view tree 901, and draw each target control at the position specified in the configuration file 1. Finally, as shown in (b) of FIG. 9, the smart watch can draw the screen projection interface 902 obtained after the above-mentioned playback interface 600 is projected. Each control in the projection interface 902 corresponds one-to-one to a control in the view tree 901.
  • in this way, when the mobile phone projects the above-mentioned playback interface 600 onto the smart watch, it can split, delete, and reorganize the controls in the playback interface 600, so that the final screen projection interface 902 in the smart watch is adapted to the display size of the smart watch's screen and the user's usage requirements, thereby improving the display effect and user experience during screen projection between multiple devices.
  • the user can input a corresponding touch operation on each projected target control in the screen projection interface 902; the smart watch can generate a corresponding touch event in response to the touch operation, and further, the smart watch can control the mobile phone to implement the function corresponding to the touch event.
  • the touch event may include the coordinates of the touch point and the event type of the touch event (for example, single click, double tap, slide, etc.).
  • if the user wants the music APP to pause the song being played, he can click the pause button 6062 in the screen projection interface 902.
  • if the user wants the music APP to play the previous song, he can click the previous button 6063 in the screen projection interface 902.
  • if the user wants the music APP to play the next song, he can click the next button 6064 in the screen projection interface 902.
  • the screen projection interface 902 displayed by the smart watch is located in the second coordinate system.
  • the touch sensor of the smart watch can detect the touch operation input by the user on the screen projection interface 902 in real time.
  • the touch sensor of the smart watch can encapsulate the detected touch information (for example, the coordinate information of touch point A, the touch time, etc.) as a first touch event, and report the first touch event to the kernel layer of the smart watch.
  • the first touch event is generated by the smart watch in response to the first touch operation of the user clicking the pause button 6062 in the projection interface 902.
  • specifically, the touch sensor can encapsulate the detected touch operation as a first original touch event and report it to the kernel layer through the driver; the kernel layer then encapsulates the original touch event as a first advanced touch event that the upper layer can read, and reports it to the application framework layer.
  • after the application framework layer receives the first advanced touch event carrying the coordinates A(x, y), it can determine which control the user's current touch operation targets, according to the display position of each control in the projection interface 902 recorded in the configuration file 1.
  • for example, the "dest1" field in the aforementioned configuration file 1 records that the pause button 6062 is located in area 1 of the projection interface 902; then, when the coordinates A(x, y) fall into area 1, the smart watch can determine that the target control acted on by the user's current touch operation is the pause button 6062.
  • in some cases, the coordinates A(x, y) may fall within the areas of two controls at the same time.
  • the coordinates A(x, y) are located in the area where the pause button 6062 is located, and are also located in the area where the album cover 604 is located.
  • the smart watch can determine the uppermost control as the target control for the user's current touch operation according to the "order" field recorded in the configuration file 1.
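A minimal sketch of this hit-testing step, assuming each "dest" entry carries a rectangle in the projection interface plus the "order" value described above (the class and function names are illustrative, not taken from the patent):

```kotlin
// One projected control: its id, its on-watch rectangle, and its layer order (higher = on top).
data class DestEntry(val controlId: String, val left: Int, val top: Int,
                     val width: Int, val height: Int, val order: Int)

fun DestEntry.contains(x: Int, y: Int) =
    x in left until left + width && y in top until top + height

// Return the top-most control whose destination area contains the touch point, if any.
fun hitTest(entries: List<DestEntry>, x: Int, y: Int): DestEntry? =
    entries.filter { it.contains(x, y) }.maxByOrNull { it.order }

fun main() {
    val entries = listOf(
        DestEntry("albumCover604", 0, 0, 200, 200, order = 0),
        DestEntry("pauseButton6062", 80, 120, 60, 60, order = 1)
    )
    println(hitTest(entries, 100, 140)?.controlId)  // pauseButton6062: top-most of the overlapping areas
}
```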
  • take the user clicking the pause button 6062 in the screen projection interface 902 as an example.
  • after the application framework layer determines that this first touch event is a touch event on the pause button 6062, it can restore the touch point A' corresponding to the touch point A in the first coordinate system of the mobile phone (that is, the source device), according to the positional relationship of the pause button 6062 before and after the screen projection recorded in the configuration file 1.
  • if the pause button 6062 in the second coordinate system is formed by translating, zooming, or rotating the pause button 6062 in the first coordinate system, then, when the smart watch restores the touch point A' corresponding to the touch point A, it can perform the corresponding reverse translation, reverse zoom, or reverse rotation on the coordinates A(x, y), so as to restore the corresponding coordinates in the first coordinate system of the mobile phone.
  • for example, when the playback interface 600 is displayed in the first coordinate system where the mobile phone (i.e., the source device) is located, the coordinates of point A' in the pause button 6062 are A'(100, 20).
  • when projecting the pause button 6062, as shown in (b) of FIG. 11, the pause button 6062 is translated 20 units in the negative direction on the x-axis and 30 units in the positive direction on the y-axis, and the pause button 6062 is enlarged by 1.5 times.
  • then, in the second coordinate system, the coordinates of point A corresponding to point A' on the pause button 6062 are A((100-20)*1.5, (20+30)*1.5), that is, A(120, 75).
  • correspondingly, after detecting the first touch event at point A, the smart watch can reduce the coordinates of point A on the x-axis and the y-axis by a factor of 1.5, then translate the x-coordinate by 20 units and the y-coordinate by 30 units in the directions opposite to the projection translations, thereby obtaining the coordinates A'(100, 20) in the first coordinate system that correspond to the coordinates A(120, 75).
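A minimal sketch of this forward and reverse coordinate conversion, assuming (as in the numeric example above) that a control is first translated and then scaled when it is projected; the class and function names are illustrative only, not the patent's implementation:

```kotlin
// Per-control mapping parameters recorded in the configuration file (illustrative field names).
data class Mapping(val translationX: Double, val translationY: Double,
                   val scaleX: Double, val scaleY: Double)

// Source (first coordinate system) -> projection interface (second coordinate system).
fun toProjection(x: Double, y: Double, m: Mapping) =
    Pair((x + m.translationX) * m.scaleX, (y + m.translationY) * m.scaleY)

// Projection interface -> source: undo the scaling, then undo the translation.
fun toSource(x: Double, y: Double, m: Mapping) =
    Pair(x / m.scaleX - m.translationX, y / m.scaleY - m.translationY)

fun main() {
    // Pause button 6062: translated 20 units in negative x, 30 units in positive y, then enlarged 1.5x.
    val m = Mapping(translationX = -20.0, translationY = 30.0, scaleX = 1.5, scaleY = 1.5)
    println(toProjection(100.0, 20.0, m))  // (120.0, 75.0)  == touch point A
    println(toSource(120.0, 75.0, m))      // (100.0, 20.0)  == restored touch point A'
}
```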
  • in other words, if the translation distance of the pause button 6062 on the x-axis and the y-axis is recorded in the configuration file 1, the smart watch can reversely translate the touch point A according to that translation distance.
  • if the zoom ratio of the pause button 6062 on the x-axis and the y-axis is recorded in the configuration file 1, the smart watch can reversely scale the touch point A according to that zoom ratio.
  • if the rotation angle of the pause button 6062 is recorded in the configuration file 1, the smart watch can reversely rotate the touch point A according to that rotation angle.
  • alternatively, the smart watch may also preset a coordinate mapping formula between the first coordinate system and the second coordinate system. In this way, after the smart watch obtains the touch point A of this touch event, it can calculate the touch point A' corresponding to the touch point A in the first coordinate system of the mobile phone according to the coordinate mapping formula.
  • after the smart watch restores the touch point A' in the playback interface 600 corresponding to the touch point A on the screen projection interface 902, it can replace the coordinates A(x, y) of the touch point A in the first touch event with the coordinates A'(x', y') of the touch point A', thereby forming a second touch event.
  • the second touch event refers to the touch event that the mobile phone would generate when the user inputs a second touch operation of clicking the pause button 6062 in the playback interface 600.
  • of course, the user does not actually click the pause button 6062 in the playback interface 600; rather, by converting touch point A into touch point A', the smart watch simulates the second touch operation of the user clicking the pause button 6062 in the playback interface 600.
  • the smart watch can send the aforementioned second touch event to the mobile phone.
  • the application framework layer of the mobile phone can report the second touch event to the music APP running in the application layer, so that the music APP can pause the audio being played in response to the second touch event at point A'. It is understandable that the music APP can respond to the second touch event of point A', which is equivalent to the music APP responding to the user's first touch event of point A on the screen projection interface 902.
  • the user inputs the first touch operation at point A(x, y) in the projection interface 902 of the target device, and the target device generates the first touch event corresponding to the first touch operation.
  • the destination device (or the source device) then performs coordinate conversion on the coordinates of the touch point in the first touch event and generates a second touch event, so that, based on the second touch event, the music APP in the source device considers that the user has performed the second touch operation at point A'(x', y') of the playback interface 600.
  • the music APP can execute the corresponding application function in response to the second touch event, so as to realize reverse control of the source device by the destination device during screen projection.
  • alternatively, the smart watch (that is, the destination device) can also send the touch event carrying point A(x, y) to the mobile phone (that is, the source device); the application framework layer in the mobile phone then restores point A(x, y) to point A'(x', y') in the playback interface 600 according to the above method, and reports the touch event whose touch point is A'(x', y') to the music APP in the phone, so as to achieve the function of pausing audio playback.
  • the identification of each control may also be recorded in the aforementioned configuration file 1.
  • for example, the control corresponding to the "dest1" field is the pause button 6062, and the identification of the pause button 6062 is 001.
  • in this case, the smart watch (i.e., the destination device) can determine that the user has performed a click operation on the pause button 6062 in the screen projection interface 902, based on the coordinates of the touch points and the touch times in the series of touch events it detects.
  • the smart watch may send the identification of the pause button 6062 (for example, 001) and the determined type of touch event (for example, a click operation) to the mobile phone (that is, the source device).
  • in this way, the mobile phone can determine that the user has performed a click on the pause button 6062. The application framework layer in the mobile phone can then report the event of the user clicking the pause button 6062 to the running music APP, so that the music APP can invoke the function corresponding to the pause button 6062 to pause the audio playback, that is, execute the operation instruction corresponding to the first touch event.
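In this variant the destination device does not need to send any coordinates; a minimal sketch of the message it might send to the source device (the field names and the JSON framing are assumptions for illustration only) is:

```json
{
  "controlId": "001",
  "eventType": "click",
  "touchTime": 1591329600000
}
```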
  • in still other embodiments, after the smart watch (that is, the destination device) determines that the user has performed a click operation on the pause button 6062 in the screen projection interface 902, it can also generate a corresponding touch event (for example, a third touch event) based on the specific position of the pause button 6062 in the playback interface 600 recorded in the configuration file 1.
  • the event type of the third touch event is the same as the event type of the first touch event, and both are click events.
  • the coordinate B of the touch point in the third touch event can be located at any position of the pause button 6062 in the play interface 600.
  • the mobile phone can then report the third touch event to the music APP running in the application layer, so that the music APP can pause the audio being played in response to the third touch event at point B.
  • the music APP responding to the third touch event of point B is equivalent to the music APP responding to the user's first touch event of point A on the screen projection interface 902.
  • the user can also input the corresponding touch operation in the playback interface 600 displayed on the mobile phone (ie, the source device).
  • after the mobile phone detects the touch event corresponding to the touch operation, it can report the touch event to the music APP without needing to convert the coordinates of the touch point, so as to realize the corresponding application function.
  • in other words, the user can input touch operations on the source device to control the source device to achieve corresponding functions, or input touch operations on the destination device to control the source device to achieve corresponding functions, thereby improving the user's touch experience in the screen projection scenario.
  • the source device can continue to use the above screen projection method to project the updated display interface to the destination device for display.
  • the embodiment does not impose any limitation on this.
  • the user can project the display content in one source device to multiple different destination devices for display. Then, according to the aforementioned touch control method, the user can input a corresponding touch operation in each target device to control the source device to implement related application functions.
  • the playback interface 1201 being displayed by the video APP can be simultaneously projected to two destination devices.
  • one destination device is a smart watch
  • the other destination device is a smart TV.
  • according to the configuration file 1 corresponding to the smart watch, the mobile phone can recognize that the first target controls in the playback interface 1201 that need to be projected to the smart watch are: the control bar 1205 and the controls 1206, 1207, and 1208 in the control bar 1205. Furthermore, as shown in FIG. 12, the mobile phone can project the control bar 1205 and the controls in the control bar 1205 to the smart watch to form a first projection interface 1301.
  • similarly, the mobile phone can recognize that the second target controls in the playback interface 1201 that need to be projected to the smart TV are: the video screen 1202 and the text control 1203 and the progress bar 1204 in the video screen 1202. Furthermore, as still shown in FIG. 12, the mobile phone can project the video screen 1202 and the controls in the video screen 1202 to the smart TV to form a second screen projection interface 1302.
  • the user can input a touch operation in the first projection interface 1301 to control the video APP running in the mobile phone (ie, the source device).
  • the user can also input a touch operation in the second screen projection interface 1302 to control the video APP running in the mobile phone (ie, the source device).
  • for example, when the user performs a touch operation in the first projection interface 1301, the smart watch may generate a first touch event including the touch point P1.
  • the smart watch can convert the touch point P1 in the first screen projection interface 1301 to the touch point P1' in the playback interface 1201 according to the positional relationship of the pause button 1106 recorded in the configuration file 1 before and after the screen projection.
  • the smart watch can send the second touch event including the touch point P1' to the mobile phone, so that the video APP in the mobile phone can execute an instruction to pause the video in response to the second touch event with the touch point P1'.
  • similarly, when the user performs a touch operation in the second screen projection interface 1302, the smart TV (i.e., the second destination device) can generate a first touch event that includes the touch point P2.
  • furthermore, the smart TV can convert the touch point P2 in the second screen projection interface 1302 into the touch point P2' in the aforementioned playback interface 1201 according to the positional relationship of the progress bar 1104 recorded in the configuration file 2 before and after the screen projection.
  • the smart TV can send the second touch event including the touch point P2' to the mobile phone, so that the video APP in the mobile phone can respond to the second touch event with the touch point P2' to switch the video to the point P2' on the progress bar 1104 Play at the corresponding position.
  • the mobile phone can respond to each touch event in sequence according to the time sequence of receiving each touch event.
  • when each destination device detects a touch operation input by the user, it may also record the touch time of the touch operation.
  • when a destination device sends the corresponding touch event to the source device, it can also send the touch time of that touch event. In this way, the source device can respond to the touch events sent by different destination devices in sequence according to the order of their touch times.
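A minimal sketch of how the source device might order incoming touch events by touch time before responding to them (the data class and function are illustrative assumptions, not the patent's implementation):

```kotlin
// A touch event received from some destination device, stamped with the time the touch occurred.
data class RemoteTouchEvent(val fromDevice: String, val x: Int, val y: Int, val touchTimeMillis: Long)

// Dispatch events in the order the user actually touched, not the order they arrived over the network.
fun dispatchInTouchOrder(received: List<RemoteTouchEvent>, handle: (RemoteTouchEvent) -> Unit) {
    received.sortedBy { it.touchTimeMillis }.forEach(handle)
}

fun main() {
    val events = listOf(
        RemoteTouchEvent("smartTV", 300, 40, touchTimeMillis = 2_000),
        RemoteTouchEvent("smartWatch", 120, 75, touchTimeMillis = 1_000)
    )
    dispatchInTouchOrder(events) { println("${it.touchTimeMillis}: from ${it.fromDevice}") }
}
```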
  • the user can input a touch operation in any destination device to reversely control the source device to realize the control function corresponding to the touch operation, thereby improving the user's touch experience in the screen projection scenario.
  • the user can project the display content from multiple source devices to the same destination device for display. Then, according to the above touch method, after the user inputs a corresponding touch operation on a certain control in the target device, the user can control the source device corresponding to the control to implement related application functions.
  • the user can simultaneously use the mobile phone and the smart watch as the source device of the smart TV (that is, the destination device).
  • the mobile phone can project the display content in the lock screen interface 1501 being displayed to the smart TV
  • the smart watch can project the display content in the detection interface 1502 being displayed to the smart TV.
  • the smart TV can also display its own display content.
  • according to the configuration file 1 corresponding to the lock screen interface 1501, the mobile phone can identify the first target controls in the lock screen interface 1501 that need to be projected to the smart TV as: the icon 1512 and the message content 1513 in the notification message 1511.
  • similarly, according to the configuration file 2 corresponding to the detection interface 1502, the smart watch can identify the second target controls in the detection interface 1502 that need to be projected to the smart TV as: the heart rate information 1521 and the calorie information 1522.
  • after the smart TV receives the first target controls sent by the mobile phone and the second target controls sent by the smart watch, it can split and reorganize the first target controls, the second target controls, and the control 1503 in its own display interface. Furthermore, as still shown in FIG. 15, the smart TV may display the above-mentioned first target controls, second target controls, and control 1503 in the projection interface 1504. In this way, the destination device can simultaneously display the display content of multiple source devices.
  • the user can input touch operations on the corresponding controls in the projection interface 1504. If the user inputs a touch operation on the first target control in the projection interface 1504, the mobile phone (ie, the first source device) can be controlled to implement the corresponding function. If the user inputs a touch operation on the second target control in the projection interface 1504, the smart watch (ie, the second source device) can be controlled to implement the corresponding function.
  • for example, when the user performs a touch operation on the message content 1513 in the projection interface 1504, the smart TV may generate a first touch event including the touch point Q1. Since the message content 1513 belongs to the first target controls projected by the mobile phone (that is, the first source device), the smart TV can convert the touch point Q1 in the projection interface 1504 into the touch point Q1' in the lock screen interface 1501 according to the positional relationship of the message content 1513 recorded in the configuration file 1 before and after the screen projection.
  • furthermore, the smart TV can send the second touch event including the touch point Q1' to the mobile phone, so that the mobile phone can expand the message content 1513 in response to the second touch event with the touch point Q1'.
  • similarly, when the user performs a touch operation on the heart rate information 1521 in the projection interface 1504, the smart TV (i.e., the destination device) may generate a first touch event including the touch point Q2.
  • since the heart rate information 1521 belongs to the second target controls projected by the smart watch (that is, the second source device), the smart TV can convert the touch point Q2 in the projection interface 1504 into the touch point Q2' in the detection interface 1502 according to the positional relationship of the heart rate information 1521 recorded in the configuration file 2 before and after the screen projection.
  • furthermore, the smart TV can send the second touch event including the touch point Q2' to the smart watch, so that the smart watch can display the detailed content of the heart rate information 1521 in response to the second touch event with the touch point Q2'.
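A minimal sketch of how the destination device might decide which source device a touch event should be forwarded to, based on which projected control contains the touch point (the class and names are illustrative assumptions, not taken from the patent):

```kotlin
// A projected control in the destination's projection interface, tagged with the source device that owns it.
data class ProjectedControl(val controlId: String, val sourceDevice: String,
                            val left: Int, val top: Int, val width: Int, val height: Int)

fun ProjectedControl.contains(x: Int, y: Int) =
    x in left until left + width && y in top until top + height

// Find the control under the touch point and return the source device the event should be forwarded to.
fun routeTouch(controls: List<ProjectedControl>, x: Int, y: Int): String? =
    controls.firstOrNull { it.contains(x, y) }?.sourceDevice

fun main() {
    val projectionInterface = listOf(
        ProjectedControl("messageContent1513", "mobilePhone", 0, 0, 400, 120),
        ProjectedControl("heartRate1521", "smartWatch", 0, 200, 400, 120)
    )
    println(routeTouch(projectionInterface, 50, 250))  // smartWatch
}
```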
  • an electronic device in a certain conference site can be used as a destination device, and various electronic devices in other conference sites can be used as a source device.
  • Each source device can project the target control to the target device for display according to the above method.
  • the user can input a corresponding control operation to the target control in the target device, thereby controlling the corresponding source device to respond to the control operation to implement reverse control during screen projection.
  • students can install the teaching assistant APP on their mobile phones or computers or tablets.
  • their electronic devices can be used as source devices to project the display content of the answer area to the teacher's mobile phone or computer or tablet for display.
  • the teacher can not only preview the answering process of multiple students in their respective answering areas in real time, but also remotely control the student's source device in their own electronic equipment, help students solve problems online, and improve the teaching effect of the teaching assistant APP.
  • the embodiment of the present application discloses an electronic device including a processor, and a memory, an input device, an output device, and a communication module connected to the processor.
  • the input device and the output device can be integrated into one device.
  • a touch sensor can be used as an input device
  • a display screen can be used as an output device
  • the touch sensor and display screen can be integrated into a touch screen.
  • the above electronic device may include: a touch screen 1801, which includes a touch sensor 1806 and a display screen 1807; one or more processors 1802; a memory 1803; a communication module 1808; one or more application programs (not shown); and one or more computer programs 1804.
  • the above-mentioned devices can be connected through one or more communication buses 1805.
  • the one or more computer programs 1804 are stored in the aforementioned memory 1803 and configured to be executed by the one or more processors 1802. The one or more computer programs 1804 include instructions, and the instructions can be used to execute the steps in the aforementioned embodiments. All relevant content of the steps involved in the above method embodiments can be cited in the functional description of the corresponding physical device, which will not be repeated here.
  • the foregoing processor 1802 may specifically be the processor 110 shown in FIG. 3, the foregoing memory 1803 may specifically be the internal memory 121 and/or the external memory 120 shown in FIG. 3, and the foregoing display screen 1807 may specifically be the display screen shown in FIG. 3.
  • the touch sensor 1806 may be the touch sensor in the sensor module 180 shown in FIG. 3
  • the communication module 1808 may be the mobile communication module 150 and/or the wireless communication module 160 shown in FIG. 3. The embodiment of this application does not impose any restriction on this.
  • the functional units in the various embodiments of the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • a computer readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: flash memory, mobile hard disk, read-only memory, random access memory, magnetic disk or optical disk and other media that can store program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the technical field of terminals, and provides a touch control method in a screen projection scenario and an electronic apparatus, which enable a destination device to receive and respond to a control operation performed by a user on a screen projection interface, thereby improving the user's touch control experience in a screen projection scenario. The method comprises: a source device displays a first display interface; in response to a screen projection instruction input by the user, the source device projects N controls of the first display interface onto a screen projection interface displayed by a first destination device, N being an integer greater than 0; the source device receives a first touch event sent by the first destination device; and the source device executes an operation instruction corresponding to the first touch event.
PCT/CN2020/093908 2019-06-05 2020-06-02 Procédé de commande tactile dans un scénario de partage d'écran, et appareil électronique WO2020244500A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910487623.9A CN110377250B (zh) 2019-06-05 2019-06-05 一种投屏场景下的触控方法及电子设备
CN201910487623.9 2019-06-05

Publications (1)

Publication Number Publication Date
WO2020244500A1 true WO2020244500A1 (fr) 2020-12-10

Family

ID=68249812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/093908 WO2020244500A1 (fr) 2019-06-05 2020-06-02 Procédé de commande tactile dans un scénario de partage d'écran, et appareil électronique

Country Status (2)

Country Link
CN (1) CN110377250B (fr)
WO (1) WO2020244500A1 (fr)

Families Citing this family (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111666119B (zh) * 2019-03-06 2023-11-21 华为终端有限公司 Ui组件显示的方法及电子设备
CN110377250B (zh) * 2019-06-05 2021-07-16 华为技术有限公司 一种投屏场景下的触控方法及电子设备
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备
CN112995727A (zh) * 2019-12-17 2021-06-18 华为技术有限公司 一种多屏协同方法、系统及电子设备
CN113014614A (zh) * 2019-12-20 2021-06-22 青岛海信移动通信技术股份有限公司 一种设备控制方法、控制设备及被控设备
CN111176771A (zh) * 2019-12-24 2020-05-19 西安万像电子科技有限公司 数据处理方法、系统及设备
CN114157756A (zh) * 2020-08-20 2022-03-08 华为技术有限公司 任务处理方法及相关电子设备
CN111399789B (zh) * 2020-02-20 2021-11-19 华为技术有限公司 界面布局方法、装置及系统
CN111414097A (zh) * 2020-03-23 2020-07-14 维沃移动通信有限公司 一种交互方法、交互系统和显示设备
CN116795267A (zh) 2020-05-29 2023-09-22 华为技术有限公司 一种内容分享的方法、装置及系统
CN111880870B (zh) * 2020-06-19 2024-06-07 维沃移动通信有限公司 控制电子设备的方法、装置和电子设备
CN111970546A (zh) * 2020-07-21 2020-11-20 腾讯科技(深圳)有限公司 一种控制终端交互的方法、装置、电子设备和存储介质
CN114071207B (zh) * 2020-07-30 2023-03-24 华为技术有限公司 控制大屏设备显示的方法、装置、大屏设备和存储介质
CN112035048B (zh) * 2020-08-14 2022-03-25 广州视源电子科技股份有限公司 触摸数据处理方法、装置、设备及存储介质
CN114168235B (zh) * 2020-08-20 2024-06-11 华为技术有限公司 一种功能切换入口的确定方法与电子设备
CN114079809A (zh) * 2020-08-20 2022-02-22 华为技术有限公司 终端及其输入方法与装置
CN114185503B (zh) * 2020-08-24 2023-09-08 荣耀终端有限公司 多屏交互的系统、方法、装置和介质
WO2022042162A1 (fr) * 2020-08-25 2022-03-03 华为技术有限公司 Procédé et appareil pour mettre en œuvre une interface utilisateur
CN112134788B (zh) * 2020-09-18 2023-06-06 Oppo广东移动通信有限公司 事件处理方法、装置、存储介质、移动终端及电脑
CN114205546B (zh) * 2020-09-18 2023-05-05 华为终端有限公司 一种设备控制系统
CN114201130A (zh) * 2020-09-18 2022-03-18 青岛海信移动通信技术股份有限公司 一种投屏的方法、装置及存储介质
CN112130475B (zh) * 2020-09-22 2022-08-19 北京字节跳动网络技术有限公司 设备控制方法、装置、终端和存储介质
CN112328195B (zh) * 2020-10-10 2023-10-24 当趣网络科技(杭州)有限公司 投屏控制方法、系统、电子设备及介质
CN114500725B (zh) * 2020-11-13 2023-06-27 华为技术有限公司 目标内容传输方法、主设备、从设备和存储介质
CN112394895B (zh) * 2020-11-16 2023-10-13 Oppo广东移动通信有限公司 画面跨设备显示方法与装置、电子设备
CN112269527B (zh) * 2020-11-16 2022-07-08 Oppo广东移动通信有限公司 应用界面的生成方法及相关装置
CN112468863A (zh) * 2020-11-24 2021-03-09 北京字节跳动网络技术有限公司 投屏控制方法、设备及电子设备
CN114584828B (zh) * 2020-11-30 2024-05-17 上海新微技术研发中心有限公司 安卓投屏方法、计算机可读存储介质和设备
CN112527152B (zh) * 2020-12-18 2023-01-06 Oppo(重庆)智能科技有限公司 触控区域控制方法、装置、触控系统以及电子设备
CN112684993A (zh) * 2020-12-23 2021-04-20 北京小米移动软件有限公司 一种基于跨屏协作的显示方法、装置及介质
CN114741039B (zh) * 2020-12-24 2023-09-08 华为技术有限公司 设备控制方法和终端设备
CN112817790B (zh) * 2021-03-02 2024-06-28 腾讯音乐娱乐科技(深圳)有限公司 模拟用户行为的方法
CN115145515A (zh) * 2021-03-31 2022-10-04 华为技术有限公司 一种投屏方法及相关装置
CN113093977A (zh) * 2021-04-12 2021-07-09 Tcl通讯(宁波)有限公司 移动终端手表的设置方法、装置、智能终端及存储介质
CN113271425A (zh) * 2021-04-19 2021-08-17 瑞芯微电子股份有限公司 一种基于虚拟设备的互动系统和方法
CN115328565A (zh) * 2021-04-25 2022-11-11 华为技术有限公司 功能跳转方法及电子设备
CN113360116A (zh) * 2021-06-25 2021-09-07 阿波罗智联(北京)科技有限公司 控制终端的方法、装置、设备以及存储介质
CN113531423A (zh) * 2021-07-13 2021-10-22 读书郎教育科技有限公司 一种可交互的智能投射台灯及方法
CN113590248B (zh) * 2021-07-22 2024-07-23 上汽通用五菱汽车股份有限公司 车载终端的投屏方法、装置和可读存储介质
CN115756268A (zh) * 2021-09-03 2023-03-07 华为技术有限公司 跨设备交互的方法、装置、投屏系统及终端
CN115016697A (zh) * 2021-09-08 2022-09-06 荣耀终端有限公司 投屏方法、计算机设备、可读存储介质和程序产品
CN114040242B (zh) * 2021-09-30 2023-07-07 荣耀终端有限公司 投屏方法、电子设备和存储介质
CN114138167A (zh) * 2021-12-08 2022-03-04 武汉卡比特信息有限公司 手机互联分屏投射的触摸板系统及方法
CN115016714B (zh) * 2021-12-15 2023-04-07 荣耀终端有限公司 电子设备控制方法、系统、电子设备及存储介质
CN114461124B (zh) * 2022-01-30 2023-03-21 深圳创维-Rgb电子有限公司 投屏控制方法、装置、投屏器及计算机可读存储介质
CN114442985A (zh) * 2022-01-30 2022-05-06 深圳创维-Rgb电子有限公司 投屏发射器及接收器、电子设备、投屏系统及方法
CN116708642A (zh) * 2022-02-28 2023-09-05 广州视源电子科技股份有限公司 一种子母型智能设备
CN117234404A (zh) * 2022-06-06 2023-12-15 华为技术有限公司 一种设备控制方法及电子设备
CN115174988B (zh) * 2022-06-24 2024-04-30 长沙联远电子科技有限公司 一种基于dlna的音视频投屏控制方法
CN115499693B (zh) * 2022-08-09 2024-10-15 深圳市酷开网络科技股份有限公司 多屏异显的控制方法及装置、系统、存储介质、电子设备
WO2024113187A1 (fr) * 2022-11-29 2024-06-06 京东方科技集团股份有限公司 Procédé de commande coopérative à dispositifs multiples, dispositif d'affichage et système

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130328770A1 (en) * 2010-02-23 2013-12-12 Muv Interactive Ltd. System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
CN106095084A (zh) * 2016-06-06 2016-11-09 乐视控股(北京)有限公司 投屏方法及装置
CN106502604A (zh) * 2016-09-28 2017-03-15 北京小米移动软件有限公司 投屏切换方法及装置
CN106897038A (zh) * 2015-12-17 2017-06-27 北京传送科技有限公司 一种投屏系统
CN107071551A (zh) * 2017-04-26 2017-08-18 四川长虹电器股份有限公司 应用于智能电视系统中的多屏互动屏幕响应方法
CN110377250A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏场景下的触控方法及电子设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6387641B2 (ja) * 2014-01-15 2018-09-12 セイコーエプソン株式会社 プロジェクター、表示装置、表示システムおよび表示装置の制御方法
CN104978156B (zh) * 2014-04-02 2021-10-22 联想(北京)有限公司 多屏显示方法及多屏显示处理装置
JP6721951B2 (ja) * 2015-07-03 2020-07-15 シャープ株式会社 画像表示装置、画像表示制御方法、および、画像表示システム
CN108736981A (zh) * 2017-04-19 2018-11-02 阿里巴巴集团控股有限公司 一种无线投屏方法、装置及系统
CN109508162B (zh) * 2018-10-12 2021-08-13 福建星网视易信息系统有限公司 一种投屏显示方法、系统及存储介质

Also Published As

Publication number Publication date
CN110377250B (zh) 2021-07-16
CN110377250A (zh) 2019-10-25

Similar Documents

Publication Publication Date Title
WO2020244500A1 (fr) Procédé de commande tactile dans un scénario de partage d'écran, et appareil électronique
WO2020244492A1 (fr) Procédé d'affichage par projection d'écran et dispositif électronique
WO2020244495A1 (fr) Procédé d'affichage par projection d'écran et dispositif électronique
US11922005B2 (en) Screen capture method and related device
US11722449B2 (en) Notification message preview method and electronic device
WO2020244497A1 (fr) Procédé d'affichage pour écran flexible et dispositif électronique
US11385857B2 (en) Method for displaying UI component and electronic device
WO2021121052A1 (fr) Procédé et système de coopération à écrans multiples et dispositif électronique
WO2021032097A1 (fr) Procédé d'interaction par commande gestuelle dans l'air et dispositif électronique associé
WO2021115194A1 (fr) Procédé d'affichage d'icône d'application et dispositif électronique
JP2022549157A (ja) データ伝送方法及び関連装置
WO2020192456A1 (fr) Procédé d'interaction vocale et dispositif électronique
WO2020155014A1 (fr) Système et un procédé de partage de dispositif domestique intelligent, et dispositif électronique
WO2024016559A1 (fr) Procédé de coopération de multiples dispositifs, dispositif électronique et produit associé
CN116360725B (zh) 显示交互系统、显示方法及设备
WO2023030099A1 (fr) Procédé et appareil d'interaction entre dispositifs, système de projection d'écran et terminal
WO2022078295A1 (fr) Procédé de recommandation de dispositif et dispositif électronique
WO2020155875A1 (fr) Procédé d'affichage destiné à un dispositif électronique, interface graphique personnalisée et dispositif électronique
CN115016697A (zh) 投屏方法、计算机设备、可读存储介质和程序产品
WO2021042881A1 (fr) Procédé de notification par message et dispositif électronique
WO2023202445A1 (fr) Système de démonstration, procédé, interface graphique et appareil associé
WO2023169237A1 (fr) Procédé de capture d'écran, dispositif électronique, et système
CN116095412B (zh) 视频处理方法及电子设备
WO2024140757A1 (fr) Procédé de division d'écran multi-dispositifs et appareil associé
WO2024022288A1 (fr) Procédé d'installation de dispositif intelligent et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20819274

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20819274

Country of ref document: EP

Kind code of ref document: A1