WO2020244500A1 - Method for touch control in screen casting scenario, and electronic apparatus - Google Patents


Info

Publication number
WO2020244500A1
WO2020244500A1 (international application PCT/CN2020/093908)
Authority
WO
WIPO (PCT)
Prior art keywords
touch event
touch
source device
interface
display
Prior art date
Application number
PCT/CN2020/093908
Other languages
French (fr)
Chinese (zh)
Inventor
魏曦
曹原
范振华
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2020244500A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • This application relates to the field of terminal technologies, and in particular, to a touch control method in a screen projection scenario and an electronic device.
  • An electronic device can switch and display multimedia data among multiple devices by means of screen projection.
  • For example, when a user uses a video application on a mobile phone to watch a video, the mobile phone can be set as the source device, and the display interface of the source device can be sent to other destination devices that support the screen projection function for display.
  • If the user needs to operate the current display interface of the video application, the user still needs to perform the corresponding operation on the mobile phone (that is, the source device) to update the display data of the mobile phone, and the mobile phone then projects the updated display data to the destination device for display.
  • Consequently, when the source device is not near the user, or it is inconvenient for the user to operate the source device, the user cannot control the display interface that is being projected, resulting in a poor user experience during screen projection.
  • In view of this, the present application provides a touch control method in a screen projection scenario and an electronic device.
  • With this method, the destination device can receive and respond to a control operation performed by the user on the projection interface, thereby improving the user's touch experience in the screen projection scenario.
  • In a first aspect, this application provides a touch control method in a screen projection scenario, including: a source device displays a first display interface; in response to a screen projection instruction input by a user, the source device projects N controls (N is an integer greater than 0) in the first display interface to a projection interface displayed by a first destination device; subsequently, if the source device receives a first touch event sent by the first destination device, the source device can execute an operation instruction corresponding to the first touch event.
  • In this way, the destination device can generate a touch event in response to a touch operation input by the user and send the touch event to the source device to implement the corresponding function, realizing a reverse-control function over the source device and thereby improving the user's touch experience in the screen projection scenario.
  • The aforementioned first touch event may include the coordinates of the touch point and the type of the touch event (for example, an event type such as a tap, a double tap, or a slide).
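The first touch event described above can be modeled as a small record carrying the touch-point coordinates and the event type. A minimal sketch in Python; the field names and event-type values are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass
from enum import Enum

class EventType(Enum):
    """Illustrative event types mentioned in the description."""
    TAP = "tap"
    DOUBLE_TAP = "double_tap"
    SLIDE = "slide"

@dataclass
class TouchEvent:
    """A touch event: touch-point coordinates plus an event type."""
    x: float
    y: float
    event_type: EventType

# A tap generated at (120, 340) on the projection interface
event = TouchEvent(x=120.0, y=340.0, event_type=EventType.TAP)
```

The destination device would serialize such a record and send it to the source device over the projection connection.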
  • In a possible implementation, after the source device receives the first touch event sent by the first destination device, the method further includes: the source device determines a target control corresponding to the first touch event, where the target control is one of the aforementioned N controls; in this case, the operation instruction executed by the source device is the operation instruction corresponding to the target control being triggered on the source device.
  • For example, if the target control corresponding to the first touch event is a play button, the operation instruction corresponding to the first touch event is the operation instruction executed when the play button is triggered.
  • the aforementioned first touch event may be: a touch event generated by the first destination device when the user inputs a first touch operation on the aforementioned projection interface.
  • the first touch operation is a touch operation actually input by the user when the screen is projected.
  • Alternatively, the above-mentioned first touch event may be a touch event that the first destination device sends to the source device after generating a touch event (for example, a fifth touch event) in response to a touch operation input by the user on the projection interface and mapping the fifth touch event to a touch event in the first display interface.
  • the source device may store a configuration file corresponding to the above-mentioned first display interface.
  • The configuration file records the display positions of the above-mentioned N controls in the first display interface and their display positions in the projection interface. In this case, that the source device determines the target control corresponding to the first touch event specifically includes: the source device determines the target control corresponding to the first touch event according to the display positions, recorded in the configuration file, of the N controls in the projection interface.
  • For example, if the touch point of the first touch event falls within the display position of a first control in the projection interface, the source device may determine the first control as the target control; if the touch point falls within the display positions of several overlapping controls, the source device can determine the first control at the top layer as the target control.
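The hit-testing step above can be sketched as follows: the configuration file records each projected control's bounding box in the projection interface, and the touch point is matched against those boxes, preferring the top-most control when several overlap. The record layout (`box`, `z`, `id`) is an illustrative assumption:

```python
def find_target_control(controls, x, y):
    """Return the top-most control whose projection-interface
    bounding box (left, top, right, bottom) contains (x, y).

    `controls` is a list of dicts; a higher `z` value means the
    control is drawn on top of controls with lower `z` values.
    """
    hits = [
        c for c in controls
        if c["box"][0] <= x <= c["box"][2] and c["box"][1] <= y <= c["box"][3]
    ]
    if not hits:
        return None
    # When several overlapping controls contain the point,
    # pick the one at the top layer, as the description suggests.
    return max(hits, key=lambda c: c["z"])

config = [
    {"id": "video_view",  "box": (0, 0, 1920, 1080),    "z": 0},
    {"id": "play_button", "box": (900, 900, 1020, 1020), "z": 1},
]

target = find_target_control(config, 950, 950)  # inside both boxes
# → the top-level "play_button" control is chosen
```

A real implementation would hold this data per projection interface and refresh it whenever the projected controls change.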
  • In a possible implementation, after the source device determines the target control corresponding to the first touch event, the method further includes: the source device maps the first touch event to a second touch event according to the above configuration file, where the second touch event is the touch event that the source device would generate if the user input a second touch operation on the target control in the first display interface. It should be noted that the user does not actually input the second touch operation in the first display interface; rather, the source device maps out, from the first touch event, the second touch event corresponding to the second touch operation.
  • In this case, that the source device executes the operation instruction corresponding to the first touch event means: the source device reports the mapped second touch event to a first application (the first display interface being displayed by the source device is an interface of the first application), so that the first application executes the operation instruction corresponding to the second touch event.
  • For example, if the coordinates of the touch point in the first touch event are A, the coordinates of the touch point in the mapped second touch event are B. That the source device executes the operation instruction corresponding to the first touch event is actually the first application responding to the second touch event whose coordinates are B.
  • In this way, the user's touch operation on the projection interface can also reversely control the related application in the source device to implement the corresponding function.
  • In a possible implementation, that the source device maps the first touch event to the second touch event according to the above configuration file includes: the source device converts the first coordinates in the first touch event into second coordinates according to the correspondence, recorded in the configuration file, between the target control's first display position in the first display interface and its second display position in the projection interface, thereby obtaining the second touch event.
  • For example, the source device can reversely calculate the second coordinates corresponding to the above-mentioned first coordinates in the first display interface according to the translation, scaling, or rotation changes of the target control before and after screen projection recorded in the configuration file.
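The coordinate conversion described above can be sketched as the inverse of the transform applied when the control was projected. For illustration, assume the configuration file records only a translation and a uniform scale for the target control (a rotation would be undone analogously with an inverse rotation); the entry layout and names are assumptions:

```python
def map_to_source(first_coord, control_entry):
    """Map a touch coordinate on the projection interface back to
    the corresponding coordinate in the source's first display
    interface by undoing the control's projection transform.

    `control_entry` holds the control's position in both interfaces:
      src_origin - top-left of the control in the first display interface
      dst_origin - top-left of the control in the projection interface
      scale      - uniform scale applied during projection
    """
    x, y = first_coord
    sx, sy = control_entry["src_origin"]
    dx, dy = control_entry["dst_origin"]
    s = control_entry["scale"]
    # Undo the translation into the projection interface, undo the
    # scale, then re-apply the control's position in the source interface.
    return (sx + (x - dx) / s, sy + (y - dy) / s)

entry = {"src_origin": (100, 200), "dst_origin": (400, 50), "scale": 2.0}
second_coord = map_to_source((500, 150), entry)
# (500 - 400) / 2 + 100 = 150, (150 - 50) / 2 + 200 = 250
```

The same function works on either device, which is why the description allows the mapping to be done by the source device or by the destination device.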
  • In a possible implementation, after the source device determines the target control corresponding to the first touch event, it can also report the identifier of the target control and the event type of the first touch event to the first application, so that the first application executes a first function, that is, the function the first application performs when the target control is triggered by the operation indicated by the event type. This also allows the user to reversely control, from the projection interface, the related application in the source device.
  • In a possible implementation, after the source device determines the target control corresponding to the first touch event, the method further includes: the source device generates a third touch event according to the target control's first display position in the first display interface recorded in the configuration file, where the event type of the third touch event is the same as the event type of the first touch event, and the third coordinates in the third touch event are located within the first display position. In this case, that the source device executes the operation instruction corresponding to the first touch event includes: the source device reports the third touch event to the first application. That is to say, the source device converts the user's first touch event on the projection interface into a third touch event on the first display interface, so the process in which the source device responds to the third touch event is actually the process in which it responds to the first touch event.
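The synthesis of the third touch event can be sketched as: keep the event type of the incoming first touch event, but place the touch point inside the target control's recorded display position in the first display interface, for example at its center. The dict layout and the center-placement choice are illustrative assumptions:

```python
def synthesize_third_event(first_event, src_box):
    """Build a third touch event: same event type as `first_event`,
    with coordinates placed at the centre of the target control's
    display position (left, top, right, bottom) in the source's
    first display interface.
    """
    left, top, right, bottom = src_box
    return {
        "x": (left + right) / 2,
        "y": (top + bottom) / 2,
        "type": first_event["type"],  # the event type is preserved
    }

first_event = {"x": 950, "y": 950, "type": "tap"}
third_event = synthesize_third_event(first_event, (60, 60, 160, 120))
# third_event is reported to the first application as if the user
# had tapped the control in the first display interface
```

Any point inside the control's box would satisfy the description; the center is simply a robust choice.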
  • In the foregoing embodiments, the source device determines the target control of the first touch event and maps the first touch event to the second touch event. It is understandable that, after generating the first touch event, the destination device can likewise determine the target control of the first touch event according to the above method and map the first touch event to the second touch event. The destination device can then send the mapped second touch event to the source device, and the source device reports the second touch event to the first application, so that the first application executes the operation instruction corresponding to the second touch event, that is, the operation instruction corresponding to the first touch event.
  • In a possible implementation, after the source device displays the first display interface, the method further includes: in response to a second screen projection instruction input by the user, the source device projects M controls (M is an integer greater than 0) in the first display interface to a second destination device for display; the source device receives a fourth touch event sent by the second destination device; and the source device executes an operation instruction corresponding to the fourth touch event.
  • In a second aspect, this application provides a touch control method in a screen projection scenario, including: a destination device receives a first message sent by a first source device, where the first message includes a drawing instruction of a first target control, and the first target control is one or more controls in a first display interface displayed by the first source device; the destination device invokes the drawing instruction of the first target control to draw a projection interface that includes the first target control; in response to a first touch operation input by the user on the first target control in the projection interface, the destination device generates a first touch event; and the destination device instructs the first source device to execute an operation instruction corresponding to the first touch event.
  • In a possible implementation, that the destination device instructs the first source device to execute the operation instruction corresponding to the first touch event includes: the destination device sends the first touch event to the first source device, so that the first source device executes the operation instruction corresponding to the first touch event. After receiving the first touch event, the first source device can execute the corresponding operation instruction according to the method of the first aspect.
  • In a possible implementation, after the destination device generates the first touch event, the method further includes: the destination device maps the first touch event to a second touch event, where the second touch event is the touch event that the first source device would generate if the user input a second touch operation on the first target control in the first display interface.
  • the method for the destination device to map the first touch event to the second touch event is the same as the method for the source device to map the first touch event to the second touch event in the first aspect.
  • In this case, that the destination device instructs the first source device to execute the operation instruction corresponding to the first touch event includes: the destination device sends the mapped second touch event to the first source device, so that the first source device executes the operation instruction corresponding to the second touch event.
  • In a possible implementation, the above method further includes: the destination device receives a second message sent by a second source device, where the second message includes a drawing instruction of a second target control, and the second target control is one or more controls in a second display interface displayed by the second source device; the destination device calls the drawing instruction of the second target control to draw the second target control in the projection interface; in response to a third touch operation input by the user on the second target control in the projection interface, the destination device generates a third touch event; and the destination device instructs the second source device to execute an operation instruction corresponding to the third touch event.
  • In this way, the user can input touch operations, on the destination device, on controls projected from different source devices, so that the corresponding source device implements the function corresponding to the touch operation, thereby improving the user's touch experience in the screen projection scenario.
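The multi-source behaviour above amounts to a routing step on the destination device: each drawn control remembers which source device projected it, so a touch event is forwarded to that source. A minimal sketch; the `send` callback, the registry layout, and the device identifiers are all assumptions for illustration:

```python
class ProjectionRouter:
    """Routes touch events on the projection interface to the
    source device that projected the touched control."""

    def __init__(self, send):
        # send(source_id, event): callback that transmits an event
        # to a source device over the projection connection.
        self.send = send
        self.controls = []  # each: {"source": ..., "box": (l, t, r, b)}

    def register(self, source_id, box):
        """Record that `source_id` projected a control at `box`."""
        self.controls.append({"source": source_id, "box": box})

    def dispatch(self, x, y, event_type):
        """Forward the touch event to the owning source device;
        return the source id, or None if no control was hit."""
        for c in reversed(self.controls):  # most recently drawn first
            left, top, right, bottom = c["box"]
            if left <= x <= right and top <= y <= bottom:
                self.send(c["source"], {"x": x, "y": y, "type": event_type})
                return c["source"]
        return None

sent = []
router = ProjectionRouter(send=lambda src, ev: sent.append((src, ev)))
router.register("phone_A", (0, 0, 600, 400))
router.register("phone_B", (600, 0, 1200, 400))
router.dispatch(700, 100, "tap")  # lands in phone_B's control
```

The same registry can also carry the per-control transform data used for coordinate mapping, so routing and mapping share one lookup.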
  • In a third aspect, the present application provides an electronic device, including a touch screen, one or more processors, one or more memories, and one or more computer programs, where the processor is coupled to the touch screen and the memory, the one or more computer programs are stored in the memory, and when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device executes the touch control method in the projection scenario described in any one of the above aspects.
  • In a fourth aspect, the present application provides a computer storage medium including computer instructions, which, when run on an electronic device, cause the electronic device to execute the touch control method in the projection scenario described in any one of the first aspect.
  • In a fifth aspect, this application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the touch control method in the projection scenario described in any one of the first aspect.
  • In a sixth aspect, the present application provides a touch control system, which may include at least one source device and at least one destination device; the source device may be used to perform the touch control method in the projection scenario described in any one of the first aspect, and the destination device may be used to perform the touch control method in the projection scenario described in any one of the second aspect.
  • It can be understood that the electronic device described in the third aspect, the computer storage medium described in the fourth aspect, the computer program product described in the fifth aspect, and the touch control system described in the sixth aspect are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which are not repeated here.
  • FIG. 1 is a first architectural diagram of a communication system according to an embodiment of this application;
  • FIG. 2 is a second architectural diagram of a communication system according to an embodiment of this application;
  • FIG. 3 is a first structural diagram of an electronic device according to an embodiment of this application;
  • FIG. 4 is a structural diagram of an operating system in an electronic device according to an embodiment of this application;
  • FIG. 5 is a first scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 6 is a second scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 7 is a third scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 8 is a fourth scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 9 is a fifth scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 10 is a sixth scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 11 is a seventh scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 12 is an eighth scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 13 is a ninth scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 14 is a tenth scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 15 is an eleventh scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 16 is a twelfth scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 17 is a thirteenth scene schematic diagram of a touch method in a projection scenario according to an embodiment of this application;
  • FIG. 18 is a second structural diagram of an electronic device according to an embodiment of this application.
  • a touch method in a projection scene can be applied to a communication system 100, and the communication system 100 may include N (N>1) electronic devices.
  • the communication system 100 may include an electronic device 101 and an electronic device 102.
  • the electronic device 101 may be connected to the electronic device 102 through one or more communication networks 104.
  • The communication network 104 may be a wired network or a wireless network.
  • The aforementioned communication network 104 may be a local area network (LAN) or a wide area network (WAN), such as the Internet.
  • the communication network 104 can be implemented using any known network communication protocol.
  • The above-mentioned network communication protocol can be any of various wired or wireless communication protocols, such as Ethernet, universal serial bus (USB), FireWire, GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, Bluetooth, Wi-Fi, NFC, VoIP, a communication protocol supporting a network slicing architecture, or any other suitable communication protocol.
  • GSM: Global System for Mobile Communications
  • GPRS: General Packet Radio Service
  • CDMA: Code Division Multiple Access
  • WCDMA: Wideband Code Division Multiple Access
  • TD-SCDMA: Time-Division Synchronous Code Division Multiple Access
  • LTE: Long Term Evolution
  • Wi-Fi: wireless fidelity
  • NFC: near field communication
  • VoIP: voice over Internet protocol
  • the electronic device 101 may establish a Wi-Fi connection with the electronic device 102 through a Wi-Fi protocol.
  • the electronic device 101 may be used as a source device
  • the electronic device 102 may be used as a destination device
  • The electronic device 101 (ie, the source device) may project the display content in its display interface to the electronic device 102 (ie, the destination device) for display.
  • the electronic device 102 can also be used as a source device, and the electronic device 102 projects the display content in its display interface to the electronic device 101 (that is, the destination device) for display.
  • the communication system 100 described above may further include one or more other electronic devices such as an electronic device 103.
  • the electronic device 103 may be a wearable device.
  • the electronic device 103 may also be used as a source device or a destination device for projection display.
  • both the electronic device 102 and the electronic device 103 can be used as the destination device of the electronic device 101.
  • the electronic device 101 can project the display content in its display interface to the electronic device 102 and the electronic device 103 for display at the same time.
  • one source device can simultaneously project to multiple destination devices.
  • both the electronic device 102 and the electronic device 103 can be used as the source device of the electronic device 101.
  • the electronic device 102 and the electronic device 103 can simultaneously project the display content in their display interfaces to the electronic device 101 for display.
  • a destination device can simultaneously receive and display the display content sent by multiple source devices.
  • any electronic device in the aforementioned communication system 100 can be used as a source device or a destination device, which is not limited in the embodiment of the present application.
  • the specific structures of the electronic device 101, the electronic device 102, and the electronic device 103 may be the same or different.
  • Each of the above electronic devices may specifically be a mobile phone, a tablet computer, a smart TV, a wearable electronic device, an in-vehicle head unit, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a virtual reality device, or the like.
  • the embodiments of the present application do not make any restrictions on this.
  • FIG. 3 shows a schematic structural diagram of the electronic device 101.
  • The electronic device 101 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 193, a display screen 194, and the like.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 101.
  • the electronic device 101 may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • AP: application processor
  • GPU: graphics processing unit
  • ISP: image signal processor
  • DSP: digital signal processor
  • NPU: neural-network processing unit
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • I2C: inter-integrated circuit
  • I2S: inter-integrated circuit sound
  • PCM: pulse code modulation
  • UART: universal asynchronous receiver/transmitter
  • MIPI: mobile industry processor interface
  • GPIO: general-purpose input/output
  • SIM: subscriber identity module
  • USB: universal serial bus
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 101. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 101 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 101 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 101.
  • the mobile communication module 150 may include one or more filters, switches, power amplifiers, low noise amplifiers (LNA), etc.
  • The mobile communication module 150 can receive electromagnetic waves via the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and then transmit the processed electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 101, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) solutions.
  • WLAN: wireless local area network
  • BT: Bluetooth
  • GNSS: global navigation satellite system
  • FM: frequency modulation
  • NFC: near field communication
  • IR: infrared
  • the wireless communication module 160 may be one or more devices integrating one or more communication processing modules.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
• the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 2.
  • the antenna 1 of the electronic device 101 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 101 can communicate with the network and other devices through wireless communication technology.
• the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
• the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 101 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
• the display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 101 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 101 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
• the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened and light is transmitted to the photosensitive element of the camera through the lens; the photosensitive element converts the light signal into an electrical signal and transfers it to the ISP, which converts it into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the electronic device 101 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 101 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 101 may support one or more video codecs. In this way, the electronic device 101 can play or record videos in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 101.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store one or more computer programs, and the one or more computer programs include instructions.
  • the processor 110 can run the above-mentioned instructions stored in the internal memory 121 to enable the electronic device 101 to execute the methods provided in some embodiments of the present application, as well as various functional applications and data processing.
  • the internal memory 121 may include a storage program area and a storage data area. Among them, the storage program area can store the operating system; the storage program area can also store one or more application programs (such as a gallery, contacts, etc.) and so on.
  • the data storage area can store data (such as photos, contacts, etc.) created during the use of the electronic device 101.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, universal flash storage (UFS), etc.
• the processor 110 executes the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor to cause the electronic device 101 to execute the methods provided in the embodiments of the present application, as well as various functional applications and data processing.
  • the electronic device 101 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
• the speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 101 can listen to music through the speaker 170A, or listen to a hands-free call.
• the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
• when the electronic device 101 answers a call or plays a voice message, the user can bring the receiver 170B close to the ear to receive the voice.
• the microphone 170C, also called a "mike" or "mic", is used to convert sound signals into electrical signals.
• the user can make a sound with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 101 may be provided with one or more microphones 170C. In other embodiments, the electronic device 101 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In some other embodiments, the electronic device 101 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
• the earphone interface 170D may be a USB interface 130, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
  • the above electronic equipment may also include one or more components such as buttons, motors, indicators, and SIM card interfaces, which are not limited in the embodiment of the present application.
  • the above-mentioned software system of the electronic device 101 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes a layered Android system as an example to illustrate the software structure of the electronic device 101.
  • FIG. 4 is a block diagram of the software structure of the electronic device 101 according to an embodiment of the present application.
• the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of applications.
  • the above-mentioned applications may include APPs (applications) such as call, contact, camera, gallery, calendar, map, navigation, Bluetooth, music, video, short message, etc.
  • the application framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a view system, a notification manager, an activity manager, a window manager, a content provider, a resource manager, an input manager, and so on.
  • the view system can be used to construct the display interface of the application.
  • Each display interface can consist of one or more controls.
  • controls can include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • Multiple controls in the display interface can be organized hierarchically according to a tree structure to form a complete ViewTree (view tree).
  • the view system can draw the display interface according to the ViewTree of the display interface, and each control in the display interface corresponds to a set of drawing instructions, such as DrawLine, DrawPoint, DrawBitmap, etc.
  • FIG. 5 shows the chat interface 401 of the WeChat APP.
  • the bottommost control in the chat interface 401 is the root node.
  • the root node is provided with a basemap 402 control.
  • the following controls are also included: a title bar 403, a chat background 404, and an input bar 405.
  • the title bar 403 further includes a return button 406 and a title 407
  • the chat background 404 further includes an avatar 408 and a bubble 409
  • the input bar 405 further includes a voice input button icon 410, an input box 411, and a send button 412.
  • the above controls are layered in order to form a view tree A as shown in Figure 5(b).
  • the base map 402 is a child node of the root node
  • the title bar 403, the chat background 404, and the input field 405 are all child nodes of the base map 402.
  • Both the return button 406 and the title 407 are child nodes of the title bar 403.
  • Both the avatar 408 and the bubble 409 are child nodes of the chat background 404.
  • the voice input button icon 410, the input box 411, and the send button 412 are all child nodes of the input field 405.
  • the view system can call the drawing instructions of the corresponding controls layer by layer to draw each control according to the layer relationship between the controls in the view tree A, and finally form the chat interface 401.
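The layer-by-layer drawing described above can be sketched as a pre-order traversal of the view tree, in which a parent control is drawn before its children so that the children appear on top. The following Python sketch uses hypothetical names (the patent does not provide code); appending a control's name stands in for issuing its drawing instructions such as DrawLine or DrawBitmap.

```python
# Minimal sketch of drawing the view tree A of chat interface 401 (FIG. 5).
# Names are illustrative, not taken from any real framework.

class Control:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def draw_view_tree(node, draw_calls):
    """Pre-order traversal: draw a control, then its children on top of it."""
    draw_calls.append(node.name)  # stand-in for DrawLine/DrawPoint/DrawBitmap
    for child in node.children:
        draw_view_tree(child, draw_calls)
    return draw_calls

# View tree A: root -> basemap 402 -> title bar 403 / chat background 404 / input bar 405
view_tree_a = Control("root", [
    Control("basemap_402", [
        Control("title_bar_403", [Control("return_406"), Control("title_407")]),
        Control("chat_bg_404", [Control("avatar_408"), Control("bubble_409")]),
        Control("input_bar_405", [Control("voice_410"),
                                  Control("input_box_411"),
                                  Control("send_412")]),
    ]),
])

order = draw_view_tree(view_tree_a, [])
# order begins with "root", then "basemap_402", then each subtree in turn
```

Drawing in this order reproduces the layering of the chat interface 401: the base map is drawn first, and the leaf controls (buttons, text boxes) are drawn last, on top.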
• the view system can split, delete, or reorganize the controls in the view tree of the current display interface to determine the one or more target controls that need to be projected to the target device this time. Furthermore, the electronic device 101 can project the determined target controls to the target device to form a screen projection interface, thereby adapting to device characteristics such as the display size of the target device and improving the display effect and user experience of the target device in the screen projection scene.
• after the electronic device 101 projects the target control in its display interface to the projection interface of the target device (for example, the electronic device 102), the user can input a corresponding touch operation on the target control displayed in the projection interface of the electronic device 102, so as to control the electronic device 101 to implement the function corresponding to the touch operation.
• the process by which an APP running in the application layer obtains the touch operation input by the user on the touch screen is a process of distributing messages from the bottom layer to the top layer.
• the touch screen can obtain the relevant information of the touch operation (for example, the coordinates of the touch point, etc.), and then report the original touch event generated by the touch operation to the kernel layer in the form of an interrupt through the corresponding driver.
• the kernel layer can encapsulate the original touch event into an advanced touch event (for example, an action down event, an action move event, an action up event, etc.) that can be read by the upper layer, and send the advanced touch event to the Framework layer.
  • the Framework layer can report the above-mentioned advanced touch events to the application process of the running application A in the application layer.
  • the application process of application A calls the corresponding library function to determine the specific control acted by the advanced touch event and the event type of the advanced touch event.
  • the event type may include single click, double tap, sliding, etc. Take the user clicking the play button as an example.
• the process of application A can call the callback function corresponding to the touch event of clicking the play button to realize the application function corresponding to this touch operation.
  • a coordinate conversion module may be set in the application framework layer of the target device.
  • the original touch event reported by the touch screen of the target device to the kernel layer includes the coordinates (x, y) of the touch point, and the coordinates (x, y) are the user's touch position in the projection interface after the projection.
  • the touch point (x, y) in the advanced touch event reported by the kernel layer to the Framework layer is also the user's touch position in the projection interface.
• the coordinate conversion module of the target device can map the coordinates (x, y) to the corresponding coordinates (x', y') in the display interface of the source device.
  • the destination device can send an advanced touch event carrying coordinates (x', y') to the source device, and the Framework layer of the source device reports the advanced touch event to the screen-projecting application.
• when the application receives an advanced touch event with a touch point of (x', y'), it is equivalent to receiving a touch event generated by the user at the touch coordinates (x', y') in the source device, so the application can respond to the touch event carrying the coordinates (x', y') to realize the corresponding application function.
  • the target device can generate a first touch event that carries coordinates (x, y). Furthermore, the destination device may map the first touch event to a second touch event whose touch point is (x', y') on the display interface of the source device. In this way, after the source device receives the second touch event sent by the target device, it can execute the corresponding application function in response to the second touch event, so as to realize the reverse control function of the target device on the display interface of the source device after the screen is projected.
  • the aforementioned coordinate conversion module can also be set in the Framework layer of the source device.
• the destination device can send the first touch event with a touch point of (x, y) to the source device, and the coordinate conversion module of the source device maps the first touch event to the second touch event whose touch point is (x', y').
  • the embodiment of the present application does not impose any limitation on this.
• the above embodiments take as an example a single touch event detected and generated by the touch screen, and the conversion of the coordinates of the touch point in that touch event. It is understandable that when the user inputs a click, long press, or sliding touch operation on the touch screen, the touch screen can detect a series of touch events. For each touch event, the destination device (or source device) can convert the coordinates of the touch point in the touch event according to the above-mentioned method, and the embodiment of the present application does not impose any limitation on this.
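The coordinate conversion step described above can be sketched as a simple linear mapping. The sketch below is an illustration under the assumption that a projected control is a uniformly scaled and translated copy of the source control; the function name and rectangle representation are hypothetical, not taken from the patent.

```python
# Hypothetical coordinate conversion module: map a touch point (x, y) in the
# projection interface of the destination device back to (x', y') in the
# display interface of the source device.

def map_to_source(x, y, src_rect, dest_rect):
    """src_rect / dest_rect: (left, top, width, height) of the same control
    in the source display interface and in the projection interface."""
    sl, st, sw, sh = src_rect
    dl, dt, dw, dh = dest_rect
    # Express the point relative to the projected control, then rescale it
    # into the source control's coordinate system.
    x_prime = sl + (x - dl) * sw / dw
    y_prime = st + (y - dt) * sh / dh
    return x_prime, y_prime

# Example: a control at (0, 0, 100, 40) on the source device is shown at
# (50, 50, 200, 80) on the destination device; a tap at the centre of the
# projected control maps back to the centre of the source control.
print(map_to_source(150, 90, (0, 0, 100, 40), (50, 50, 200, 80)))  # (50.0, 20.0)
```

With such a mapping, every position in the projected control's area uniquely corresponds to a position in the source control's area, which is what allows the second touch event to be replayed on the source device.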
  • the aforementioned activity manager can be used to manage the life cycle of each application.
  • Applications usually run in the operating system in the form of activity.
  • the activity manager can schedule the activity process of the application to manage the life cycle of each application.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include video, image, audio, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, etc.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
• the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, etc., which are not limited in the embodiment of the present application.
  • the mobile phone may project one or more controls in its display interface to the smart watch for display.
  • the playback interface 600 includes the following controls: a base map 601, a status bar 602, a title bar 603, an album cover 604, lyrics 605, and a control bar 606.
  • the status bar 602 includes controls such as time, signal strength, and battery capacity.
  • the title bar 603 includes controls such as the song name 6031 and the singer 6032.
  • the control bar 606 includes controls such as a progress bar 6061, a pause button 6062, a previous button 6063, and a next button 6064.
  • the mobile phone can obtain the corresponding view tree when the view system draws the aforementioned playback interface 600, and the drawing instructions and drawing resources of each control in the view tree.
  • FIG. 7 it is the view tree 701 of the above-mentioned playing interface 600.
  • the view tree 701 records the layer relationship between the various controls in the aforementioned playback interface 600.
  • the root node of the playback interface 600 includes a sub-node of the base map 601.
  • the status bar 602, the title bar 603, the album cover 604, the lyrics 605, and the control bar 606 are all sub-nodes of the base map 601.
• the song title 6031 and the singer 6032 are child nodes of the title bar 603.
  • the progress bar 6061, the pause button 6062, the previous button 6063, and the next button 6064 are child nodes of the control bar 606.
• the mobile phone can further determine one or more controls (i.e., target controls) in the playback interface 600 that need to be projected to the smart watch for display.
  • a configuration file corresponding to the aforementioned playback interface 600 may be preset in the mobile phone.
  • the mobile phone may obtain the configuration file corresponding to the playing interface 600 from the server.
  • the configuration file records one or more controls (ie, target controls) that need to be projected onto the smart watch in the playback interface 600 of the mobile phone.
  • the above configuration file may be stored in a mobile phone or server in a format such as JSON (JavaScript Object Notation) format, XML (Extensible Markup Language) format, or text format, and the embodiment of the present application does not impose any limitation on this.
  • the configuration file 1 corresponding to the playing interface 600 may be:
  • the configuration file 1 contains multiple "src” fields (for example, the aforementioned "src1" field and "src2" field).
  • Each "src” field records the specific position of a control in the playback interface 600.
• the position of each control can be uniquely determined by the values of four parameters: left, top, width, and height.
• left is the x-axis coordinate of the top-left corner of the control
• top is the y-axis coordinate of the top-left corner of the control
• width is the width of the control
• height is the height of the control.
  • the one or more controls recorded in the configuration file 1 are the target controls that the mobile phone needs to project to the smart watch.
• the mobile phone can identify the target controls in the playback interface 600 that need to be projected to the smart watch based on the view tree 701.
• the target controls include: the song name 6031 and the singer 6032 in the title bar 603, the pause button 6062, the previous button 6063, and the next button 6064 in the control bar 606, and the album cover 604.
  • the configuration file 1 may also record the specific display position of the target control on the screen projection interface after the screen is projected.
  • a "dest1" field corresponding to the "src1" field can be set in the above configuration file 1, and the "dest1" field is used to indicate the display position of the control 1 in the target device.
  • the "dest1" field is as follows:
  • the mobile phone can determine the display position of each target control in the playback interface 600 on the screen projection interface of the smart watch according to the respective "dest” fields in the configuration file 1.
• the playback interface 600 of the mobile phone (i.e., the source device) is located in the first coordinate system before the screen is projected, and the control 1 recorded in the "src1" field is located in area 801 of the first coordinate system.
• the screen projection interface of the smart watch (i.e., the destination device) is located in the second coordinate system, and the control 1 recorded in the "dest1" field is located in area 802 of the second coordinate system after the projection.
• any position in the area 801 uniquely corresponds to a position in the area 802.
  • the configuration file 1 can also record the change of the display position of the target control before and after the screen is projected.
  • the following fields are also set for control 1 in configuration file 1:
  • the "translationx” field and the “translationy” field are used to indicate the translation distance of the control 1 on the x-axis and the y-axis after the projection; the "scalex” field and the “scaley” field are respectively used to indicate that the control 1 is in the The zoom ratio on the x-axis and y-axis; the “rotatedegree” field is used to indicate the rotation angle of control 1 after projection; the “order” field is used to indicate the layer position of control 1 after projection (for example, in the bottom layer Still in the top layer).
  • the mobile phone can also determine the display position of the control 1 on the screen projection interface of the smart watch after the screen is projected based on the change relationship of the display position of the control 1 before and after the screen is projected recorded in the above field. That is, the position of the control 1 in the first coordinate system and the position of the control 1 in the second coordinate system are determined.
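The contents of configuration file 1 are not reproduced in this text. Purely as an illustration, a hypothetical JSON fragment consistent with the fields described above might look like the following; every name and value here is an assumption for illustration, not taken from the patent.

```json
{
  "src1": { "left": 120, "top": 1100, "width": 100, "height": 100 },
  "dest1": {
    "left": 100, "top": 1130, "width": 150, "height": 150,
    "translationx": -20, "translationy": 30,
    "scalex": 1.5, "scaley": 1.5,
    "rotatedegree": 0, "order": 2
  }
}
```

A "src"/"dest" pair of this shape is enough to fix both the control's area 801 in the first coordinate system and its area 802 in the second coordinate system.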
• the view tree 701 of the playback interface 600 can be split, cropped, and reorganized to generate the view tree used for the screen projection.
• the mobile phone deletes the nodes in the view tree 701 that are not target controls, such as the above-mentioned base map 601, the status bar 602 and the controls in it, the lyrics 605, and the progress bar 6061 in the control bar 606.
• the mobile phone can set the song name 6031 and the singer 6032 in the title bar 603 as child nodes of the album cover 604, and also set the pause button 6062, the previous button 6063, and the next button 6064 in the control bar 606 as child nodes of the album cover 604.
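The pruning and re-parenting steps above can be sketched as a small tree transformation. The sketch below uses a plain dictionary as a stand-in for the view tree and hypothetical node names; it is an illustration of the technique, not the patent's implementation.

```python
# Hypothetical sketch: prune non-target controls from view tree 701 and
# re-parent the remaining target controls under the album cover 604,
# forming the new view tree 901.

targets = {"album_cover_604", "song_name_6031", "singer_6032",
           "pause_6062", "prev_6063", "next_6064"}

# Adjacency-list form of view tree 701 (parent -> children).
view_tree_701 = {
    "root": ["basemap_601"],
    "basemap_601": ["status_bar_602", "title_bar_603", "album_cover_604",
                    "lyrics_605", "control_bar_606"],
    "title_bar_603": ["song_name_6031", "singer_6032"],
    "control_bar_606": ["progress_6061", "pause_6062", "prev_6063", "next_6064"],
}

def reorganize(tree, targets, new_root):
    """Keep only target controls and hang them all under new_root."""
    kept = [n for children in tree.values() for n in children
            if n in targets and n != new_root]
    return {new_root: kept}

view_tree_901 = reorganize(view_tree_701, targets, "album_cover_604")
# view_tree_901 keeps only the five target controls, all under album_cover_604
```

Because the non-target nodes (base map, status bar, lyrics, progress bar) are dropped, the resulting tree describes exactly the controls that will appear in the projection interface.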
• the mobile phone (that is, the source device) can send a UI message corresponding to the playback interface 600 to the smart watch, where the UI message includes the view tree 901 and the drawing instructions and drawing resources related to each control in the view tree 901.
• after the smart watch receives the UI message corresponding to the above-mentioned playback interface 600, it can call the drawing instructions of each target control in the view tree 901 in turn according to the levels and order in the view tree 901, and draw each target control at the position specified in the configuration file 1. Finally, as shown in (b) of FIG. 9, the smart watch can draw the screen projection interface 902 obtained after the above-mentioned playback interface 600 is projected. Each control in the projection interface 902 corresponds one-to-one to each control in the view tree 901.
• when the mobile phone projects the above-mentioned playback interface 600 onto the smart watch, it can split, delete, and reorganize the controls in the playback interface 600, so that the final screen projection interface 902 in the smart watch can be adapted to the display size of the smart watch's display screen and the user's usage requirements, thereby improving the display effect and user experience during screen projection between multiple devices.
• the user can input a corresponding touch operation on each projected target control in the screen projection interface 902, and the smart watch can generate a corresponding touch event in response to the touch operation; further, the smart watch can control the mobile phone to implement the function corresponding to the touch event.
• the touch event may include the coordinates of the touch point and the event type of the touch event (for example, single click, double tap, slide, etc.).
• if the user wants the music APP to pause the song being played, he can click the pause button 6062 in the screen projection interface 902.
• if the user wants the music APP to play the previous song, he can click the previous button 6063 in the screen projection interface 902.
• if the user wants the music APP to play the next song, he can click the next button 6064 in the screen projection interface 902.
  • the screen projection interface 902 displayed by the smart watch is located in the second coordinate system.
  • the touch sensor of the smart watch can detect the touch operation input by the user on the screen projection interface 902 in real time.
• the touch sensor of the smart watch can encapsulate the detected touch information (for example, the coordinate information of the touch point A, the touch time, etc.) as a first touch event, and report the first touch event to the kernel layer of the smart watch.
  • the first touch event is generated by the smart watch in response to the first touch operation of the user clicking the pause button 6062 in the projection interface 902.
• specifically, the touch sensor can encapsulate the detected touch operation as a first original touch event and report it to the kernel layer through the driver; then the kernel layer encapsulates the original touch event into a first advanced touch event that can be read by the upper layer and reports it to the application framework layer.
• after the application framework layer receives the first advanced touch event carrying the coordinates A(x, y), it can determine which control the user's current touch operation acts on according to the display position of each control in the projection interface 902 recorded in the configuration file 1.
  • the “dest1” field in the aforementioned configuration file 1 records that the pause button 6062 is located in area 1 of the projection interface 902. Then, when the coordinates A(x, y) fall into area 1, the smart watch can determine that the target control of the user's current touch operation is the pause button 6062.
  • the coordinates A(x, y) may fall into the areas of two controls at the same time.
  • for example, the coordinates A(x, y) may be located both in the area where the pause button 6062 is located and in the area where the album cover 604 is located.
  • in this case, the smart watch can determine the uppermost control as the target control of the user's current touch operation according to the "order" field recorded in configuration file 1.
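The hit-testing logic above (match the touch point against each control's recorded display area, then break ties with the "order" field) can be sketched as follows. This is an illustrative sketch only; the field names (`left`, `top`, `width`, `height`, `order`, `id`) and the example configuration are assumptions, not the patent's actual configuration-file format.

```python
# Illustrative hit test: find the target control for a touch at (x, y).
def find_target_control(config, x, y):
    """Return the id of the topmost control whose area contains (x, y)."""
    hits = [c for c in config
            if c["left"] <= x < c["left"] + c["width"]
            and c["top"] <= y < c["top"] + c["height"]]
    if not hits:
        return None
    # Assume a larger "order" value means closer to the top of the stack.
    return max(hits, key=lambda c: c["order"])["id"]

# Hypothetical configuration file 1: the pause button overlaps the album cover.
config1 = [
    {"id": "album_cover_604",   "left": 0,  "top": 0,  "width": 200, "height": 200, "order": 0},
    {"id": "pause_button_6062", "left": 80, "top": 60, "width": 60,  "height": 60,  "order": 1},
]

print(find_target_control(config1, 100, 70))  # pause_button_6062 (topmost of the two hits)
print(find_target_control(config1, 10, 10))   # album_cover_604
```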
  • Take the user clicking the pause button 6062 in the screen projection interface 902 as an example.
  • after the application framework layer determines that the first touch event is a touch event on the pause button 6062, it can restore the touch point A' corresponding to the touch point A in the first coordinate system of the mobile phone (that is, the source device) according to the positional relationship of the pause button 6062 before and after the screen projection recorded in configuration file 1.
  • during screen projection, the pause button 6062 in the second coordinate system is formed by translating, scaling, or rotating the pause button 6062 in the first coordinate system.
  • therefore, when the smart watch restores the touch point A' corresponding to the touch point A, it can perform the corresponding reverse translation, reverse scaling, or reverse rotation on the coordinates A(x, y), so as to restore the touch point A' in the first coordinate system of the mobile phone.
  • for example, when the playback interface 600 is displayed in the first coordinate system of the mobile phone (i.e., the source device), the coordinates of point A' in the pause button 6062 are A'(100, 20).
  • when projecting the pause button 6062, as shown in (b) in Figure 11, the pause button 6062 is translated 20 units in the negative direction along the x-axis and 30 units in the positive direction along the y-axis, and is enlarged by 1.5 times.
  • then, the coordinates of point A corresponding to point A' on the pause button 6062 are A((100-20)*1.5, (20+30)*1.5), that is, A(120, 75).
  • conversely, the smart watch can divide the x-axis and y-axis coordinates of point A by 1.5, then translate the x-axis coordinate 20 units in the reverse direction and the y-axis coordinate 30 units in the reverse direction, so as to obtain the coordinates A'(100, 20) corresponding to the coordinates A(120, 75) in the first coordinate system.
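The reverse mapping in the worked example above can be illustrated with a small sketch. It assumes, as the example does, that the forward projection translates a control by (dx, dy) and then scales it by a factor s; rotation is omitted for brevity, and the helper names are illustrative.

```python
# Sketch of the coordinate mapping in the worked example above.
def project(point, dx, dy, s):
    """Source coordinates -> projection-interface coordinates."""
    x, y = point
    return ((x + dx) * s, (y + dy) * s)

def unproject(point, dx, dy, s):
    """Projection-interface coordinates -> source coordinates (the inverse)."""
    x, y = point
    return (x / s - dx, y / s - dy)

# From the text: translated 20 units in the negative x direction, 30 units in
# the positive y direction, enlarged 1.5 times -> dx = -20, dy = 30, s = 1.5.
a = project((100, 20), -20, 30, 1.5)   # A'(100, 20) -> A(120, 75)
print(a)                               # (120.0, 75.0)
print(unproject(a, -20, 30, 1.5))      # recovers (100.0, 20.0)
```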
  • for example, if the translation distance of the pause button 6062 is recorded in configuration file 1, the smart watch can reversely calculate the coordinates of the touch point A according to the translation distance.
  • if the zoom ratios of the pause button 6062 on the x-axis and the y-axis are recorded in configuration file 1, the smart watch can reversely calculate the coordinates of the touch point A according to the zoom ratios.
  • if the rotation angle of the pause button 6062 is recorded in configuration file 1, the smart watch can reversely calculate the coordinates of the touch point A according to the rotation angle.
  • alternatively, the smart watch may also preset a coordinate mapping formula between the first coordinate system and the second coordinate system. In this way, after the smart watch obtains the touch point A of this touch event, it can calculate the touch point A' corresponding to the touch point A in the first coordinate system of the mobile phone according to the coordinate mapping formula.
  • after the smart watch restores the touch point A' in the playback interface 600 corresponding to the touch point A in the screen projection interface 902, it can replace the coordinates A(x, y) of the touch point A in the first touch event with the coordinates A'(x', y') of the touch point A' to form a second touch event.
  • the second touch event refers to the touch event that the mobile phone would generate if the user input a second touch operation of clicking the pause button 6062 in the playback interface 600.
  • of course, the user does not actually click the pause button 6062 in the playback interface 600; rather, by converting the touch point A into the touch point A', the smart watch simulates the second touch operation of the user clicking the pause button 6062 in the playback interface 600.
  • the smart watch can send the aforementioned second touch event to the mobile phone.
  • the application framework layer of the mobile phone can report the second touch event to the music APP running in the application layer, so that the music APP can pause the audio being played in response to the second touch event at point A'. It is understandable that the music APP responding to the second touch event at point A' is equivalent to the music APP responding to the user's first touch event at point A in the screen projection interface 902.
  • the user inputs the first touch operation at point A(x, y) in the projection interface 902 of the target device, and the target device generates the first touch event corresponding to the first touch operation.
  • the destination device (or the source device) performs coordinate conversion on the coordinates of the touch point in the first touch event and generates a second touch event, so that, based on the second touch event, the music APP in the source device considers that the user has performed the second touch operation at point A'(x', y') of the playback interface 600.
  • the music APP can execute the corresponding application function in response to the second touch event, so as to realize reverse control of the source device by the destination device during screen projection.
  • alternatively, the smart watch (that is, the destination device) can also send the touch event carrying point A(x, y) to the mobile phone (that is, the source device); then, according to the above method, the application framework layer in the mobile phone restores point A(x, y) to point A'(x', y') in the playback interface 600 and reports the touch event with touch point A'(x', y') to the music APP in the phone, so as to realize the function of pausing audio playback.
  • the identification of each control may also be recorded in the aforementioned configuration file 1.
  • the control corresponding to the "dest1" field is the pause button 6062
  • the identification of the pause button 6062 is 001.
  • based on the coordinates of the touch points and the touch times in the series of detected touch events, the smart watch (i.e., the destination device) can determine that the user has performed a click operation on the pause button 6062 in the screen projection interface 902.
  • the smart watch may send the identification of the pause button 6062 (for example, 001) and the determined type of touch event (for example, a click operation) to the mobile phone (that is, the source device).
  • in this way, the mobile phone can determine that the user has clicked the pause button 6062. The application framework layer in the mobile phone can then report the event of the user clicking the pause button 6062 to the running music APP, so that the music APP can call the function corresponding to the pause button 6062 to pause the audio playback, that is, execute the operation instruction corresponding to the first touch event.
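The control-identifier variant above (send the control's identification and the event type instead of coordinates) can be sketched as a simple dispatch table on the source device. The ids, table, and function names here are illustrative assumptions, not the patent's actual interfaces.

```python
# Sketch of the control-identifier variant: the destination sends
# (control id, event type) and the source dispatches to the handler
# registered for that pair.
paused = []

handlers = {
    # 001 is the identification of the pause button 6062 in the example above.
    ("001", "click"): lambda: paused.append(True),
}

def on_remote_event(control_id, event_type):
    """Dispatch a (control id, event type) pair received from the destination."""
    handler = handlers.get((control_id, event_type))
    if handler is None:
        return False   # no control registered for this event
    handler()
    return True

print(on_remote_event("001", "click"))  # True: the pause handler ran
```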
  • alternatively, after the smart watch (that is, the destination device) determines that the user has performed a click operation on the pause button 6062 in the screen projection interface 902, it can also generate a corresponding touch event (for example, a third touch event) based on the specific position of the pause button 6062 in the playback interface 600 recorded in configuration file 1.
  • the event type of the third touch event is the same as the event type of the first touch event, and both are click events.
  • the coordinates B of the touch point in the third touch event can be located at any position within the pause button 6062 in the playback interface 600.
  • the mobile phone can also report the third touch event to the music APP running in the application layer, so that the music APP can pause the audio being played in response to the third touch event at point B.
  • the music APP responding to the third touch event of point B is equivalent to the music APP responding to the user's first touch event of point A on the screen projection interface 902.
  • the user can also input the corresponding touch operation in the playback interface 600 displayed on the mobile phone (ie, the source device).
  • after the mobile phone detects the touch event corresponding to the touch operation, the coordinates of the touch point can be converted as needed, and the touch event is reported to the music APP to realize the corresponding application function.
  • in this way, the user can input touch operations on the source device to control the source device to achieve corresponding functions, or input touch operations on the destination device to control the source device to achieve corresponding functions, thereby improving the user's touch experience in the screen projection scenario.
  • the source device can continue to use the above screen projection method to project the updated display interface to the destination device for display.
  • the embodiment does not impose any limitation on this.
  • the user can project the display content in one source device to multiple different destination devices for display. Then, according to the aforementioned touch control method, the user can input a corresponding touch operation in each target device to control the source device to implement related application functions.
  • the playback interface 1201 being displayed by the video APP can be simultaneously projected to two destination devices.
  • one destination device is a smart watch
  • the other destination device is a smart TV.
  • according to the configuration file 1 corresponding to the smart watch, the mobile phone can recognize that the first target controls in the playback interface 1201 that need to be projected to the smart watch are: the control bar 1205 and the controls 1206, 1207, and 1208 in the control bar 1205. Furthermore, as shown in FIG. 12, the mobile phone can project the control bar 1205 and the controls in the control bar 1205 to the smart watch to form a first projection interface 1301.
  • similarly, the mobile phone can recognize that the second target controls in the playback interface 1201 that need to be projected to the smart TV are: the video screen 1202 and the text control 1203 and progress bar 1204 in the video screen 1202. Furthermore, as still shown in FIG. 12, the mobile phone can project the video screen 1202 and the controls in the video screen 1202 to the smart TV to form a second projection interface 1302.
  • the user can input a touch operation in the first projection interface 1301 to control the video APP running in the mobile phone (ie, the source device).
  • the user can also input a touch operation in the second screen projection interface 1302 to control the video APP running in the mobile phone (ie, the source device).
  • the smart watch may generate a first touch event including the touch point P1.
  • the smart watch can convert the touch point P1 in the first screen projection interface 1301 to the touch point P1' in the playback interface 1201 according to the positional relationship of the pause button 1106 recorded in the configuration file 1 before and after the screen projection.
  • the smart watch can send the second touch event including the touch point P1' to the mobile phone, so that the video APP in the mobile phone can execute an instruction to pause the video in response to the second touch event with the touch point P1'.
  • correspondingly, after the smart TV (i.e., the second destination device) detects a touch operation input by the user, it can generate a first touch event that includes the touch point P2.
  • furthermore, the smart TV can convert the touch point P2 in the second screen projection interface 1302 into the touch point P2' in the aforementioned playback interface 1201 according to the positional relationship of the progress bar 1104 recorded in configuration file 2 before and after the screen projection.
  • the smart TV can send the second touch event including the touch point P2' to the mobile phone, so that the video APP in the mobile phone can respond to the second touch event with the touch point P2' to switch the video to the point P2' on the progress bar 1104 Play at the corresponding position.
  • the mobile phone can respond to each touch event in sequence according to the time sequence of receiving each touch event.
  • when each destination device detects a touch operation input by the user, it may also record the touch time of the touch operation.
  • when the destination device sends a corresponding touch event to the source device, it can also send the touch time of the touch event. In this way, the source device can respond to the touch events sent by different destination devices in the order of their touch times.
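The ordering rule above can be sketched as follows: each destination stamps its touch event with a touch time, and the source device sorts incoming events by that timestamp before responding. The device names and field names are illustrative assumptions.

```python
# Sketch of the ordering rule: respond to touch events from different
# destination devices in the order of their recorded touch times.
events = [
    {"source_of_event": "smart_tv",    "touch_time": 1002, "point": "P2'"},
    {"source_of_event": "smart_watch", "touch_time": 1001, "point": "P1'"},
]

ordered = sorted(events, key=lambda e: e["touch_time"])
print([e["source_of_event"] for e in ordered])  # ['smart_watch', 'smart_tv']
```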
  • in this way, the user can input a touch operation on any destination device to reversely control the source device to realize the control function corresponding to the touch operation, thereby improving the user's touch experience in the screen projection scenario.
  • the user can project the display content from multiple source devices to the same destination device for display. Then, according to the above touch method, after the user inputs a corresponding touch operation on a certain control in the target device, the user can control the source device corresponding to the control to implement related application functions.
  • the user can simultaneously use the mobile phone and the smart watch as the source device of the smart TV (that is, the destination device).
  • the mobile phone can project the display content in the lock screen interface 1501 being displayed to the smart TV
  • the smart watch can project the display content in the detection interface 1502 being displayed to the smart TV.
  • in addition, the smart TV can also display its own display content.
  • according to the configuration file 1 corresponding to the lock screen interface 1501, the mobile phone can identify that the first target controls in the lock screen interface 1501 that need to be projected to the smart TV are: the icon 1512 and the message content 1513 in the notification message 1511.
  • according to the configuration file 2 corresponding to the detection interface 1502, the smart watch can identify that the second target controls in the detection interface 1502 that need to be projected to the smart TV are: the heart rate information 1521 and the calorie information 1522.
  • after the smart TV receives the first target control sent by the mobile phone and the second target control sent by the smart watch, it can split and recombine the first target control, the second target control, and the control 1503 in its own display interface. Furthermore, as still shown in FIG. 15, the smart TV may display the above first target control, second target control, and control 1503 in the projection interface 1504. In this way, the destination device can simultaneously display the display content of multiple source devices.
  • the user can input touch operations on the corresponding controls in the projection interface 1504. If the user inputs a touch operation on the first target control in the projection interface 1504, the mobile phone (ie, the first source device) can be controlled to implement the corresponding function. If the user inputs a touch operation on the second target control in the projection interface 1504, the smart watch (ie, the second source device) can be controlled to implement the corresponding function.
  • the smart TV may generate a first touch event including the touch point Q1. Since the message content 1513 belongs to the first target control projected by the mobile phone (that is, the first source device), the smart TV can convert the touch point Q1 in the projection interface 1504 into the touch point Q1' in the lock screen interface 1501 according to the positional relationship of the message content 1513 recorded in configuration file 1 before and after the screen projection.
  • furthermore, the smart TV can send the second touch event including the touch point Q1' to the mobile phone, so that the mobile phone can expand the message content 1513 in response to the second touch event with the touch point Q1'.
  • correspondingly, after the smart TV (i.e., the destination device) detects a touch operation input by the user, it may generate a first touch event including the touch point Q2.
  • since the heart rate information 1521 belongs to the second target control projected by the smart watch (that is, the second source device), the smart TV can convert the touch point Q2 in the projection interface 1504 into the touch point Q2' in the detection interface 1502 according to the positional relationship of the heart rate information 1521 recorded in configuration file 2 before and after the screen projection.
  • furthermore, the smart TV can send the second touch event including the touch point Q2' to the smart watch, so that the smart watch can display the detailed content of the heart rate information 1521 in response to the second touch event with the touch point Q2'.
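The many-to-one routing described above can be sketched as a lookup from each projected control to the source device that projected it; the destination device then forwards the converted touch event to that source. All control and device names below are illustrative assumptions.

```python
# Sketch of many-to-one routing: the destination device remembers, for each
# projected control, which source device it came from.
routing = {
    "message_content_1513": "mobile_phone",  # projected per configuration file 1
    "heart_rate_info_1521": "smart_watch",   # projected per configuration file 2
}

def route_touch(control_id):
    """Return the source device that should receive events for this control."""
    return routing.get(control_id)

print(route_touch("heart_rate_info_1521"))  # smart_watch
print(route_touch("message_content_1513"))  # mobile_phone
```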
  • an electronic device in a certain conference site can be used as the destination device, and the electronic devices in the other conference sites can be used as source devices.
  • Each source device can project the target control to the target device for display according to the above method.
  • the user can input a corresponding control operation to the target control in the target device, thereby controlling the corresponding source device to respond to the control operation to implement reverse control during screen projection.
  • students can install the teaching assistant APP on their mobile phones or computers or tablets.
  • their electronic devices can be used as source devices to project the display content of the answer area to the teacher's mobile phone or computer or tablet for display.
  • the teacher can not only preview the answering process of multiple students in their respective answer areas in real time, but also remotely control a student's source device from his own electronic device, helping students solve problems online and improving the teaching effect of the teaching assistant APP.
  • the embodiment of the present application discloses an electronic device including a processor, and a memory, an input device, an output device, and a communication module connected to the processor.
  • the input device and the output device can be integrated into one device.
  • a touch sensor can be used as an input device
  • a display screen can be used as an output device
  • the touch sensor and display screen can be integrated into a touch screen.
  • the above electronic device may include: a touch screen 1801, which includes a touch sensor 1806 and a display screen 1807; one or more processors 1802; a memory 1803; a communication module 1808; one or more Application programs (not shown); and one or more computer programs 1804.
  • the above-mentioned devices can be connected through one or more communication buses 1805.
  • the one or more computer programs 1804 are stored in the aforementioned memory 1803 and configured to be executed by the one or more processors 1802; the one or more computer programs 1804 include instructions, and the instructions can be used to execute the steps in the foregoing embodiments.
  • all relevant content of the steps involved in the above method embodiments can be cited in the functional description of the corresponding physical device, which will not be repeated here.
  • the foregoing processor 1802 may specifically be the processor 110 shown in FIG. 3, the foregoing memory 1803 may specifically be the internal memory 121 and/or the external memory 120 shown in FIG. 3, and the foregoing display screen 1807 may specifically be the display screen shown in FIG. 3.
  • the touch sensor 1806 may be the touch sensor in the sensor module 180 shown in FIG. 3
  • the communication module 1808 may be the mobile communication module 150 and/or the wireless communication module 160 shown in FIG. 3. The embodiment of this application does not impose any restriction on this.
  • the functional units in the various embodiments of the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • a computer-readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: flash memory, removable hard disk, read-only memory, random access memory, magnetic disk, optical disk, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to the technical field of terminals. Provided are a method for touch control in a screen casting scenario, and an electronic apparatus allowing a destination apparatus to receive and respond to a control operation performed by a user with respect to a screen casting interface, thereby improving the touch control experience of a user in a screen casting scenario. The method comprises: a source apparatus displaying a first display interface; in response to a screen casting instruction input by a user, the source apparatus projecting N controls on the first display interface to a screen casting interface displayed by a first destination apparatus, where N is an integer greater than 0; the source apparatus receiving a first touch event sent by the first destination apparatus; and the source apparatus executing an operation instruction corresponding to the first touch event.

Description

Touch control method and electronic device in a screen projection scenario
This application claims priority to Chinese Patent Application No. 201910487623.9, filed with the State Intellectual Property Office of China on June 5, 2019 and entitled "Touch control method and electronic device in a screen projection scenario", the entire content of which is incorporated herein by reference.
Technical field
This application relates to the field of terminal technology, and in particular to a touch control method and electronic device in a screen projection scenario.
Background
With the development of smart home technology, a user or family often has multiple electronic devices that can communicate with each other. Various electronic devices generally have their own device characteristics. For example, mobile phones are more portable, TV screens have better display effects, and speakers have better sound quality. To give full play to the characteristics of different electronic devices, an electronic device can switch and display multimedia data among multiple devices by means of screen projection.
Exemplarily, when a user watches a video with a video application on a mobile phone, the mobile phone can be set as the source device, and the display interface of the source device can be sent to another destination device that supports the screen projection function for display. When the user needs to operate the current display interface of the video application, he still needs to perform the corresponding operation on the mobile phone (i.e., the source device) to update the display data of the mobile phone, and the mobile phone then projects the updated display data to the destination device for display.
Consequently, when the source device is not at hand or it is inconvenient for the user to operate the source device, the user cannot control the display interface being projected, resulting in a poor user experience during projection display.
Summary of the invention
This application provides a touch control method and electronic device in a screen projection scenario, whereby the destination device can receive and respond to control operations performed by the user on the projection interface, thereby improving the user's touch experience in the screen projection scenario.
To achieve the above objectives, this application adopts the following technical solutions:
In a first aspect, this application provides a touch control method in a screen projection scenario, including: a source device displays a first display interface; in response to a screen projection instruction input by a user, the source device projects N controls (N is an integer greater than 0) in the first display interface onto a projection interface displayed by a first destination device; subsequently, if the source device receives a first touch event sent by the first destination device, the source device can execute an operation instruction corresponding to the first touch event. In this way, in a screen projection scenario, the destination device can generate a touch event in response to a touch operation input by the user and send the touch event to the source device to implement the corresponding function, realizing reverse control of the source device and thereby improving the user's touch experience in the screen projection scenario.
Exemplarily, the first touch event may include the coordinates of the touch point and the type of the touch event (for example, a click, a double tap, or a slide).
In a possible implementation, after the source device receives the first touch event sent by the first destination device, the method further includes: the source device determines a target control corresponding to the first touch event, the target control being one of the N controls; in this case, the operation instruction executed by the source device is the operation instruction corresponding to the target control being triggered on the source device. For example, when the target control corresponding to the first touch event is a play button, the operation instruction corresponding to the first touch event is the operation instruction issued when the play button is triggered.
Exemplarily, the first touch event may be a touch event generated by the first destination device when the user inputs a first touch operation in the projection interface, where the first touch operation is a touch operation actually input by the user during screen projection.
For another example, the first touch event may also be: after the first destination device generates a touch event (for example, a fifth touch event) in response to a touch operation input by the user in the projection interface, the first destination device maps the fifth touch event to a touch event in the first display interface.
In a possible implementation, the source device may store a configuration file corresponding to the first display interface, in which the display positions of the N controls in the first display interface and the display positions of the N controls in the projection interface are recorded; in this case, the source device determining the target control corresponding to the first touch event specifically includes: the source device determines the target control corresponding to the first touch event according to the display positions of the N controls in the projection interface recorded in the configuration file.
For example, when the coordinates of the touch point in the first touch event (that is, the first coordinates) fall within the display position of a first control recorded in the configuration file, the source device may determine the first control as the target control.
For another example, when the first coordinates fall within both the display position of the first control and the display position of a second control recorded in the configuration file, the source device may determine the first control, which is located on the top layer, as the target control.
In a possible implementation, after the source device determines the target control corresponding to the first touch event, the method further includes: the source device maps the first touch event to a second touch event according to the configuration file, where the second touch event is the touch event that the source device would generate if the user input a second touch operation on the target control in the first display interface. It should be noted that the user does not actually input the second touch operation in the first display interface; the source device maps out the second touch event corresponding to the second touch operation based on the first touch event.
In this case, the source device executing the operation instruction corresponding to the first touch event means: the source device reports the mapped second touch event to a first application (the first display interface being displayed by the source device is an interface of the first application), so that the first application executes the operation instruction corresponding to the second touch event. For example, if the coordinates of the touch point in the first touch event are A and the coordinates of the touch point in the mapped second touch event are B, then the source device executing the operation instruction corresponding to the first touch event is actually the first application responding to the second touch event with coordinates B. In this way, the user's touch operation on the projection interface can also reversely control the related application in the source device to implement the corresponding function.
在一种可能的实现方式中，源设备根据上述配置文件，将第一触摸事件映射为第二触摸事件，包括：源设备根据配置文件中记录的目标控件在第一显示界面中的第一显示位置以及在该投屏界面中的第二显示位置之间的对应关系，将第一触摸事件中的第一坐标转换为第二坐标，得到第二触摸事件。例如，源设备可根据配置文件中记录的目标控件在投屏前后发生的平移、缩放或旋转等变化，反向推算在第一显示界面中与上述第一坐标对应的第二坐标。In a possible implementation, the source device mapping the first touch event to the second touch event according to the above configuration file includes: the source device converts the first coordinate in the first touch event into a second coordinate according to the correspondence, recorded in the configuration file, between the first display position of the target control in the first display interface and the second display position of the target control in the projection interface, to obtain the second touch event. For example, the source device may reversely calculate the second coordinate corresponding to the above first coordinate in the first display interface according to the translation, scaling, or rotation that the target control underwent during projection, as recorded in the configuration file.
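The reverse coordinate calculation described above can be sketched as follows. This is an illustrative sketch only: it assumes the configuration file records, for each projected control, the translation and uniform scale applied during projection; the type and field names are hypothetical and not part of the claims.

```python
from dataclasses import dataclass, replace

@dataclass
class TouchEvent:
    x: float       # touch-point coordinate on the screen that produced the event
    y: float
    action: str    # event type, e.g. "down", "move", "up"

@dataclass
class ControlMapping:
    # Hypothetical record from the configuration file: how the target
    # control was moved and scaled when projected to the destination device.
    dx: float
    dy: float
    scale: float

def map_first_to_second(first: TouchEvent, m: ControlMapping) -> TouchEvent:
    # Reverse the projection transform: convert the first coordinate
    # (in the projection interface) into the second coordinate
    # (in the source device's first display interface).
    return replace(first,
                   x=(first.x - m.dx) / m.scale,
                   y=(first.y - m.dy) / m.scale)

# A control projected with a 2x zoom and a (300, 50) translation: a tap at
# (500, 450) on the projection interface maps back to (100, 200) at the source.
second = map_first_to_second(TouchEvent(500.0, 450.0, "down"),
                             ControlMapping(300.0, 50.0, 2.0))
```

Rotation would add one more inverse step in the same function; only translation and scale are shown here to keep the sketch minimal.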
在一种可能的实现方式中，源设备确定出与第一触摸事件对应的目标控件后，还可以将该目标控件的标识和第一触摸事件的事件类型上报给第一应用，以使得第一应用执行第一功能，即第一应用在目标控件被上述事件类型所指示的操作触发时所对应的功能，从而实现用户在投屏界面中反向控制源设备中的相关应用的功能。In a possible implementation, after the source device determines the target control corresponding to the first touch event, it may also report the identifier of the target control and the event type of the first touch event to the first application, so that the first application executes a first function, that is, the function that the first application performs when the target control is triggered by the operation indicated by the event type, thereby allowing the user to reversely control the related application in the source device from the projection interface.
在一种可能的实现方式中，在源设备确定与第一触摸事件对应的目标控件之后，还包括：源设备根据上述配置文件中记录的目标控件在第一显示界面中的第一显示位置生成第三触摸事件，第三触摸事件的事件类型与第一触摸事件的事件类型相同，第三触摸事件中的第三坐标位于第一显示位置内；此时，源设备执行与第一触摸事件对应的操作指令，包括：源设备将第三触摸事件上报给第一应用。也就是说，源设备将用户在投屏界面中的第一触摸事件转换为了在第一显示界面中的第三触摸事件，那么，源设备响应该第三触摸事件的过程实际为源设备响应该第一触摸事件的过程。In a possible implementation, after the source device determines the target control corresponding to the first touch event, the method further includes: the source device generates a third touch event according to the first display position, recorded in the configuration file, of the target control in the first display interface; the event type of the third touch event is the same as that of the first touch event, and the third coordinate in the third touch event falls within the first display position. In this case, the source device executing the operation instruction corresponding to the first touch event includes: the source device reports the third touch event to the first application. That is, the source device converts the user's first touch event on the projection interface into a third touch event on the first display interface, so the process of the source device responding to the third touch event is in fact the process of the source device responding to the first touch event.
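The third-touch-event construction can be sketched as follows. This is a minimal illustration under the assumption that the configuration file exposes the control's recorded rectangle; picking the rectangle's center is merely one simple way to choose a coordinate that falls inside the first display position, and the names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def make_third_event(first_event_type: str, first_display_position: Rect) -> dict:
    # Same event type as the first touch event, with a coordinate that
    # falls inside the control's first display position on the source
    # interface; here we simply use the rectangle's center point.
    x = (first_display_position.left + first_display_position.right) / 2
    y = (first_display_position.top + first_display_position.bottom) / 2
    return {"action": first_event_type, "x": x, "y": y}

# A "down" on the projected control is replayed as a "down" at the center
# of that control's recorded position in the first display interface.
third = make_third_event("down", Rect(40, 100, 140, 160))
```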
需要说明的是，上述实现方式中是以源设备确定第一触摸事件的目标控件，并将第一触摸事件映射为第二触摸事件举例说明的。可以理解的是，目的设备在产生第一触摸事件后，也可以按照上述方法确定第一触摸事件的目标控件，并将第一触摸事件映射为第二触摸事件。进而，目的设备可将映射后的第二触摸事件发送给源设备，再由源设备将该第二触摸事件上报给第一应用，使得第一应用执行与第二触摸事件对应的操作指令，也即与第一触摸事件对应的操作指令。It should be noted that the foregoing implementations are described using the example in which the source device determines the target control of the first touch event and maps the first touch event to the second touch event. It can be understood that, after generating the first touch event, the destination device may likewise determine the target control of the first touch event and map the first touch event to the second touch event according to the above method. The destination device may then send the mapped second touch event to the source device, and the source device reports the second touch event to the first application, so that the first application executes the operation instruction corresponding to the second touch event, that is, the operation instruction corresponding to the first touch event.
在一种可能的实现方式中，在源设备显示第一显示界面之后，还包括：响应于用户输入的第二投屏指令，源设备将第一显示界面中的M(M为大于0的整数)个控件投射至第二目的设备中显示；源设备接收第二目的设备发送的第四触摸事件；源设备执行与第四触摸事件对应的操作指令。也就是说，当源设备将其显示界面中的显示内容同时投射至多个目的设备中显示时，用户可在任意目的设备中输入触摸操作反向控制源设备实现与该触摸操作对应的控制功能，从而提高用户在投屏场景下的触控使用体验。In a possible implementation, after the source device displays the first display interface, the method further includes: in response to a second screen-casting instruction input by the user, the source device projects M (M is an integer greater than 0) controls in the first display interface to a second destination device for display; the source device receives a fourth touch event sent by the second destination device; and the source device executes an operation instruction corresponding to the fourth touch event. That is, when the source device projects the display content of its display interface to multiple destination devices at the same time, the user can input a touch operation on any destination device to reversely control the source device and implement the control function corresponding to that touch operation, thereby improving the user's touch experience in the screen-casting scenario.
第二方面，本申请提供一种投屏场景下的触控方法，包括：目的设备接收第一源设备发送的第一消息，第一消息中包括第一目标控件的绘制指令，第一目标控件为第一源设备显示的第一显示界面中的一个或多个控件；目的设备调用第一目标控件的绘制指令绘制投屏界面，该投屏界面中包括第一目标控件；响应于用户在投屏界面中向第一目标控件输入的第一触摸操作，目的设备生成第一触摸事件；目的设备指示第一源设备执行与第一触摸事件对应的操作指令。In a second aspect, this application provides a touch method in a screen-casting scenario, including: a destination device receives a first message sent by a first source device, where the first message includes a drawing instruction of a first target control, and the first target control is one or more controls in a first display interface displayed by the first source device; the destination device invokes the drawing instruction of the first target control to draw a projection interface that includes the first target control; in response to a first touch operation input by the user to the first target control in the projection interface, the destination device generates a first touch event; and the destination device instructs the first source device to execute an operation instruction corresponding to the first touch event.
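Before generating the first touch event, the destination device needs to determine which projected control the first touch operation landed on. A minimal hit-test sketch follows; the control table, identifiers, and coordinates are illustrative assumptions, not part of the claimed method.

```python
from typing import Optional

# Bounding boxes (left, top, right, bottom) of the controls drawn in the
# projection interface, keyed by a hypothetical control identifier.
projected_controls = {
    "play_button":  (0, 0, 100, 60),
    "progress_bar": (0, 80, 400, 100),
}

def hit_test(x: float, y: float) -> Optional[str]:
    # Return the identifier of the first target control containing the
    # touch point, or None if the touch missed every projected control.
    for name, (left, top, right, bottom) in projected_controls.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

hit = hit_test(50, 90)   # lands inside the progress bar's bounding box
```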
在一种可能的实现方式中，目的设备指示第一源设备执行与第一触摸事件对应的操作指令，包括：目的设备将第一触摸事件发送给第一源设备，以使得第一源设备执行与第一触摸事件对应的操作指令。第一源设备接收到该第一触摸事件后，可按照上述第一方面中的方法执行与第一触摸事件对应的操作指令。In a possible implementation, the destination device instructing the first source device to execute the operation instruction corresponding to the first touch event includes: the destination device sends the first touch event to the first source device, so that the first source device executes the operation instruction corresponding to the first touch event. After receiving the first touch event, the first source device may execute the operation instruction corresponding to the first touch event according to the method in the first aspect.
在一种可能的实现方式中，在目的设备生成第一触摸事件之后，还包括：目的设备将第一触摸事件映射为第二触摸事件，第二触摸事件为：用户在第一显示界面中向第一目标控件输入第二触摸操作时第一源设备将产生的触摸事件。其中，目的设备将第一触摸事件映射为第二触摸事件的方法，与第一方面中源设备将第一触摸事件映射为第二触摸事件的方法相同。In a possible implementation, after the destination device generates the first touch event, the method further includes: the destination device maps the first touch event to a second touch event, where the second touch event is the touch event that the first source device would generate if the user input a second touch operation to the first target control in the first display interface. The method by which the destination device maps the first touch event to the second touch event is the same as the method by which the source device maps the first touch event to the second touch event in the first aspect.
此时，目的设备指示第一源设备执行与第一触摸事件对应的操作指令，包括：目的设备将映射后的第二触摸事件发送给第一源设备，以使得第一源设备执行与第二触摸事件对应的操作指令。In this case, the destination device instructing the first source device to execute the operation instruction corresponding to the first touch event includes: the destination device sends the mapped second touch event to the first source device, so that the first source device executes the operation instruction corresponding to the second touch event.
在一种可能的实现方式中，上述方法还包括：目的设备接收第二源设备发送的第二消息，第二消息中包括第二目标控件的绘制指令，第二目标控件为第二源设备显示的第二显示界面中的一个或多个控件；目的设备调用第二目标控件的绘制指令在该投屏界面中绘制第二目标控件；响应于用户在该投屏界面中向第二目标控件输入的第三触摸操作，目的设备生成第三触摸事件；目的设备指示第二源设备执行与第三触摸事件对应的操作指令。也就是说，当多个源设备同时将其显示界面中的显示内容投射至同一个目的设备中显示时，用户可在目的设备中对不同源设备投射来的控件输入触摸操作，从而控制相应的源设备实现与该触摸操作对应的功能，从而提高用户在投屏场景下的触控使用体验。In a possible implementation, the above method further includes: the destination device receives a second message sent by a second source device, where the second message includes a drawing instruction of a second target control, and the second target control is one or more controls in a second display interface displayed by the second source device; the destination device invokes the drawing instruction of the second target control to draw the second target control in the projection interface; in response to a third touch operation input by the user to the second target control in the projection interface, the destination device generates a third touch event; and the destination device instructs the second source device to execute an operation instruction corresponding to the third touch event. That is, when multiple source devices project the display content of their display interfaces to the same destination device at the same time, the user can input touch operations in the destination device on controls projected from different source devices, thereby controlling the corresponding source device to implement the function corresponding to the touch operation and improving the user's touch experience in the screen-casting scenario.
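When several source devices project controls into the same projection interface, the destination device must route each touch event to the source device that owns the hit control. The sketch below assumes the destination device keeps an ownership table built while handling the first and second messages; the table contents and the send callback are hypothetical.

```python
# Hypothetical ownership table: which source device sent the drawing
# instruction for each control shown in the projection interface.
control_owner = {
    "video_view": "first_source_device",
    "thermostat_dial": "second_source_device",
}

def dispatch(touch_event: dict, hit_control: str, send) -> None:
    # Forward the touch event to the source device that projected the hit
    # control, so that source executes the corresponding operation instruction.
    send(control_owner[hit_control], touch_event)

sent = []
dispatch({"action": "down", "x": 10, "y": 20}, "thermostat_dial",
         lambda device, event: sent.append((device, event["action"])))
```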
第三方面，本申请提供一种电子设备，包括：触摸屏、一个或多个处理器、一个或多个存储器、以及一个或多个计算机程序；其中，处理器与触摸屏以及存储器均耦合，上述一个或多个计算机程序被存储在存储器中，当电子设备运行时，该处理器执行该存储器存储的一个或多个计算机程序，以使电子设备执行上述任一项所述的投屏场景下的触控方法。In a third aspect, the present application provides an electronic device, including: a touch screen, one or more processors, one or more memories, and one or more computer programs; the processor is coupled with both the touch screen and the memory, and the one or more computer programs are stored in the memory. When the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device performs the touch method in the screen-casting scenario described in any one of the above.
第四方面，本申请提供一种计算机存储介质，包括计算机指令，当计算机指令在电子设备上运行时，使得电子设备执行如第一方面中任一项所述的投屏场景下的触控方法。In a fourth aspect, the present application provides a computer storage medium including computer instructions, which, when run on an electronic device, cause the electronic device to perform the touch method in the screen-casting scenario described in any one of the first aspect.
第五方面，本申请提供一种计算机程序产品，当计算机程序产品在电子设备上运行时，使得电子设备执行如第一方面中任一项所述的投屏场景下的触控方法。In a fifth aspect, this application provides a computer program product, which, when run on an electronic device, causes the electronic device to perform the touch method in the screen-casting scenario described in any one of the first aspect.
第六方面，本申请提供一种触控系统，该系统中可包括至少一个源设备和至少一个目的设备；源设备可用于执行如第一方面中任一项所述的投屏场景下的触控方法，目的设备可用于执行如第二方面中任一项所述的投屏场景下的触控方法。In a sixth aspect, the present application provides a touch system, which may include at least one source device and at least one destination device; the source device may be configured to perform the touch method in the screen-casting scenario described in any one of the first aspect, and the destination device may be configured to perform the touch method in the screen-casting scenario described in any one of the second aspect.
可以理解地，上述提供的第三方面所述的电子设备、第四方面所述的计算机存储介质、第五方面所述的计算机程序产品以及第六方面所述的触控系统均用于执行上文所提供的对应的方法，因此，其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果，此处不再赘述。It can be understood that the electronic device of the third aspect, the computer storage medium of the fourth aspect, the computer program product of the fifth aspect, and the touch system of the sixth aspect provided above are all configured to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above; details are not repeated here.
附图说明Description of the drawings
图1为本申请实施例提供的一种通信系统的架构图一;FIG. 1 is an architecture diagram 1 of a communication system provided by an embodiment of this application;
图2为本申请实施例提供的一种通信系统的架构图二;FIG. 2 is a second structural diagram of a communication system provided by an embodiment of this application;
图3为本申请实施例提供的一种电子设备的结构示意图一;FIG. 3 is a first structural diagram of an electronic device according to an embodiment of the application;
图4为本申请实施例提供的一种电子设备内操作系统的架构图;4 is a structural diagram of an operating system in an electronic device provided by an embodiment of the application;
图5为本申请实施例提供的一种投屏场景下的触控方法的场景示意图一;FIG. 5 is a first schematic diagram of a touch method in a projection scene provided by an embodiment of the application;
图6为本申请实施例提供的一种投屏场景下的触控方法的场景示意图二;6 is a schematic diagram of a second scene of a touch method in a projection scene provided by an embodiment of the application;
图7为本申请实施例提供的一种投屏场景下的触控方法的场景示意图三;FIG. 7 is a scene schematic diagram 3 of a touch method in a projection scene provided by an embodiment of the application;
图8为本申请实施例提供的一种投屏场景下的触控方法的场景示意图四;FIG. 8 is a scene schematic diagram 4 of a touch method in a projection scene provided by an embodiment of the application;
图9为本申请实施例提供的一种投屏场景下的触控方法的场景示意图五;9 is a schematic diagram five of a touch method in a projection scene provided by an embodiment of the application;
图10为本申请实施例提供的一种投屏场景下的触控方法的场景示意图六;FIG. 10 is a scene schematic diagram 6 of a touch method in a projection scene provided by an embodiment of the application;
图11为本申请实施例提供的一种投屏场景下的触控方法的场景示意图七;FIG. 11 is a scene schematic diagram 7 of a touch method in a projection scene provided by an embodiment of this application;
图12为本申请实施例提供的一种投屏场景下的触控方法的场景示意图八;FIG. 12 is a scene schematic diagram eight of a touch method in a projection scene provided by an embodiment of this application;
图13为本申请实施例提供的一种投屏场景下的触控方法的场景示意图九;FIG. 13 is a scene schematic diagram 9 of a touch method in a projection scene provided by an embodiment of the application;
图14为本申请实施例提供的一种投屏场景下的触控方法的场景示意图十;14 is a schematic diagram ten of a touch method in a projection scene provided by an embodiment of the application;
图15为本申请实施例提供的一种投屏场景下的触控方法的场景示意图十一;15 is a schematic eleventh scene of a touch method in a projection scene provided by an embodiment of this application;
图16为本申请实施例提供的一种投屏场景下的触控方法的场景示意图十二;16 is a schematic diagram 12 of a touch method in a projection scene provided by an embodiment of this application;
图17为本申请实施例提供的一种投屏场景下的触控方法的场景示意图十三;FIG. 17 is a schematic diagram 13 of a touch method in a projection scene provided by an embodiment of the application;
图18为本申请实施例提供的一种电子设备的结构示意图二。FIG. 18 is a second structural diagram of an electronic device provided by an embodiment of this application.
具体实施方式Detailed description of embodiments
下面将结合附图对本实施例的实施方式进行详细描述。The implementation of this embodiment will be described in detail below with reference to the accompanying drawings.
如图1所示,本申请实施例提供的一种投屏场景下的触控方法可应用于通信系统100,通信系统100中可以包括N(N>1)个电子设备。例如,通信系统100中可包括电子设备101和电子设备102。As shown in FIG. 1, a touch method in a projection scene provided by an embodiment of the present application can be applied to a communication system 100, and the communication system 100 may include N (N>1) electronic devices. For example, the communication system 100 may include an electronic device 101 and an electronic device 102.
示例性地,电子设备101可以通过一个或多个通信网络104与电子设备102连接。Exemplarily, the electronic device 101 may be connected to the electronic device 102 through one or more communication networks 104.
通信网络104可以是有线网络，也可以是无线网络。例如，上述通信网络104可以是局域网(local area networks,LAN)，也可以是广域网(wide area networks,WAN)，例如互联网。该通信网络104可使用任何已知的网络通信协议来实现，上述网络通信协议可以是各种有线或无线通信协议，诸如以太网、通用串行总线(universal serial bus,USB)、火线(FIREWIRE)、全球移动通讯系统(global system for mobile communications,GSM)、通用分组无线服务(general packet radio service,GPRS)、码分多址接入(code division multiple access,CDMA)、宽带码分多址(wideband code division multiple access,WCDMA)、时分码分多址(time-division code division multiple access,TD-SCDMA)、长期演进(long term evolution,LTE)、蓝牙、无线保真(wireless fidelity,Wi-Fi)、NFC、基于互联网协议的语音通话(voice over Internet protocol,VoIP)、支持网络切片架构的通信协议或任何其他合适的通信协议。示例性地，在一些实施例中，电子设备101可以通过Wi-Fi协议与电子设备102建立Wi-Fi连接。The communication network 104 may be a wired network or a wireless network. For example, the communication network 104 may be a local area network (LAN) or a wide area network (WAN) such as the Internet. The communication network 104 may be implemented using any known network communication protocol, which may be any of various wired or wireless communication protocols, such as Ethernet, universal serial bus (USB), FireWire, global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), Bluetooth, wireless fidelity (Wi-Fi), NFC, voice over Internet protocol (VoIP), a communication protocol supporting a network slicing architecture, or any other suitable communication protocol. Exemplarily, in some embodiments, the electronic device 101 may establish a Wi-Fi connection with the electronic device 102 through the Wi-Fi protocol.
示例性的,电子设备101可以作为源设备,电子设备102可以作为目的设备,电子设备101(即源设备)可将其显示界面中的显示内容投射至电子设备102(即目的设备)中显示。当然,也可以将电子设备102作为源设备,由电子设备102将其显示界面中的显示内容投射至电子设备101(即目的设备)中显示。Exemplarily, the electronic device 101 may be used as a source device, the electronic device 102 may be used as a destination device, and the electronic device 101 (ie, the source device) may project the display content in its display interface to the electronic device 102 (ie, the destination device) for display. Of course, the electronic device 102 can also be used as a source device, and the electronic device 102 projects the display content in its display interface to the electronic device 101 (that is, the destination device) for display.
仍如图1所示,上述通信系统100中还可以包括电子设备103等其他一个或多个电子设备,例如,电子设备103可以为可穿戴设备。示例性的,电子设备103也可以作为源设备或目的设备进行投屏显示。As still shown in FIG. 1, the communication system 100 described above may further include one or more other electronic devices such as an electronic device 103. For example, the electronic device 103 may be a wearable device. Exemplarily, the electronic device 103 may also be used as a source device or a destination device for projection display.
示例性的，如图2中的(a)所示，以电子设备101为源设备举例，电子设备102和电子设备103均可作为电子设备101的目的设备。电子设备101可将其显示界面中的显示内容同时投射至电子设备102和电子设备103中显示。也就是说，一个源设备可以同时向多个目的设备进行投屏显示。Exemplarily, as shown in (a) of FIG. 2, taking the electronic device 101 as the source device, both the electronic device 102 and the electronic device 103 can serve as destination devices of the electronic device 101. The electronic device 101 can simultaneously project the display content in its display interface to the electronic device 102 and the electronic device 103 for display. In other words, one source device can project to multiple destination devices for display at the same time.
又或者,如图2中的(b)所示,以电子设备101为目的设备举例,电子设备102和电子设备103均可作为电子设备101的源设备。此时,电子设备102和电子设备103可同时将其显示界面中的显示内容投射至电子设备101中显示。也就是说,一个目的设备可以同时接收并显示多个源设备发来的显示内容。Or, as shown in (b) in FIG. 2, taking the electronic device 101 as an example of the destination device, both the electronic device 102 and the electronic device 103 can be used as the source device of the electronic device 101. At this time, the electronic device 102 and the electronic device 103 can simultaneously project the display content in their display interfaces to the electronic device 101 for display. In other words, a destination device can simultaneously receive and display the display content sent by multiple source devices.
需要说明的是，源设备向目的设备投射显示内容时，可以将其显示界面中的所有控件均投射至目的设备中显示，也可以将其显示界面中的部分控件投射至目的设备中显示，本申请实施例对此不做任何限制。另外，上述通信系统100中的任意电子设备均可作为源设备或目的设备，本申请实施例对此不做任何限制。It should be noted that when the source device projects display content to the destination device, it may project all the controls in its display interface to the destination device for display, or project only some of the controls in its display interface to the destination device for display; the embodiments of this application do not impose any restriction on this. In addition, any electronic device in the communication system 100 can serve as a source device or a destination device, which is not limited in the embodiments of this application.
在一些实施例中,上述电子设备101、电子设备102以及电子设备103的具体结构可以是相同的,也可以是不同的。In some embodiments, the specific structures of the electronic device 101, the electronic device 102, and the electronic device 103 may be the same or different.
例如，上述各个电子设备具体可以是手机、平板电脑、智能电视、可穿戴电子设备、车机、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、手持计算机、上网本、个人数字助理(personal digital assistant,PDA)、虚拟现实设备等，本申请实施例对此不做任何限制。For example, each of the above electronic devices may specifically be a mobile phone, a tablet computer, a smart TV, a wearable electronic device, an in-vehicle device, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a virtual reality device, etc.; the embodiments of this application do not impose any restriction on this.
以电子设备101举例,图3示出了电子设备101的结构示意图。Taking the electronic device 101 as an example, FIG. 3 shows a schematic structural diagram of the electronic device 101.
电子设备101可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,摄像头193,显示屏194等。The electronic device 101 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, and an antenna 2. , Mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone interface 170D, sensor module 180, camera 193, display screen 194, etc.
可以理解的是,本发明实施例示意的结构并不构成对电子设备101的具体限定。在本申请另一些实施例中,电子设备101可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。It can be understood that the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 101. In other embodiments of the present application, the electronic device 101 may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange different components. The illustrated components can be implemented in hardware, software, or a combination of software and hardware.
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit, NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), and an image signal processor. (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (NPU), etc. Among them, the different processing units may be independent devices or integrated in one or more processors.
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。A memory may also be provided in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory can store instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided, the waiting time of the processor 110 is reduced, and the efficiency of the system is improved.
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。In some embodiments, the processor 110 may include one or more interfaces. The interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (PCM) interface, and a universal asynchronous transmitter receiver/transmitter, UART) interface, mobile industry processor interface (MIPI), general-purpose input/output (GPIO) interface, subscriber identity module (SIM) interface, and / Or Universal Serial Bus (USB) interface, etc.
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备101的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。The charging management module 140 is used to receive charging input from the charger. Among them, the charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of the wired charger through the USB interface 130. In some embodiments of wireless charging, the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 101. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160. The power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance). In some other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be provided in the same device.
电子设备101的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。The wireless communication function of the electronic device 101 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
天线1和天线2用于发射和接收电磁波信号。电子设备101中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 101 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example, antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna can be used in combination with a tuning switch.
移动通信模块150可以提供应用在电子设备101上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括一个或多个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。The mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 101. The mobile communication module 150 may include one or more filters, switches, power amplifiers, low noise amplifiers (LNA), etc. The mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering and amplifying the received electromagnetic waves, and then transmitting them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1. In some embodiments, at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate a low-frequency baseband signal to be sent into a medium- or high-frequency signal. The demodulator is used to demodulate a received electromagnetic wave signal into a low-frequency baseband signal, and then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, and the like), or displays an image or video through the display screen 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110, and may be disposed in the same device as the mobile communication module 150 or other functional modules.

The wireless communication module 160 can provide solutions for wireless communication applied to the electronic device 101, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating one or more communication processing modules. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive a signal to be sent from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves that are radiated out via the antenna 2.

In some embodiments, the antenna 1 of the electronic device 101 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 101 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
The electronic device 101 implements a display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.

The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 101 may include 1 or N display screens 194, where N is a positive integer greater than 1.

The electronic device 101 can implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when taking a photo, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin color of the image through algorithms, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 193.

The camera 193 is used to capture still images or videos. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 101 may include 1 or N cameras 193, where N is a positive integer greater than 1.

The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 101 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy.

The video codec is used to compress or decompress digital video. The electronic device 101 may support one or more video codecs, so that the electronic device 101 can play or record videos in a variety of encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 101. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, saving files such as music and videos in the external memory card.

The internal memory 121 may be used to store one or more computer programs, and the one or more computer programs include instructions. The processor 110 can run the instructions stored in the internal memory 121 to enable the electronic device 101 to execute the methods provided in some embodiments of this application, as well as various functional applications, data processing, and the like. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system, and can also store one or more applications (such as Gallery and Contacts). The data storage area can store data (such as photos and contacts) created during the use of the electronic device 101. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or universal flash storage (UFS). In other embodiments, the processor 110 runs the instructions stored in the internal memory 121 and/or the instructions stored in a memory disposed in the processor to cause the electronic device 101 to execute the methods provided in the embodiments of this application, as well as various functional applications and data processing.
The electronic device 101 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.

The audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal. The audio module 170 can also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.

The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The electronic device 101 can play music or take a hands-free call through the speaker 170A.

The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 101 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the ear.

The microphone 170C, also called a "mic" or "mike", is used to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. The electronic device 101 may be provided with one or more microphones 170C. In other embodiments, the electronic device 101 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 101 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and so on.

The earphone interface 170D is used to connect wired earphones. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.

In addition, the above electronic device may further include one or more components such as buttons, a motor, an indicator, and a SIM card interface, which is not limited in the embodiments of this application.

The software system of the electronic device 101 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of this application take the Android system with a layered architecture as an example to describe the software structure of the electronic device 101.
FIG. 4 is a block diagram of the software structure of the electronic device 101 according to an embodiment of this application.

The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
1. Application layer

The application layer may include a series of applications.

As shown in FIG. 4, the above applications may include apps such as Phone, Contacts, Camera, Gallery, Calendar, Maps, Navigation, Bluetooth, Music, Video, and Messages.
2. Application framework layer

The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer includes some predefined functions.

As shown in FIG. 4, the application framework layer may include a view system, a notification manager, an activity manager, a window manager, a content provider, a resource manager, an input manager, and so on.

The view system can be used to construct the display interface of an application. Each display interface may consist of one or more controls. Generally speaking, controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets. The controls in a display interface can be organized hierarchically according to a tree structure to form a complete ViewTree (view tree). The view system can draw the display interface according to its ViewTree; each control in the display interface corresponds to a set of drawing instructions, such as DrawLine, DrawPoint, and DrawBitmap.
For example, (a) in FIG. 5 shows the chat interface 401 of the WeChat app. The bottommost control in the chat interface 401 is the root node, under which a basemap 402 control is provided. The basemap 402 further includes the following controls: a title bar 403, a chat background 404, and an input bar 405. The title bar 403 further includes a back button 406 and a title 407; the chat background 404 further includes an avatar 408 and a bubble 409; and the input bar 405 further includes a voice input button icon 410, an input box 411, and a send button 412.

The above controls, layered in order, form the view tree A shown in (b) in FIG. 5. The basemap 402 is a child node of the root node, and the title bar 403, the chat background 404, and the input bar 405 are all child nodes of the basemap 402. The back button 406 and the title 407 are child nodes of the title bar 403. The avatar 408 and the bubble 409 are child nodes of the chat background 404. The voice input button icon 410, the input box 411, and the send button 412 are all child nodes of the input bar 405. When displaying the chat interface 401, the view system can, according to the layer relationship between the controls in the view tree A, call the drawing instructions of the corresponding controls layer by layer starting from the root node to draw each control, finally forming the chat interface 401.
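The layer-by-layer drawing described above amounts to a pre-order traversal of the view tree: each control is drawn before its children, so child controls appear on top of their parent layer. The following minimal sketch illustrates that traversal over view tree A; the class and method names are illustrative only and are not part of the disclosed implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of view tree A: nodes are drawn from the root down,
// so each control's draw command is emitted before its children's.
class ViewNode {
    final String name;                       // e.g. "title bar 403"
    final List<ViewNode> children = new ArrayList<>();

    ViewNode(String name) { this.name = name; }

    ViewNode add(ViewNode child) { children.add(child); return child; }

    // Pre-order traversal: collect a "draw" command for this control,
    // then recurse into its children.
    void collectDrawCommands(List<String> out) {
        out.add("draw " + name);
        for (ViewNode c : children) c.collectDrawCommands(out);
    }
}

public class ViewTreeDemo {
    public static void main(String[] args) {
        ViewNode root = new ViewNode("root");
        ViewNode basemap = root.add(new ViewNode("basemap 402"));
        ViewNode titleBar = basemap.add(new ViewNode("title bar 403"));
        titleBar.add(new ViewNode("back button 406"));
        titleBar.add(new ViewNode("title 407"));
        ViewNode background = basemap.add(new ViewNode("chat background 404"));
        background.add(new ViewNode("avatar 408"));
        background.add(new ViewNode("bubble 409"));
        ViewNode inputBar = basemap.add(new ViewNode("input bar 405"));
        inputBar.add(new ViewNode("voice input button icon 410"));
        inputBar.add(new ViewNode("input box 411"));
        inputBar.add(new ViewNode("send button 412"));

        List<String> commands = new ArrayList<>();
        root.collectDrawCommands(commands);
        for (String cmd : commands) System.out.println(cmd);
    }
}
```

Running the sketch lists the twelve controls of chat interface 401 in the order in which the view system would issue their drawing instructions.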
In the embodiments of this application, if the electronic device 101 is the source device, when the electronic device 101 casts a screen to the destination device, the view system can split, prune, or recombine the controls in the view tree of the current display interface, so as to determine one or more target controls that need to be projected to the destination device for display this time. The electronic device 101 can then project the determined target controls to the destination device to form a projection interface, thereby adapting to device characteristics such as the display size of the destination device, and improving the display effect and user experience on the destination device in the screen casting scenario.

In addition, in the embodiments of this application, after the electronic device 101 projects the target controls in its display interface to the projection interface of the destination device (for example, the electronic device 102), the user can input a corresponding touch operation on a target control in the projection interface on the electronic device 102, to control the electronic device 101 to implement the function corresponding to that touch operation.

Exemplarily, for an app running in the application layer (taking application A as an example), obtaining a touch operation input by the user on the touch screen is a process of distributing messages layer by layer from the bottom layer to the top layer.

When the user's finger touches the touch screen, the touch screen obtains information about the touch operation (for example, the coordinates of the touch point). The touch screen then reports, through the corresponding driver and in the form of an interrupt, the original touch event generated by the touch operation to the kernel layer. As shown in FIG. 4, after obtaining the original touch event, the kernel layer can encapsulate it into a high-level touch event that the upper layers can read (for example, an action down event, an action move event, or an action up event) and send the high-level touch event to the framework layer. The framework layer can then report the high-level touch event to the application process of the running application A in the application layer. The application process of application A calls the corresponding library functions to determine the specific control on which the high-level touch event acts, as well as the event type of the high-level touch event; for example, the event type may include click, double-click, or slide. Taking the user clicking a play button as an example: after application A determines that the control on which this high-level touch event acts is the play button and that the event type is a click, the process of application A can call the callback function corresponding to this touch event to implement the application function corresponding to this touch operation.
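As a rough illustration of this bottom-up dispatch, the sketch below wraps a touch position into a high-level event, resolves which control the event acts on, and invokes the matching callback. All class names, and the flat hit-testing over a list of control bounds, are simplified assumptions; the real framework dispatches events through the view tree.

```java
// A high-level touch event as handed from the kernel layer to the
// framework layer: an action type plus the touch-point coordinates.
enum Action { DOWN, MOVE, UP }

class TouchEvent {
    final Action action;
    final int x, y;
    TouchEvent(Action action, int x, int y) {
        this.action = action; this.x = x; this.y = y;
    }
}

// A control with its on-screen bounds (top-left corner plus size).
class Control {
    final String name;
    final int left, top, width, height;
    Control(String name, int left, int top, int width, int height) {
        this.name = name; this.left = left; this.top = top;
        this.width = width; this.height = height;
    }
    boolean contains(int x, int y) {
        return x >= left && x < left + width && y >= top && y < top + height;
    }
}

public class TouchDispatchDemo {
    // The app resolves the target control and invokes its callback;
    // here the "callback" is represented by the returned string.
    static String dispatch(TouchEvent e, Control[] controls) {
        for (Control c : controls) {
            if (c.contains(e.x, e.y)) {
                return "onClick:" + c.name;
            }
        }
        return "unhandled";
    }

    public static void main(String[] args) {
        Control playButton = new Control("playButton", 100, 200, 80, 80);
        TouchEvent up = new TouchEvent(Action.UP, 120, 240);
        System.out.println(dispatch(up, new Control[]{ playButton }));
    }
}
```

In the play-button example above, the dispatched event lands inside the button's bounds, so the button's click callback is the one invoked.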
In the embodiments of this application, still as shown in FIG. 4, a coordinate conversion module can be provided in the application framework layer of the destination device. The original touch event reported by the touch screen of the destination device to the kernel layer includes the coordinates (x, y) of the touch point, where (x, y) is the user's touch position in the projection interface. Similarly, the touch point (x, y) in the high-level touch event reported by the kernel layer to the framework layer is also the user's touch position in the projection interface. After the framework layer receives the high-level touch event, its coordinate conversion module can map the coordinates (x, y) to the corresponding coordinates (x', y') in the display interface of the source device. The destination device can then send the high-level touch event carrying the coordinates (x', y') to the source device, and the framework layer of the source device reports this high-level touch event to the app that is casting the screen. After the app receives the high-level touch event with touch point (x', y'), it is equivalent to receiving a touch event generated by the user at the touch coordinates (x', y') on the source device; the app can therefore respond to the touch event carrying the coordinates (x', y') to implement the corresponding application function.

That is to say, after the user inputs, on the destination device, a first touch operation with touch point (x, y) into the projection interface, the destination device can generate a first touch event carrying the coordinates (x, y). The destination device can then map the first touch event to a second touch event whose touch point is (x', y') in the display interface of the source device. In this way, after the source device receives the second touch event sent by the destination device, it can execute the corresponding application function in response to the second touch event, implementing the destination device's reverse control of the display interface of the source device after screen casting.

Of course, the above coordinate conversion module may instead be provided in the framework layer of the source device. In this case, the destination device can send the first touch event with touch point (x, y) to the source device, and the coordinate conversion module of the source device maps the first touch event to the second touch event with touch point (x', y') and responds to the second touch event by executing the corresponding application function, which is not limited in the embodiments of this application.
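The text does not fix a concrete mapping formula at this point. Under the natural assumption that a projected control is a linearly scaled copy of its source control, the conversion from a touch point (x, y) in the projection interface to (x', y') in the source display interface can be sketched as follows; the class and parameter names are illustrative only.

```java
public class CoordinateMapper {

    // Maps a touch point (x, y) inside a control's bounds in the destination
    // (projection) interface to the corresponding point (x', y') inside the
    // matching control's bounds in the source display interface, assuming a
    // simple linear scaling between the two rectangles.
    static int[] toSourceCoords(int x, int y,
                                int dstLeft, int dstTop, int dstWidth, int dstHeight,
                                int srcLeft, int srcTop, int srcWidth, int srcHeight) {
        int xPrime = srcLeft + (x - dstLeft) * srcWidth / dstWidth;
        int yPrime = srcTop + (y - dstTop) * srcHeight / dstHeight;
        return new int[] { xPrime, yPrime };
    }

    public static void main(String[] args) {
        // A tap at (50, 30) inside a 100x60 projected control, whose source
        // counterpart is a 200x120 rectangle anchored at (300, 500).
        int[] p = toSourceCoords(50, 30, 0, 0, 100, 60, 300, 500, 200, 120);
        System.out.println("(x', y') = (" + p[0] + ", " + p[1] + ")");  // (400, 560)
    }
}
```

In the arrangement where the conversion module sits on the destination device, this mapping runs before the event is forwarded to the source device; when the module sits on the source device instead, the same computation runs there after the first touch event arrives.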
It should be noted that the above embodiment is illustrated with the touch screen detecting and generating a single touch event, and with the coordinates of the touch point in that touch event being converted. It is understandable that when the user inputs a click, long-press, or sliding touch operation on the touch screen, the touch screen may detect a series of touch events. For each touch event, the destination device (or the source device) can convert the coordinates of the touch point in the touch event according to the above method, which is not limited in the embodiments of this application.

In addition, the above activity manager can be used to manage the life cycle of each application. Applications usually run in the operating system in the form of activities, and the activity manager can schedule the activity processes of applications to manage each application's life cycle. The window manager is used to manage window programs; it can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, and so on. The content provider is used to store and retrieve data and make the data accessible to applications; the data may include videos, images, audio, calls made and received, browsing history and bookmarks, a phone book, and so on. The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.
3. Android runtime and system libraries

The Android runtime includes core libraries and a virtual machine, and is responsible for the scheduling and management of the Android system.

The core libraries consist of two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.

The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.

The system libraries may include multiple functional modules, for example, a surface manager, media libraries, a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).

The surface manager is used to manage the display subsystem, and provides the blending of 2D and 3D layers for multiple applications. The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files; they can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, compositing, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
4. Kernel layer

The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver, which is not limited in the embodiments of this application.

The following describes in detail, with reference to the accompanying drawings, a method for touch control in a screen casting scenario provided by the embodiments of this application.
Exemplarily, taking a mobile phone as the source device and a smart watch as the destination device, the mobile phone can project one or more controls in its display interface to the smart watch for display.

As shown in FIG. 6, if the mobile phone displays the playback interface 600 of a music app after turning on the screen casting function to the smart watch, the display content in the playback interface 600 needs to be projected to the smart watch for display at this time. Exemplarily, the playback interface 600 includes the following controls: a basemap 601, a status bar 602, a title bar 603, an album cover 604, lyrics 605, and a control bar 606. The status bar 602 includes controls such as the time, signal strength, and battery level. The title bar 603 includes controls such as a song title 6031 and an artist 6032. The control bar 606 includes controls such as a progress bar 6061, a pause button 6062, a previous button 6063, and a next button 6064.

The mobile phone can then obtain the view tree corresponding to the playback interface 600 drawn by the view system, as well as the drawing instructions and drawing resources of each control in the view tree. For example, FIG. 7 shows the view tree 701 of the above playback interface 600. The view tree 701 records the layer relationship between the controls in the playback interface 600. In the view tree 701, the root node of the playback interface 600 has the basemap 601 as a child node, and the status bar 602, the title bar 603, the album cover 604, the lyrics 605, and the control bar 606 are all child nodes of the basemap 601. The song title 6031 and the artist 6032 are child nodes of the title bar 603. The progress bar 6061, the pause button 6062, the previous button 6063, and the next button 6064 are child nodes of the control bar 606.
Based on the view tree 701 of the above playback interface 600, the mobile phone can further determine one or more controls (that is, target controls) in the playback interface 600 that need to be projected to the smart watch for display.

Exemplarily, a configuration file corresponding to the above playback interface 600 may be preset in the mobile phone. Alternatively, the mobile phone may obtain the configuration file corresponding to the playback interface 600 from a server. The configuration file records the one or more controls (that is, target controls) in the playback interface 600 that the mobile phone needs to project onto the smart watch.

The above configuration file may be stored in the mobile phone or in the server in a format such as the JSON (JavaScript Object Notation) format, the XML (Extensible Markup Language) format, or a text format, which is not limited in the embodiments of this application.

Exemplarily, the configuration file 1 corresponding to the playback interface 600 may be:
Figure PCTCN2020093908-appb-000001
It can be seen that the configuration file 1 contains multiple "src" fields (for example, the "src1" field and the "src2" field mentioned above). Each "src" field records the specific position of one control in the playback interface 600. For example, the position of each control can be uniquely determined by the values of four parameters: left, top, width, and height. Here, left is the x-axis coordinate of the top-left vertex of the control, top is the y-axis coordinate of the top-left vertex of the control, width is the width of the control, and height is the height of the control. The one or more controls recorded in the configuration file 1 are the target controls that the mobile phone needs to project to the smart watch for display.
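The content of configuration file 1 appears only as a figure above. Purely for illustration, a minimal sketch of the "src" fields consistent with this description might look as follows; the field names left/top/width/height come from the text, while the overall JSON shape and the concrete values are assumptions:

```python
import json

# Hypothetical sketch of configuration file 1; only the field names
# left/top/width/height come from the description, everything else is assumed.
config1 = json.loads("""
{
  "src1": {"left": 80,  "top": 1000, "width": 200, "height": 200},
  "src2": {"left": 300, "top": 1000, "width": 200, "height": 200}
}
""")

# Each "src" entry uniquely determines a control's rectangle in the source
# interface: (left, top) is the top-left vertex, plus width and height.
rect = config1["src1"]
print(rect["left"], rect["top"], rect["width"], rect["height"])  # 80 1000 200 200
```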
Then, according to the position of each control recorded in the "src" fields of the configuration file 1, the mobile phone can identify, based on the view tree 701, the target controls in the playback interface 600 that need to be projected for display on the destination device 500. For example, the target controls include: the song title 6031 and the artist 6041 in the title bar 603; the pause button 6062, the previous button 6063, and the next button 6064 in the control bar 606; and the album cover 604.
In some embodiments, the configuration file 1 may also record the specific display position of each target control in the projection interface after screen casting. For example, a "dest1" field corresponding to the "src1" field may be set in the configuration file 1, where the "dest1" field indicates the display position of the control 1 on the destination device. Exemplarily, the "dest1" field is as follows:
Figure PCTCN2020093908-appb-000002
Then, according to the "dest" fields in the configuration file 1, the mobile phone can determine the display position, in the projection interface of the smart watch, of each target control of the playback interface 600 after screen casting. Exemplarily, as shown in (a) of FIG. 8, the playback interface 600 of the mobile phone (i.e., the source device) is located in a first coordinate system, and the control 1 recorded in the "src1" field is located, before screen casting, in an area 801 of the first coordinate system. As shown in (b) of FIG. 8, the projection interface of the smart watch (i.e., the destination device) is located in a second coordinate system, and the control 1 recorded in the "dest1" field is located, after screen casting, in an area 802 of the second coordinate system. Any position in the area 801 uniquely corresponds to one position in the area 802.
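The one-to-one correspondence between area 801 and area 802 can be sketched as a normalized linear mapping between the two rectangles; the patent does not fix a particular formula, so both the mapping and the rectangle values below are assumptions made for illustration:

```python
def map_point(p, src_rect, dest_rect):
    """Map a point from a source rectangle to the corresponding point in a
    destination rectangle via normalized coordinates.
    Rectangles are given as (left, top, width, height)."""
    x, y = p
    sl, st, sw, sh = src_rect
    dl, dt, dw, dh = dest_rect
    # Normalize within the source rectangle, then rescale into the destination.
    return (dl + (x - sl) / sw * dw, dt + (y - st) / sh * dh)

# Hypothetical area 801 (first coordinate system) and area 802 (second).
area_801 = (80, 1000, 200, 200)
area_802 = (20, 120, 100, 100)

# The center of area 801 maps to the center of area 802.
print(map_point((180, 1100), area_801, area_802))  # (70.0, 170.0)
```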
In other embodiments, the configuration file 1 may also record how the display position of a target control changes between before and after screen casting. For example, the following fields are additionally set for the control 1 in the configuration file 1:
Figure PCTCN2020093908-appb-000003
Here, the "translationx" and "translationy" fields respectively indicate the translation distances of the control 1 on the x-axis and the y-axis after screen casting; the "scalex" and "scaley" fields respectively indicate the scaling ratios of the control 1 on the x-axis and the y-axis after screen casting; the "rotatedegree" field indicates the rotation angle of the control 1 after screen casting; and the "order" field indicates the layer position of the control 1 after screen casting (for example, whether it lies in the bottommost layer or the topmost layer).
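These transform fields are likewise shown only as a figure. Purely for illustration, such a record might look as follows; the field names come from the text, while the enclosing JSON structure and the values are assumptions (the values are chosen to match the worked pause-button example given later):

```python
import json

# Hypothetical transform record for control 1; field names are from the
# description, the enclosing structure and the values are assumed.
transform1 = json.loads("""
{
  "translationx": -20,
  "translationy": 30,
  "scalex": 1.5,
  "scaley": 1.5,
  "rotatedegree": 0,
  "order": 2
}
""")

def apply(p, t):
    """Apply translation then scaling to a point in the source coordinate
    system, yielding its position in the projection coordinate system."""
    return ((p[0] + t["translationx"]) * t["scalex"],
            (p[1] + t["translationy"]) * t["scaley"])

print(apply((100, 20), transform1))  # (120.0, 75.0)
```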
Similarly, based on the change in the display position of the control 1 before and after screen casting recorded in the above fields, the mobile phone can also determine the display position of the control 1 in the projection interface of the smart watch after screen casting; that is, it determines both the position of the control 1 in the first coordinate system and the position of the control 1 in the second coordinate system.
Exemplarily, after the mobile phone identifies the target controls in the playback interface 600 and the specific display position of each target control after screen casting, it can split, prune, and recombine the view tree 701 of the playback interface 600 to generate the view tree 901 of the projection interface to be displayed on the smart watch. As shown in (a) of FIG. 9, in the view tree 901, the mobile phone has deleted the nodes of the view tree 701 that are not target controls, such as the base map 601, the status bar 602 and the controls in the status bar 602, and the progress bar 6061 in the control bar 606. Moreover, if the configuration file 1 records that, after screen casting, the target controls of the title bar 603 and the control bar 606 are located on the layer above the album cover 604, then in the view tree 901 the mobile phone can set the song title 6031 and the artist 6041 of the title bar 603 as child nodes of the album cover 604, and likewise set the pause button 6062, the previous button 6063, and the next button 6064 of the control bar 606 as child nodes of the album cover 604.
Furthermore, the mobile phone (i.e., the source device) can send a UI message to the smart watch (i.e., the destination device) through the communication network 104, where the UI message includes the view tree 901 as well as the drawing instructions and drawing resources related to each control in the view tree 901.
After receiving the UI message corresponding to the playback interface 600, the smart watch can invoke the drawing instruction of each target control in the view tree 901 in sequence, following the hierarchy and order of the view tree 901, and draw each target control at the position specified in the configuration file 1. Finally, as shown in (b) of FIG. 9, the smart watch can draw the projection interface 902 that results from casting the playback interface 600. The controls in the projection interface 902 correspond one-to-one to the controls in the view tree 901.
It can be seen that, when casting the playback interface 600 to the smart watch for display, the mobile phone can split, delete, and recombine the controls in the playback interface 600, so that the projection interface 902 finally displayed on the smart watch fits the display size of the smart watch's screen and the user's usage needs, thereby improving the display effect and user experience when casting between multiple devices.
Still taking the example in which the mobile phone casts the playback interface 600 to the smart watch, which displays the projection interface 902: the user can input a touch operation on any projected target control in the projection interface 902, and the smart watch can generate a corresponding touch event in response to the touch operation; the smart watch can then control the mobile phone to implement the function corresponding to the touch event. For example, the touch event may include the coordinates of the touch point and the event type of the touch event (for example, tap, double-tap, or swipe).
For example, when the user wants the music APP to pause the song being played, the user can tap the pause button 6062 in the projection interface 902. For another example, when the user wants the music APP to play the previous song, the user can tap the previous button 6063 in the projection interface 902; and when the user wants the music APP to play the next song, the user can tap the next button 6064 in the projection interface 902.
Taking the user tapping the pause button 6062 in the projection interface 902 as an example, as shown in FIG. 10, the projection interface 902 displayed by the smart watch is located in the second coordinate system. The touch sensor of the smart watch can detect, in real time, touch operations input by the user on the projection interface 902. When it detects that the user's finger touches the projection interface 902, the touch sensor of the smart watch can encapsulate the detected touch information (for example, the coordinate information of the touch point A and the touch time) as a first touch event, and report the first touch event to the kernel layer of the smart watch. The first touch event is generated by the smart watch in response to the first touch operation, namely the user tapping the pause button 6062 in the projection interface 902.
Taking the coordinates of the touch point as A(x, y) as an example, the touch sensor can, through its driver, encapsulate the detected touch operation as a first raw touch event and report it to the kernel layer; the kernel layer then encapsulates the raw touch event as a first advanced touch event readable by upper layers and reports it to the application framework layer. After receiving the first advanced touch event carrying the coordinates A(x, y), the application framework layer can determine which target control the user touched this time according to the display position, recorded in the configuration file 1, of each control in the projection interface 902.
For example, the "dest1" field in the configuration file 1 records that the pause button 6062 is located in area 1 of the projection interface 902. Then, when the coordinates A(x, y) fall within area 1, the smart watch can determine that the target control acted on by this touch operation is the pause button 6062.
In some embodiments, the coordinates A(x, y) may fall within two controls at the same time. For example, the coordinates A(x, y) may lie both within the area where the pause button 6062 is located and within the area where the album cover 604 is located. In this case, the smart watch can determine the topmost of these controls as the target control acted on by this touch operation, according to the "order" fields recorded in the configuration file 1.
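A hit test following this rule can be sketched as follows; the control records and "order" values are hypothetical (including the convention that a larger "order" is closer to the top), and only the rule itself, picking the topmost control among those containing the point, comes from the text:

```python
# Hypothetical per-control records from a configuration file: a display
# rectangle (left, top, width, height) in the projection interface plus a
# layer "order" (assumed: larger order = closer to the top).
controls = {
    "album_cover_604": {"rect": (0, 0, 200, 200),  "order": 1},
    "pause_6062":      {"rect": (80, 150, 40, 40), "order": 2},
}

def hit_test(point, controls):
    """Return the name of the topmost control containing the point, or None."""
    x, y = point
    hits = [
        (c["order"], name)
        for name, c in controls.items()
        if c["rect"][0] <= x < c["rect"][0] + c["rect"][2]
        and c["rect"][1] <= y < c["rect"][1] + c["rect"][3]
    ]
    return max(hits)[1] if hits else None

# A point inside both controls resolves to the topmost one.
print(hit_test((100, 160), controls))  # pause_6062
```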
Taking the user tapping the pause button 6062 in the projection interface 902 as an example, after the application framework layer determines that this first touch event is a touch event on the pause button 6062, it can restore, according to the positional relationship of the pause button 6062 before and after screen casting recorded in the configuration file 1, the touch point A' in the first coordinate system of the mobile phone (i.e., the source device) that corresponds to the touch point A.
Exemplarily, when the mobile phone projected the pause button 6062 to the smart watch, it performed one or more operations such as translation, scaling, or rotation on the pause button 6062 in the first coordinate system, forming the pause button 6062 in the second coordinate system. Then, when restoring the touch point A' corresponding to the touch point A, the smart watch can perform the corresponding inverse translation, inverse scaling, or inverse rotation on the coordinates A(x, y), thereby restoring the point A'(x', y') in the playback interface 600 of the mobile phone that corresponds to the coordinates A(x, y).
For example, as shown in (a) of FIG. 11, when the playback interface 600 is displayed in the first coordinate system of the mobile phone (i.e., the source device), the coordinates of point A' on the pause button 6062 are A'(100, 20). When projecting the pause button 6062, as shown in (b) of FIG. 11, the pause button 6062 is translated 20 units in the negative direction of the x-axis and 30 units in the positive direction of the y-axis, and the pause button 6062 is enlarged by a factor of 1.5. After the pause button 6062 is projected into the second coordinate system of the projection interface 902, the coordinates of the point A on the pause button 6062 corresponding to the point A' are A((100-20)×1.5, (20+30)×1.5), that is, A(120, 75). Then, after the smart watch detects that the user inputs a touch event at point A of the pause button 6062, the smart watch can divide the coordinates of point A on the x-axis and the y-axis by 1.5, then translate the x-axis coordinate back by 20 units and the y-axis coordinate back by 30 units in the opposite directions, obtaining the coordinates A'(100, 20) in the first coordinate system that correspond to the coordinates A(120, 75).
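Using the numbers of this worked example, the forward mapping (translate, then scale) and its inverse (unscale, then untranslate) can be sketched as:

```python
def to_projection(p, tx, ty, s):
    """First (source) coordinate system -> second (projection) coordinate
    system: translate by (tx, ty), then scale by s."""
    return ((p[0] + tx) * s, (p[1] + ty) * s)

def to_source(p, tx, ty, s):
    """Inverse mapping: divide by the scale, then undo the translation."""
    return (p[0] / s - tx, p[1] / s - ty)

# Worked example from the text: translation (-20, +30), scale 1.5.
a_prime = (100, 20)
a = to_projection(a_prime, -20, 30, 1.5)
print(a)                           # (120.0, 75.0), i.e. point A
print(to_source(a, -20, 30, 1.5))  # (100.0, 20.0), i.e. point A' restored
```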
Exemplarily, when the configuration file 1 records the translation distances of the pause button 6062 on the x-axis and the y-axis, the smart watch can reverse the translation of this touch point A according to those translation distances. When the configuration file 1 records the scaling ratios of the pause button 6062 on the x-axis and the y-axis, the smart watch can reverse the scaling of this touch point A according to those scaling ratios. When the configuration file 1 records the rotation angle of the pause button 6062, the smart watch can reverse the rotation of this touch point A according to that rotation angle.
Alternatively, the smart watch may preset a coordinate mapping formula between the first coordinate system and the second coordinate system. In this way, after obtaining the touch point A of this touch event, the smart watch can calculate, according to the coordinate mapping formula, the touch point A' in the first coordinate system of the mobile phone that corresponds to the touch point A.
After the smart watch restores the touch point A' in the playback interface 600 that corresponds to the touch point A in the projection interface 902, it can replace the coordinates A(x, y) of the touch point A in the first touch event with the coordinates A'(x', y') of the touch point A', forming a second touch event. Here, the second touch event refers to the touch event that the mobile phone would generate if the user input, in the playback interface 600, the second touch operation of tapping the pause button 6062. It can be understood that, in this embodiment of the present application, the user did not actually tap the pause button 6062 in the playback interface 600; rather, by converting the touch point A into the touch point A', the smart watch simulates the second touch operation of the user tapping the pause button 6062 in the playback interface 600.
Furthermore, the smart watch can send the second touch event to the mobile phone. The application framework layer of the mobile phone can report the second touch event to the music APP running in the application layer, so that the music APP can pause the audio being played in response to the second touch event at point A'. It can be understood that the music APP responding to the second touch event at point A' is equivalent to the music APP responding to the user's first touch event at point A in the projection interface 902.
That is, the user inputs the first touch operation at point A(x, y) in the projection interface 902 of the destination device, and the destination device generates the first touch event corresponding to the first touch operation. The destination device (or the source device) performs coordinate conversion on the coordinates of the touch point in the first touch event to generate the second touch event, so that, based on the second touch event, the music APP in the source device considers that the user performed the second touch operation at point A'(x', y') of the playback interface 600. The music APP can then execute the corresponding application function in response to the second touch event, thereby realizing reverse control of the source device by the destination device during screen casting.
In other embodiments, the smart watch (i.e., the destination device) may instead send the touch event carrying point A(x, y) to the mobile phone (i.e., the source device); the application framework layer in the mobile phone then restores point A(x, y) to point A'(x', y') in the playback interface 600 according to the method described above, and reports the touch event whose touch point is A'(x', y') to the music APP in the mobile phone, implementing the function of pausing audio playback.
In other embodiments, an identifier of each control may also be recorded in the configuration file 1; for example, the control corresponding to the "dest1" field is the pause button 6062, whose identifier is 001. Then, based on information such as the touch point coordinates and touch times in a series of detected touch events, the smart watch (i.e., the destination device) can determine that the user performed a tap operation on the pause button 6062 in the projection interface 902. The smart watch can then send the identifier of the pause button 6062 (for example, 001) and the determined type of the touch event (for example, a tap operation) to the mobile phone (i.e., the source device). The mobile phone can thereby determine that the user performed the event of tapping the pause button 6062; the application framework layer in the mobile phone can then report the event of the user tapping the pause button 6062 to the running music APP, so that the music APP invokes the function corresponding to tapping the pause button 6062 to implement the function of pausing audio playback, that is, to execute the operation instruction corresponding to the first touch event.
Alternatively, after the smart watch (i.e., the destination device) determines that the user performed a tap operation on the pause button 6062 in the projection interface 902, it may also generate a corresponding touch event (for example, a third touch event) according to the specific position of the pause button 6062 in the playback interface 600 recorded in the configuration file 1. The event type of the third touch event is the same as that of the first touch event; both are tap events. The coordinates B of the touch point in the third touch event may be located anywhere within the pause button 6062 in the playback interface 600. In this way, after the smart watch sends the generated third touch event to the mobile phone (i.e., the source device), the mobile phone can likewise report the third touch event to the music APP running in the application layer, so that the music APP can pause the audio being played in response to the third touch event at point B. Similarly, the music APP responding to the third touch event at point B is equivalent to the music APP responding to the user's first touch event at point A in the projection interface 902.
Of course, the user can also input a corresponding touch operation in the playback interface 600 displayed on the mobile phone (i.e., the source device). After the mobile phone detects the touch event corresponding to that touch operation, it can report the touch event to the music APP to implement the corresponding application function, without any conversion of the touch point coordinates.
That is, in a screen casting scenario, the user can input a touch operation on the source device to control the source device to implement the corresponding function, and can also input a touch operation on the destination device to control the source device to implement the corresponding function, thereby improving the user's touch experience in the screen casting scenario.
In addition, if the source device updates its display in response to a touch operation input by the user on the source device or the destination device, the source device can continue to use the above screen casting method to project the updated display interface to the destination device for display; the embodiments of the present application impose no limitation on this.
In some usage scenarios, the user can project the display content of one source device to multiple different destination devices for display. Then, following the above touch control method, the user can input a corresponding touch operation on each destination device to control the source device to implement the related application function.
Exemplarily, as shown in FIG. 12, after the mobile phone, acting as the source device, enables the screen casting function, it can cast the playback interface 1201 being displayed by the video APP to two destination devices simultaneously: one destination device is a smart watch and the other is a smart TV.
Similar to the above screen casting method, the mobile phone can identify, according to the configuration file 1 corresponding to the smart watch, that the first target controls of the playback interface 1201 to be projected to the smart watch for display are: the control bar 1205 and the controls 1206, 1207, and 1208 in the control bar 1205. Then, as shown in FIG. 12, the mobile phone can project the control bar 1205 and the controls in the control bar 1205 to the smart watch, forming a first projection interface 1301.
Meanwhile, the mobile phone can identify, according to the configuration file 2 corresponding to the smart TV, that the second target controls of the playback interface 1201 to be projected to the smart TV for display are: the video picture 1202 and the text control 1203 and progress bar 1204 in the video picture 1202. Then, still as shown in FIG. 12, the mobile phone can project the video picture 1202 and the controls in the video picture 1202 to the smart TV, forming a second projection interface 1302.
Then, the user can input a touch operation in the first projection interface 1301 to control the video APP running in the mobile phone (i.e., the source device), and the user can likewise input a touch operation in the second projection interface 1302 to control the video APP running in the mobile phone (i.e., the source device).
Exemplarily, as shown in FIG. 13, if the smart watch (i.e., the first destination device) detects that the user taps the pause button 1106 in the first projection interface 1301, the smart watch can generate a first touch event containing the touch point P1. Moreover, the smart watch can convert the touch point P1 in the first projection interface 1301 into the touch point P1' in the playback interface 1201 according to the positional relationship of the pause button 1106 before and after screen casting recorded in the configuration file 1. The smart watch can then send the second touch event containing the touch point P1' to the mobile phone, so that the video APP in the mobile phone can execute the instruction to pause the video in response to the second touch event at touch point P1'.
Exemplarily, as shown in FIG. 14, if the smart TV (i.e., the second destination device) detects that the user drags the progress bar 1104 in the second projection interface 1302, the smart TV can generate a first touch event containing the touch point P2. Moreover, the smart TV can convert the touch point P2 in the second projection interface 1302 into the touch point P2' in the playback interface 1201 according to the positional relationship of the progress bar 1104 before and after screen casting recorded in the configuration file 2. The smart TV can then send the second touch event containing the touch point P2' to the mobile phone, so that the video APP in the mobile phone can, in response to the second touch event at touch point P2', switch the video to play from the position corresponding to point P2' on the progress bar 1104.
In addition, if the mobile phone receives not only touch events sent by the first destination device but also touch events sent by the second destination device, the mobile phone can respond to each touch event in turn according to the order in which the touch events were received. Alternatively, when each destination device detects a touch operation input by the user, it may also record the touch time of that touch operation; then, when sending the corresponding touch event to the source device, the destination device can also send the touch time of the touch event. In this way, the source device can respond to the touch events sent by different destination devices in order of their touch times.
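This time-ordered dispatch can be sketched as follows; the event shape, the timestamp values, and the device names are hypothetical:

```python
# Hypothetical touch events received from two destination devices; each
# carries the touch time recorded by the destination device itself.
received = [
    {"device": "smart_tv",    "touch_time": 1002, "point": (330, 40)},
    {"device": "smart_watch", "touch_time": 1001, "point": (120, 75)},
]

def dispatch(events):
    """Respond to touch events in order of their touch times, not in the
    order the source device happened to receive them."""
    return [e["device"] for e in sorted(events, key=lambda e: e["touch_time"])]

print(dispatch(received))  # ['smart_watch', 'smart_tv']
```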
可以看出,当源设备将其显示界面中的显示内容同时投射至多个目的设备中显示时,用 户可在任意目的设备中输入触摸操作反向控制源设备实现与该触摸操作对应的控制功能,从而提高用户在投屏场景下的触控使用体验。It can be seen that when the source device simultaneously projects the display content in its display interface to multiple destination devices for display, the user can input a touch operation in any destination device to reversely control the source device to realize the control function corresponding to the touch operation. So as to improve the user's touch experience in the projection scene.
在一些使用场景下,用户可以将多个源设备中的显示内容投射到同一个目的设备中显示。那么,按照上述触控方法,用户在目的设备中对某一控件输入相应的触摸操作后,可控制与该控件对应的源设备实现相关的应用功能。In some usage scenarios, the user can project the display content from multiple source devices to the same destination device for display. Then, according to the above touch method, after the user inputs a corresponding touch operation on a certain control in the target device, the user can control the source device corresponding to the control to implement related application functions.
For example, as shown in FIG. 15, the user may use a mobile phone and a smart watch simultaneously as source devices of a smart TV (that is, the destination device). The mobile phone may project the display content of the lock screen interface 1501 it is displaying to the smart TV, while the smart watch may project the display content of the detection interface 1502 it is displaying to the smart TV. Of course, in addition to displaying the content projected from the mobile phone and the smart watch, the smart TV may also display its own display picture.
Similar to the screen projection method above, the mobile phone may identify, according to configuration file 1 corresponding to the lock screen interface 1501, that the first target controls in the lock screen interface 1501 to be projected to the smart TV are the icon 1512 and the message content 1513 in the notification message 1511. Likewise, the smart watch may identify, according to configuration file 2 corresponding to the detection interface 1502, that the second target controls in the detection interface 1502 to be projected to the smart TV are the heart rate information 1521 and the calorie information 1522.
After receiving the first target controls sent by the mobile phone and the second target controls sent by the smart watch, the smart TV may split and recombine the first target controls, the second target controls, and the control 1503 in its own display interface. Then, still as shown in FIG. 15, the smart TV may display the first target controls, the second target controls, and the control 1503 in a projection interface 1504. In this way, the destination device can display content from multiple source devices at the same time.
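One way to picture the destination-side bookkeeping this requires is the sketch below: as controls arrive from each source device, the combined interface records which device owns each one, so that a later touch can be routed back correctly. All names (`ProjectionInterface`, `add_controls`) are hypothetical and not from the application:

```python
class ProjectionInterface:
    """Destination-side sketch: controls received from several source
    devices, plus the destination's own controls, are combined into a
    single interface, and each entry remembers its owning device."""
    def __init__(self):
        self.controls = []   # drawn in list order; later entries sit on top

    def add_controls(self, device_id, controls):
        # In a real system each control would carry drawing instructions;
        # here each is just (control id, display rectangle).
        for control_id, rect in controls:
            self.controls.append({"id": control_id,
                                  "device": device_id,
                                  "rect": rect})

# Recreating the FIG. 15 scenario with made-up rectangles:
tv = ProjectionInterface()
tv.add_controls("phone", [("icon_1512", (0, 0, 64, 64)),
                          ("message_1513", (70, 0, 400, 64))])
tv.add_controls("watch", [("heart_rate_1521", (0, 100, 200, 80)),
                          ("calories_1522", (0, 200, 200, 80))])
tv.add_controls("tv", [("control_1503", (500, 100, 200, 80))])
```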
Further, the user may input touch operations on the corresponding controls in the projection interface 1504. If the user inputs a touch operation on a first target control in the projection interface 1504, the mobile phone (that is, the first source device) can be controlled to implement the corresponding function. If the user inputs a touch operation on a second target control in the projection interface 1504, the smart watch (that is, the second source device) can be controlled to implement the corresponding function.
For example, as shown in FIG. 16, after detecting that the user taps the message content 1513 in the projection interface 1504, the smart TV (that is, the destination device) may generate a first touch event containing a touch point Q1. Since the message content 1513 belongs to the first target controls projected by the mobile phone (that is, the first source device), the smart TV may convert the touch point Q1 in the projection interface 1504 into a touch point Q1' in the lock screen interface 1501 according to the positional relationship of the message content 1513 before and after screen projection recorded in configuration file 1. The smart TV may then send a second touch event containing the touch point Q1' to the mobile phone, so that the mobile phone can expand the message content 1513 in response to the second touch event with touch point Q1'.
For example, as shown in FIG. 17, after detecting that the user taps the heart rate information 1521 in the projection interface 1504, the smart TV (that is, the destination device) may generate a first touch event containing a touch point Q2. Since the heart rate information 1521 belongs to the second target controls projected by the smart watch (that is, the second source device), the smart TV may convert the touch point Q2 in the projection interface 1504 into a touch point Q2' in the detection interface 1502 according to the positional relationship of the heart rate information 1521 before and after screen projection recorded in configuration file 2. The smart TV may then send a second touch event containing the touch point Q2' to the smart watch, so that the smart watch can display the details of the heart rate information 1521 in response to the second touch event with touch point Q2'.
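The routing decision running through the Q1 and Q2 examples — finding which projected control a touch landed on, looking up its owning source device, and translating the coordinates — can be sketched as below. This is a simplification under assumed names (`dispatch_touch`) and assumes axis-aligned control rectangles; it is not the patented implementation:

```python
def dispatch_touch(x, y, controls):
    """controls: list of dicts, topmost first, each recording the control's
    rectangle in the projection interface ('proj_rect'), its rectangle in
    the owning source interface ('src_rect'), and the owning device."""
    for c in controls:
        px, py, pw, ph = c["proj_rect"]
        if px <= x <= px + pw and py <= y <= py + ph:
            sx, sy, sw, sh = c["src_rect"]
            # Linearly map the point into the control's original rectangle.
            mapped = (sx + (x - px) / pw * sw,
                      sy + (y - py) / ph * sh)
            return c["device"], mapped   # send the mapped point to this device
    return None, None                    # touch fell on the destination's own UI

# The Q1 example: message content 1513 was projected by the phone
# (rectangles are made up for illustration).
device, q1_prime = dispatch_touch(
    520, 130,
    [{"proj_rect": (500, 100, 200, 60),
      "src_rect": (20, 300, 400, 120),
      "device": "phone"}])
```

Iterating topmost-first also gives the overlap behavior described elsewhere in this application: when a point falls inside two controls, the one on the top layer wins.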
It can be seen that when multiple source devices simultaneously project the display content of their display interfaces onto the same destination device, the user can input touch operations on the destination device on controls projected from different source devices, thereby controlling the corresponding source device to implement the control function corresponding to the touch operation and improving the user's touch experience in screen projection scenarios.
It should be noted that the above embodiments only illustrate, by way of example, the application scenario in which a touch operation is input on the destination device to reversely control the source device. It can be understood that the above touch method during screen projection may also be applied in other scenarios, and the embodiments of this application do not impose any restriction on this.
For example, when a video conference is held, the electronic device in one conference site may serve as the destination device, and the electronic devices in the other conference sites may serve as source devices. Each source device may project its target controls to the destination device for display according to the method above. The user may then input a corresponding control operation on a target control in the destination device, thereby controlling the corresponding source device to respond to the control operation and implement reverse control during screen projection.
For another example, students may install a tutoring APP on their mobile phones, computers, or tablets. When a student answers questions in the tutoring APP, the student's electronic device may serve as a source device and project the display content of the answer area to the teacher's mobile phone, computer, or tablet for display. The teacher can then not only preview in real time the answering process of multiple students in their respective answer areas, but also remotely control a student's source device from the teacher's own electronic device to tutor the student online, improving the teaching effect of the tutoring APP.
An embodiment of this application discloses an electronic device, comprising a processor, and a memory, an input device, an output device, and a communication module connected to the processor. The input device and the output device may be integrated into one device; for example, a touch sensor may serve as the input device, a display screen may serve as the output device, and the touch sensor and the display screen may be integrated into a touch screen.
In this case, as shown in FIG. 18, the electronic device may include: a touch screen 1801, the touch screen 1801 comprising a touch sensor 1806 and a display screen 1807; one or more processors 1802; a memory 1803; a communication module 1808; one or more application programs (not shown); and one or more computer programs 1804, where the above components may be connected through one or more communication buses 1805. The one or more computer programs 1804 are stored in the memory 1803 and are configured to be executed by the one or more processors 1802; the one or more computer programs 1804 include instructions, and the instructions may be used to perform the steps in the above embodiments. For all related content of the steps involved in the above method embodiments, reference may be made to the functional descriptions of the corresponding physical components, which are not repeated here.
For example, the processor 1802 may specifically be the processor 110 shown in FIG. 3, the memory 1803 may specifically be the internal memory 121 and/or the external memory 120 shown in FIG. 3, the display screen 1807 may specifically be the display screen 194 shown in FIG. 3, the touch sensor 1806 may specifically be the touch sensor in the sensor module 180 shown in FIG. 3, and the communication module 1808 may specifically be the mobile communication module 150 and/or the wireless communication module 160 shown in FIG. 3; the embodiments of this application do not impose any restriction on this.
Through the description of the above implementations, those skilled in the art can clearly understand that, for convenience and brevity of description, only the division into the above functional modules is used as an example for illustration. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. For the specific working processes of the system, apparatus, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
The functional units in the embodiments of this application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The above descriptions are only specific implementations of the embodiments of this application, but the protection scope of the embodiments of this application is not limited thereto. Any variation or replacement within the technical scope disclosed in the embodiments of this application shall fall within the protection scope of the embodiments of this application. Therefore, the protection scope of the embodiments of this application shall be subject to the protection scope of the claims.

Claims (20)

  1. A touch method in a screen projection scenario, characterized by comprising:
    displaying, by a source device, a first display interface;
    in response to a screen projection instruction input by a user, projecting, by the source device, N controls in the first display interface into a projection interface displayed by a first destination device, N being an integer greater than 0;
    receiving, by the source device, a first touch event sent by the first destination device; and
    executing, by the source device, an operation instruction corresponding to the first touch event.
  2. The method according to claim 1, characterized in that, after the receiving, by the source device, of the first touch event sent by the first destination device, the method further comprises:
    determining, by the source device, a target control corresponding to the first touch event, the target control being one of the N controls;
    wherein the operation instruction is the operation instruction corresponding to the target control when the target control is triggered on the source device.
  3. The method according to claim 1, characterized in that the first touch event is: a touch event obtained by the first destination device, after generating a fifth touch event in response to a touch operation input by the user in the projection interface, by mapping the fifth touch event to a touch event in the first display interface.
  4. The method according to claim 2, characterized in that the first touch event is: a touch event generated by the first destination device when the user inputs a first touch operation in the projection interface.
  5. The method according to claim 4, characterized in that the source device stores a configuration file corresponding to the first display interface, the configuration file recording display positions of the N controls in the first display interface and display positions of the N controls in the projection interface;
    wherein the determining, by the source device, of the target control corresponding to the first touch event comprises:
    determining, by the source device, the target control corresponding to the first touch event according to the display positions of the N controls in the projection interface recorded in the configuration file.
  6. The method according to claim 5, characterized in that the first touch event comprises a first coordinate of the first touch operation in the projection interface;
    wherein the determining, by the source device, of the target control corresponding to the first touch event according to the display positions of the N controls in the projection interface recorded in the configuration file comprises:
    when the first coordinate falls within the display position of a first control recorded in the configuration file, determining, by the source device, the first control as the target control.
  7. The method according to claim 5, characterized in that the first touch event comprises a first coordinate of the first touch operation in the projection interface;
    wherein the determining, by the source device, of the target control corresponding to the first touch event according to the display positions of the N controls in the projection interface recorded in the configuration file comprises:
    when the first coordinate falls both within the display position of a first control and within the display position of a second control recorded in the configuration file, determining, by the source device, the first control, which is located at the top layer, as the target control.
  8. The method according to any one of claims 5-7, characterized in that, after the source device determines the target control corresponding to the first touch event, the method further comprises:
    mapping, by the source device according to the configuration file, the first touch event to a second touch event, the second touch event being: the touch event that the source device would generate when the user inputs a second touch operation on the target control in the first display interface;
    wherein the executing, by the source device, of the operation instruction corresponding to the first touch event comprises:
    reporting, by the source device, the second touch event to a first application, the first display interface being an interface of the first application.
  9. The method according to claim 8, characterized in that the mapping, by the source device according to the configuration file, of the first touch event to the second touch event comprises:
    converting, by the source device, the first coordinate in the first touch event into a second coordinate according to the correspondence, recorded in the configuration file, between a first display position of the target control in the first display interface and a second display position of the target control in the projection interface, to obtain the second touch event.
  10. The method according to any one of claims 5-6, characterized in that the executing, by the source device, of the operation instruction corresponding to the first touch event comprises:
    reporting, by the source device, an identifier of the target control and an event type of the first touch event to a first application, so that the first application executes a first function, the first function being the function of the first application corresponding to the case in which the target control is triggered by the operation indicated by the event type, the first display interface being an interface of the first application.
  11. The method according to any one of claims 5-6, characterized in that, after the source device determines the target control corresponding to the first touch event, the method further comprises:
    generating, by the source device, a third touch event according to a first display position of the target control in the first display interface recorded in the configuration file, an event type of the third touch event being the same as an event type of the first touch event, and a third coordinate in the third touch event being located within the first display position;
    wherein the executing, by the source device, of the operation instruction corresponding to the first touch event comprises:
    reporting, by the source device, the third touch event to a first application, the first display interface being an interface of the first application.
  12. The method according to any one of claims 1-11, characterized in that, after the source device displays the first display interface, the method further comprises:
    in response to a second screen projection instruction input by the user, projecting, by the source device, M controls in the first display interface to a second destination device for display, M being an integer greater than 0;
    receiving, by the source device, a fourth touch event sent by the second destination device; and
    executing, by the source device, an operation instruction corresponding to the fourth touch event.
  13. A touch method in a screen projection scenario, characterized by comprising:
    receiving, by a destination device, a first message sent by a first source device, the first message comprising drawing instructions of a first target control, the first target control being one or more controls in a first display interface displayed by the first source device;
    invoking, by the destination device, the drawing instructions of the first target control to draw a projection interface, the projection interface comprising the first target control;
    in response to a first touch operation input by a user on the first target control in the projection interface, generating, by the destination device, a first touch event; and
    instructing, by the destination device, the first source device to execute an operation instruction corresponding to the first touch event.
  14. The method according to claim 13, characterized in that the instructing, by the destination device, of the first source device to execute the operation instruction corresponding to the first touch event comprises:
    sending, by the destination device, the first touch event to the first source device, so that the first source device executes the operation instruction corresponding to the first touch event.
  15. The method according to claim 13, characterized in that, after the destination device generates the first touch event, the method further comprises:
    mapping, by the destination device, the first touch event to a second touch event, the second touch event being: the touch event that the first source device would generate when the user inputs a second touch operation on the first target control in the first display interface;
    wherein the instructing, by the destination device, of the first source device to execute the operation instruction corresponding to the first touch event comprises:
    sending, by the destination device, the second touch event to the first source device, so that the first source device executes an operation instruction corresponding to the second touch event.
  16. The method according to any one of claims 13-15, characterized in that the method further comprises:
    receiving, by the destination device, a second message sent by a second source device, the second message comprising drawing instructions of a second target control, the second target control being one or more controls in a second display interface displayed by the second source device;
    invoking, by the destination device, the drawing instructions of the second target control to draw the second target control in the projection interface;
    in response to a third touch operation input by the user on the second target control in the projection interface, generating, by the destination device, a third touch event; and
    instructing, by the destination device, the second source device to execute an operation instruction corresponding to the third touch event.
  17. An electronic device, characterized by comprising:
    a touch screen, the touch screen comprising a touch sensor and a display screen;
    a communication module;
    one or more processors;
    one or more memories;
    and one or more computer programs, wherein the one or more computer programs are stored in the one or more memories, the one or more computer programs comprising instructions which, when executed by the electronic device, cause the electronic device to perform the touch method in a screen projection scenario according to any one of claims 1-12 or claims 13-16.
  18. A computer-readable storage medium storing instructions, characterized in that, when the instructions are run on an electronic device, the electronic device is caused to perform the touch method in a screen projection scenario according to any one of claims 1-12 or claims 13-16.
  19. A computer program product containing instructions, characterized in that, when the computer program product runs on an electronic device, the electronic device is caused to perform the touch method in a screen projection scenario according to any one of claims 1-12 or claims 13-16.
  20. A touch system in a screen projection scenario, characterized in that the system comprises at least one source device and at least one destination device, wherein the source device is configured to perform the touch method in a screen projection scenario according to any one of claims 1-12, and the destination device is configured to perform the touch method in a screen projection scenario according to any one of claims 13-16.
PCT/CN2020/093908 2019-06-05 2020-06-02 Method for touch control in screen casting scenario, and electronic apparatus WO2020244500A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910487623.9 2019-06-05
CN201910487623.9A CN110377250B (en) 2019-06-05 2019-06-05 Touch method in screen projection scene and electronic equipment

Publications (1)

Publication Number Publication Date
WO2020244500A1 true WO2020244500A1 (en) 2020-12-10

Family

ID=68249812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/093908 WO2020244500A1 (en) 2019-06-05 2020-06-02 Method for touch control in screen casting scenario, and electronic apparatus

Country Status (2)

Country Link
CN (1) CN110377250B (en)
WO (1) WO2020244500A1 (en)

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112231025B (en) 2019-03-06 2022-12-06 华为终端有限公司 UI component display method and electronic equipment
CN110381195A (en) 2019-06-05 2019-10-25 华为技术有限公司 A kind of throwing screen display methods and electronic equipment
CN110377250B (en) * 2019-06-05 2021-07-16 华为技术有限公司 Touch method in screen projection scene and electronic equipment
CN112995727A (en) * 2019-12-17 2021-06-18 华为技术有限公司 Multi-screen coordination method and system and electronic equipment
CN113014614A (en) * 2019-12-20 2021-06-22 青岛海信移动通信技术股份有限公司 Equipment control method, control equipment and controlled equipment
CN114157756A (en) * 2020-08-20 2022-03-08 华为技术有限公司 Task processing method and related electronic equipment
CN111399789B (en) * 2020-02-20 2021-11-19 华为技术有限公司 Interface layout method, device and system
CN111414097A (en) * 2020-03-23 2020-07-14 维沃移动通信有限公司 Interaction method, interaction system and display equipment
CN111880870A (en) * 2020-06-19 2020-11-03 维沃移动通信有限公司 Method and device for controlling electronic equipment and electronic equipment
CN111970546A (en) * 2020-07-21 2020-11-20 腾讯科技(深圳)有限公司 Method and device for controlling terminal interaction, electronic equipment and storage medium
CN114071207B (en) * 2020-07-30 2023-03-24 华为技术有限公司 Method and device for controlling display of large-screen equipment, large-screen equipment and storage medium
CN112035048B (en) * 2020-08-14 2022-03-25 广州视源电子科技股份有限公司 Touch data processing method, device, equipment and storage medium
CN114168235A (en) * 2020-08-20 2022-03-11 华为技术有限公司 Function switching entry determining method and electronic equipment
CN114079809A (en) * 2020-08-20 2022-02-22 华为技术有限公司 Terminal and input method and device thereof
CN114185503B (en) * 2020-08-24 2023-09-08 荣耀终端有限公司 Multi-screen interaction system, method, device and medium
WO2022042162A1 (en) * 2020-08-25 2022-03-03 华为技术有限公司 Method and apparatus for implementing user interface
CN112134788B (en) * 2020-09-18 2023-06-06 Oppo广东移动通信有限公司 Event processing method, device, storage medium, mobile terminal and computer
CN114205546B (en) * 2020-09-18 2023-05-05 华为终端有限公司 Equipment control system
CN112130475B (en) * 2020-09-22 2022-08-19 北京字节跳动网络技术有限公司 Equipment control method, device, terminal and storage medium
CN112328195B (en) * 2020-10-10 2023-10-24 当趣网络科技(杭州)有限公司 Screen projection control method, system, electronic equipment and medium
CN114500725B (en) * 2020-11-13 2023-06-27 华为技术有限公司 Target content transmission method, master device, slave device, and storage medium
CN112269527B (en) * 2020-11-16 2022-07-08 Oppo广东移动通信有限公司 Application interface generation method and related device
CN112394895B (en) * 2020-11-16 2023-10-13 Oppo广东移动通信有限公司 Picture cross-device display method and device and electronic device
CN112468863A (en) * 2020-11-24 2021-03-09 北京字节跳动网络技术有限公司 Screen projection control method and device and electronic device
CN114584828B (en) * 2020-11-30 2024-05-17 上海新微技术研发中心有限公司 Android screen-throwing method, computer readable storage medium and equipment
CN112527152B (en) * 2020-12-18 2023-01-06 Oppo(重庆)智能科技有限公司 Touch area control method and device, touch system and electronic equipment
CN112684993A (en) * 2020-12-23 2021-04-20 北京小米移动软件有限公司 Display method, device and medium based on cross-screen cooperation
CN114741039B (en) * 2020-12-24 2023-09-08 华为技术有限公司 Equipment control method and terminal equipment
CN115145515A (en) * 2021-03-31 2022-10-04 华为技术有限公司 Screen projection method and related device
CN113093977A (en) * 2021-04-12 2021-07-09 Tcl通讯(宁波)有限公司 Setting method and device of mobile terminal watch, intelligent terminal and storage medium
CN115328565A (en) * 2021-04-25 2022-11-11 华为技术有限公司 Function skipping method and electronic equipment
CN113360116A (en) * 2021-06-25 2021-09-07 阿波罗智联(北京)科技有限公司 Method, device and equipment for controlling terminal and storage medium
CN113531423A (en) * 2021-07-13 2021-10-22 读书郎教育科技有限公司 Interactive intelligent projection table lamp and method
CN113590248A (en) * 2021-07-22 2021-11-02 上汽通用五菱汽车股份有限公司 Screen projection method and device of vehicle-mounted terminal and readable storage medium
CN115756268A (en) * 2021-09-03 2023-03-07 华为技术有限公司 Cross-device interaction method and device, screen projection system and terminal
CN115016697A (en) * 2021-09-08 2022-09-06 荣耀终端有限公司 Screen projection method, computer device, readable storage medium, and program product
CN114040242B (en) * 2021-09-30 2023-07-07 荣耀终端有限公司 Screen projection method, electronic equipment and storage medium
CN114138167A (en) * 2021-12-08 2022-03-04 武汉卡比特信息有限公司 Touch pad system and method for mobile phone interconnection split screen projection
CN115016714B (en) * 2021-12-15 2023-04-07 荣耀终端有限公司 Electronic device control method, system, electronic device and storage medium
CN114442985A (en) * 2022-01-30 2022-05-06 深圳创维-Rgb电子有限公司 Screen projection transmitter and receiver, electronic equipment, screen projection system and method
CN114461124B (en) * 2022-01-30 2023-03-21 深圳创维-Rgb电子有限公司 Screen projection control method and device, screen projector and computer readable storage medium
CN115174988B (en) * 2022-06-24 2024-04-30 长沙联远电子科技有限公司 Audio and video screen projection control method based on DLNA
CN115499693A (en) * 2022-08-09 2022-12-20 深圳市酷开网络科技股份有限公司 Multi-screen different display control method, device and system, storage medium and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130328770A1 (en) * 2010-02-23 2013-12-12 Muv Interactive Ltd. System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
CN106095084A (en) * 2016-06-06 2016-11-09 乐视控股(北京)有限公司 Screen projection method and device
CN106502604A (en) * 2016-09-28 2017-03-15 北京小米移动软件有限公司 Screen projection switching method and device
CN106897038A (en) * 2015-12-17 2017-06-27 北京传送科技有限公司 Screen projection system
CN107071551A (en) * 2017-04-26 2017-08-18 四川长虹电器股份有限公司 Applied to the multi-screen interactive screen response method in intelligent television system
CN110377250A (en) * 2019-06-05 2019-10-25 华为技术有限公司 Touch control method in screen projection scenario, and electronic device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6387641B2 (en) * 2014-01-15 2018-09-12 セイコーエプソン株式会社 Projector, display device, display system, and display device control method
CN104978156B (en) * 2014-04-02 2021-10-22 联想(北京)有限公司 Multi-screen display method and multi-screen display processing device
JP6721951B2 (en) * 2015-07-03 2020-07-15 シャープ株式会社 Image display device, image display control method, and image display system
CN108736981A (en) * 2017-04-19 2018-11-02 阿里巴巴集团控股有限公司 It is a kind of wirelessly to throw screen method, apparatus and system
CN109508162B (en) * 2018-10-12 2021-08-13 福建星网视易信息系统有限公司 Screen projection display method, system and storage medium

Also Published As

Publication number Publication date
CN110377250A (en) 2019-10-25
CN110377250B (en) 2021-07-16

Similar Documents

Publication Publication Date Title
WO2020244500A1 (en) Method for touch control in screen casting scenario, and electronic apparatus
WO2020244492A1 (en) Screen projection display method and electronic device
WO2020244495A1 (en) Screen projection display method and electronic device
US11922005B2 (en) Screen capture method and related device
US11722449B2 (en) Notification message preview method and electronic device
WO2020244497A1 (en) Display method for flexible screen and electronic device
US11385857B2 (en) Method for displaying UI component and electronic device
WO2021115194A1 (en) Application icon display method and electronic device
JP2022549157A (en) DATA TRANSMISSION METHOD AND RELATED DEVICE
WO2020192456A1 (en) Voice interaction method and electronic device
WO2021032097A1 (en) Air gesture interaction method and electronic device
WO2020155014A1 (en) Smart home device sharing system and method, and electronic device
WO2021121052A1 (en) Multi-screen cooperation method and system, and electronic device
WO2024016559A1 (en) Multi-device cooperation method, electronic device and related product
CN116360725B (en) Display interaction system, display method and device
WO2022078295A1 (en) Device recommendation method and electronic device
WO2020155875A1 (en) Display method for electronic device, graphic user interface and electronic device
WO2023030099A1 (en) Cross-device interaction method and apparatus, and screen projection system and terminal
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
WO2021042881A1 (en) Message notification method and electronic device
WO2022188632A1 (en) Theme display method and apparatus, terminal, and computer storage medium
WO2023169237A1 (en) Screen capture method, electronic device, and system
CN116095412B (en) Video processing method and electronic equipment
WO2024022288A1 (en) Method for installing smart device, and electronic device
WO2021013246A1 (en) Wireless access point deployment method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20819274

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20819274

Country of ref document: EP

Kind code of ref document: A1