WO2020216156A1 - Screen projection method and computing device - Google Patents

Screen projection method and computing device

Info

Publication number
WO2020216156A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
screen
content
touch
touch object
Prior art date
Application number
PCT/CN2020/085499
Other languages
English (en)
Chinese (zh)
Inventor
徐致欣
许浩维
周明
王同波
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2020216156A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/041 Indexing scheme relating to G06F 3/041 - G06F 3/045
    • G06F 2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means

Definitions

  • This application relates to the field of communication technologies, and in particular, to a screen projection method and a computing device.
  • At present, the video, picture, and/or audio content on a mobile phone is usually projected to other electronic devices, such as televisions and projectors, through a projection device. In this way, users can more conveniently display the content of the mobile phone on other electronic devices, which provides a better viewing experience.
  • This application provides a screen projection method and computing device to solve the following problem: after the content on a mobile phone is projected to another electronic device, the user's visual range while looking up is limited to that electronic device, yet to operate the screen content the user still needs to lower their head to operate the mobile phone screen, which is inconvenient and results in a poor user experience.
  • In a first aspect, the present application provides a screen projection method. In this method, the first electronic device detects a touch object hovering over its screen and judges whether the touch object meets a response condition, where the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition. In response to determining that the touch object meets the response condition, the position of the touch object on the screen of the first electronic device is determined, and the position information and the content to be projected are projected onto the second electronic device, triggering the second electronic device to display the content to be projected and a projection point indicating the position. In this way, when looking up at the second electronic device, the user can determine the position of the touch object on the screen of the first electronic device from the projection point displayed on the second electronic device. If the user then needs to operate the projected content, the corresponding operation can be performed accurately on the first electronic device based on the determined position, without looking down at the screen of the first electronic device, which is convenient to use and improves the user experience.
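The hover-detection flow described above can be sketched in a few lines. This is an illustrative sketch only, not the patent's implementation; the names (`HoverEvent`, `RESPONSE_DISTANCE_MM`, `build_projection_payload`) and the 15 mm distance value are assumptions.

```python
from dataclasses import dataclass

RESPONSE_DISTANCE_MM = 15.0  # preset distance condition (assumed value)

@dataclass
class HoverEvent:
    x: float      # position on the first device's screen, in pixels
    y: float
    z_mm: float   # height of the touch object above the screen

def meets_response_condition(event: HoverEvent) -> bool:
    """Response condition: the touch object hovers close enough to the screen."""
    return 0.0 < event.z_mm <= RESPONSE_DISTANCE_MM

def build_projection_payload(event: HoverEvent, content: bytes) -> dict:
    """Bundle the touch position with the content to be projected."""
    if not meets_response_condition(event):
        raise ValueError("touch object does not meet the response condition")
    return {"position": (event.x, event.y), "content": content}
```

The payload would then be sent to the second electronic device, which renders the projected content together with a marker at `position`.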
  • In a possible design, projecting the position information and the content to be projected onto the second electronic device includes projecting them through the 60 GHz wireless communication spectrum. The wireless transmission speed of the 60 GHz wireless communication spectrum is much higher than that of Bluetooth technology, so using it for file transmission and data synchronization between devices reduces the information transmission delay.
  • In a possible design, after the position information and the content to be projected are projected onto the second electronic device, the method further includes: acquiring a touch operation of the touch object on the screen of the first electronic device; determining the new content to be projected corresponding to the response result of the touch operation, and projecting the new content to be projected to the second electronic device for display. In this case, the second electronic device displays the content to be projected and the projection point indicating the position; if the user needs to perform a corresponding operation, the position of the touch object can be determined from the projection point, so that the touch operation can be performed on the screen of the first electronic device based on that position without looking down at the screen, which is convenient to use.
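For the projection point to indicate the touch position correctly, the coordinates reported on the first device's screen have to be mapped to the second device's display resolution. The patent does not specify a mapping; a simple proportional scaling, as a hypothetical sketch:

```python
def map_point(pos, src_size, dst_size):
    """Scale a touch position from the source screen to the target display.

    pos: (x, y) on the first device's screen, in pixels.
    src_size / dst_size: (width, height) of source screen and target display.
    """
    sx, sy = src_size
    dx, dy = dst_size
    x, y = pos
    return (x * dx / sx, y * dy / sy)
```

For example, a hover at the center of a 1080x2400 phone screen maps to the center of a 1920x1080 television panel.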
  • In a possible design, before the position information and the content to be projected are projected onto the second electronic device, the method further includes: compressing the position information and the content to be projected. Compression reduces the amount of storage required to represent the original content to be projected, facilitating the transmission and storage of the data.
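The compression step can be sketched minimally with `zlib`; the patent does not name a specific codec, and the payload framing below (a JSON header separated from the content by a null byte) is purely an illustrative assumption.

```python
import json
import zlib

def compress_payload(position, content_bytes):
    """Serialize position info, append the raw content, and compress."""
    blob = json.dumps({"position": position}).encode() + b"\x00" + content_bytes
    return zlib.compress(blob)

def decompress_payload(data):
    """Inverse of compress_payload, run on the receiving device."""
    blob = zlib.decompress(data)
    header, _, content = blob.partition(b"\x00")
    return json.loads(header)["position"], content
```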
  • In a possible design, the response condition further includes that the time for which the touch object hovers over the screen of the first electronic device is greater than a time threshold. The subsequent operations are performed only when the hover time exceeds the time threshold, which reduces the probability of accidental touches and suits practical applications.
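The time-threshold condition is essentially a debounce on the hover signal. A hypothetical sketch (the `HoverTracker` name and the 0.3 s threshold are assumptions, not values from the patent):

```python
HOVER_TIME_THRESHOLD_S = 0.3  # assumed time threshold

class HoverTracker:
    """Tracks how long a touch object has continuously hovered."""

    def __init__(self):
        self.hover_start = None

    def update(self, hovering: bool, now: float) -> bool:
        """Return True once the hover has lasted longer than the threshold."""
        if not hovering:
            self.hover_start = None   # hover broken: reset the timer
            return False
        if self.hover_start is None:
            self.hover_start = now    # hover just began
        return (now - self.hover_start) > HOVER_TIME_THRESHOLD_S
```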
  • In a possible design, the screen of the first electronic device includes a self-capacitance touch sensor. The screen of the first electronic device uses a self-capacitance touch sensor to implement touch sensing; a self-capacitance touch sensor can detect hovering (floating) touch above the touch screen, meeting the application requirements of floating touch.
  • In a second aspect, this application provides another screen projection method, including: acquiring the touch operation of the touch object on the screen of the first electronic device; determining the new content to be projected corresponding to the response result of the touch operation, and projecting the new content to be projected to the second electronic device for display.
  • In a third aspect, this application provides a screen projection apparatus, which includes:
  • a detection module, configured to detect a touch object hovering over the screen of the first electronic device;
  • a judgment module, configured to judge whether the touch object meets a response condition, where the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition;
  • a determination module, configured to determine the position of the touch object on the screen of the first electronic device;
  • a projection module, configured to, in response to determining that the touch object meets the response condition, project the position information and the content to be projected onto the second electronic device, triggering the second electronic device to display the content to be projected and a projection point indicating the position.
  • In a possible design, the projection module is specifically configured to project the position information and the content to be projected onto the second electronic device through the 60 GHz wireless communication spectrum. After the projection module projects the position information and the content to be projected onto the second electronic device, it is further configured to: acquire the touch operation of the touch object on the screen of the first electronic device; determine the new content to be projected corresponding to the response result of the touch operation, and project the new content to be projected to the second electronic device for display.
  • In a possible design, the apparatus further includes a compression module, configured to compress the position information and the content to be projected before the projection module projects them onto the second electronic device.
  • In a possible design, the response condition further includes that the time for which the touch object hovers over the screen of the first electronic device is greater than a time threshold.
  • In a possible design, the screen of the first electronic device includes a self-capacitance touch sensor.
  • In a fourth aspect, this application provides another screen projection apparatus, which includes:
  • a startup module, configured to start a target game application;
  • a search module, configured to search for projection devices and establish a projection connection with the second electronic device that is found;
  • a detection module, configured to enable the floating-touch function and detect a touch object hovering over the screen of the first electronic device;
  • a judgment module, configured to judge whether the touch object meets a response condition, where the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition;
  • a determination module, configured to determine the position of the touch object on the screen of the first electronic device;
  • a compression module, configured to compress the position information and the content to be projected in response to determining that the touch object meets the response condition, where the content to be projected is determined according to the target game application;
  • a projection module, configured to project the processed position information and content to be projected onto the second electronic device through the 60 GHz wireless communication spectrum, triggering the second electronic device to display the content to be projected and a projection point indicating the position; acquire the touch operation of the touch object on the screen of the first electronic device; determine the new content to be projected corresponding to the response result of the touch operation, and project the new content to be projected to the second electronic device for display.
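The core of the modules above (detect, judge, compress, project) can be wired together as a single pass over hover samples. Everything in this sketch is hypothetical: the function name, the 15 mm distance value, and the payload framing are assumptions, and transmission over the 60 GHz link is stubbed out as a `send` callable.

```python
import zlib

def run_projection_pipeline(events, frame: bytes, send):
    """events: iterable of (x, y, z_mm) hover samples from the touch sensor.
    frame: content to be projected (e.g. a rendered game frame).
    send:  callable that transmits bytes to the second electronic device."""
    for x, y, z_mm in events:
        if 0.0 < z_mm <= 15.0:                       # response condition (assumed 15 mm)
            payload = f"{x},{y}|".encode() + frame   # position info + content
            send(zlib.compress(payload))             # compress, then transmit
            return (x, y)                            # position used for the projection point
    return None
```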
  • In another aspect, the present application provides a computing device, which includes a processor and a memory. The memory stores computer instructions; the processor executes the computer instructions stored in the memory, so that the computing device performs the method provided in the foregoing first aspect or any of its possible designs, and so that the computing device deploys the screen projection apparatus provided in the foregoing third aspect or any of its possible designs.
  • In another aspect, the present application provides a computing device, which includes a processor and a memory. The memory stores computer instructions; the processor executes the computer instructions stored in the memory, so that the computing device performs the method provided in the foregoing second aspect and deploys the screen projection apparatus provided in the foregoing fourth aspect.
  • In another aspect, the present application provides a computer-readable storage medium storing computer instructions. The computer instructions instruct the computing device to perform the method provided in the foregoing first aspect or any of its possible designs, or instruct the computing device to deploy the screen projection apparatus provided in the foregoing third aspect or any of its possible designs.
  • In another aspect, the present application provides a computer-readable storage medium storing computer instructions. The computer instructions instruct the computing device to perform the method provided in the foregoing second aspect, or instruct the computing device to deploy the screen projection apparatus provided in the foregoing fourth aspect.
  • In another aspect, the present application provides a computer program product including computer instructions. When the computer program product runs on an electronic device, the electronic device is caused to perform the method provided in the foregoing first aspect or any of its possible designs, and to deploy the screen projection apparatus provided in the foregoing third aspect or any of its possible designs.
  • In another aspect, the present application provides a computer program product including computer instructions. When the computer program product runs on an electronic device, the electronic device is caused to perform the method provided in the foregoing second aspect and to deploy the screen projection apparatus provided in the foregoing fourth aspect.
  • FIG. 1 is a schematic diagram of the architecture of a screen projection system provided by an embodiment of the application
  • FIG. 2 is a schematic structural diagram of a first electronic device provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of an application scenario provided by an embodiment of the application.
  • FIG. 4 is a schematic flowchart of a screen projection method provided by an embodiment of the application.
  • FIG. 5 is a schematic flowchart of another screen projection method provided by an embodiment of the application.
  • FIG. 6 is a schematic diagram of touch points on the screen of an electronic device according to an embodiment of the application.
  • FIG. 7 is a schematic flowchart of another screen projection method provided by an embodiment of the application.
  • FIG. 8 is a schematic structural diagram of a screen projection device provided by this application.
  • FIG. 9 is a schematic structural diagram of another screen projection device provided by this application.
  • FIG. 10 is a schematic structural diagram of still another screen projection device provided by this application.
  • FIG. 11 is a schematic diagram of the basic hardware architecture of a computing device provided by this application.
  • FIG. 12 is a schematic diagram of the basic hardware architecture of another computing device provided by this application.
  • In the description of this application, the terms "first" and "second" are used only for descriptive purposes and should not be understood as indicating or implying relative importance or implicitly specifying the number of technical features indicated. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • the screen projection referred to in this application refers to the transmission of content (such as audio, video, picture, text, etc.) on an electronic device to another electronic device for presentation, achieving the effect of simultaneously displaying the same content among multiple electronic devices.
  • The projection involved in this application may include wired projection and wireless projection. Wired projection can establish connections between multiple electronic devices through a high-definition multimedia interface (HDMI) and transmit media data over HDMI cables. Wireless projection can establish a connection between multiple electronic devices through the Miracast protocol and transmit media data over Wi-Fi.
  • The projection system of the present application includes at least two electronic devices and one projection port, where the projection port may include a wired port and/or a wireless port. The wired port can be an HDMI port; the wireless port can be an application programming interface (API).
  • FIG. 1 is a schematic structural diagram of a screen projection system provided by an embodiment of the present application.
  • the projection system includes a first electronic device 100, a second electronic device 200, a first wired port 101, a first wireless port 102, a second wired port 201, and a second wireless port 202.
  • the first wired port 101 and the first wireless port 102 may be integrated on the first electronic device 100 or may exist independently of the first electronic device 100.
  • the second wired port 201 and the second wireless port 202 may be integrated on the second electronic device 200, or may exist independently of the second electronic device 200, which is not limited in the embodiment of the present application.
  • the first electronic device 100 and the second electronic device 200 can establish a screen projection connection through the aforementioned ports.
  • the first electronic device 100 has at least a screen projection capability.
  • the second electronic device 200 has at least a projection receiving capability, an image display capability, and a sound output capability.
  • the first electronic device 100 may be an electronic device such as a mobile phone, a tablet computer, a personal digital assistant (PDA) or a desktop computer.
  • the second electronic device 200 may be an electronic device such as a TV, a tablet computer, or a desktop computer.
  • The screen projection method of the present application can be applied to the first electronic device 100. The first electronic device 100 detects a touch object hovering over its screen and determines whether the touch object meets a response condition, where the response condition includes that the distance between the touch object and the touch screen of the first electronic device meets a preset distance condition. In response to determining that the touch object meets the response condition, the position information and the content to be projected are projected onto the second electronic device 200, triggering the second electronic device 200 to display the content to be projected and a projection point indicating the position, so that the user can locate the touch object from the projection point while looking up at the second electronic device 200.
  • FIG. 2 is a schematic structural diagram of a first electronic device according to an embodiment of the present application.
  • the first electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 171, a receiver 172, a microphone 173, an earphone interface 174, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the first electronic device 100.
  • the first electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently, as determined by the actual application scenario; there is no restriction here.
  • the components shown in Figure 2 can be implemented in hardware, software, or a combination of software and hardware.
  • the foregoing processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the first electronic device 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the digital signal processor (DSP) is used to process digital signals; in addition to digital image signals, it can also process other digital signals.
  • Video codecs are used to compress or decompress digital video.
  • the first electronic device 100 may support one or more video codecs. In this way, the first electronic device 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between human-brain neurons, it can quickly process input information and continuously learn by itself. Through the NPU, applications such as intelligent cognition of the first electronic device 100 can be realized, for example image recognition, face recognition, speech recognition, and text understanding.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the instructions stored in the memory are used by the first electronic device 100 to execute the screen projection method in the embodiment of the present application.
  • the data stored in the memory may include media data, and the media data may be audio and video data.
  • The memory in the processor 110 is a cache memory. It can store instructions or data that the processor 110 has just used or cycled through. If the processor 110 needs to use the instruction or data again, it can be called directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse-code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is merely illustrative and does not constitute a structural limitation of the first electronic device 100.
  • the first electronic device 100 may also adopt different interface connection manners in the above-mentioned embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive charging input from the charger and charge the power management module 141 of the first electronic device 100.
  • the charger can be a wireless charger or a wired charger.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the wireless communication function of the first electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the first electronic device 100 may use wireless communication functions to communicate with other devices.
  • the first electronic device 100 can communicate with the second electronic device 200, the first electronic device 100 establishes a projection connection with the second electronic device 200, and the first electronic device 100 outputs projection data to the second electronic device 200 and so on.
  • the projection data output by the first electronic device 100 may be audio and video data, pictures, text, and so on.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the first electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the first electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 171, a receiver 172, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the first electronic device 100, including wireless local area network (WLAN, such as a Wi-Fi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR).
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic wave radiation via the antenna 2.
  • the antenna 1 of the first electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the first electronic device 100 can communicate with the network and other devices (for example, the second electronic device 200).
  • Wireless communication technologies can include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the aforementioned GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the first electronic device 100 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • the display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLed, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (QLED), etc.
  • the first electronic device 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the display screen 194 may be used to display various interfaces output by the system of the first electronic device 100. For each interface output by the first electronic device 100, reference may be made to related descriptions in subsequent embodiments.
  • the first electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened and light is transmitted through the lens to the camera's photosensitive element, which converts the optical signal into an electrical signal and passes it to the ISP for processing, where it is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some feasible implementation manners, the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the first electronic device 100 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • the camera 193 is used to obtain a real image of the user.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the first electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the first electronic device 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, etc.) created during the use of the first electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the first electronic device 100 can implement audio functions through the audio module 170, the speaker 171, the receiver 172, the microphone 173, the earphone interface 174, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 may be used to play the sound corresponding to the video. For example, when the display screen 194 displays a video playback screen, the audio module 170 outputs the sound of the video playback.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the speaker 171, also called “speaker”, is used to convert audio electrical signals into sound signals.
  • the receiver 172, also called "earpiece", is used to convert audio electrical signals into sound signals.
  • the microphone 173, also called "mic", is used to convert sound signals into electrical signals.
  • the earphone interface 174 is used to connect wired earphones.
  • the earphone interface 174 may be a USB interface 130, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor may be provided on the display screen 194.
  • the gyroscope sensor may be used to determine the movement posture of the first electronic device 100.
  • the air pressure sensor is used to measure air pressure.
  • the acceleration sensor can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three-axis).
  • Distance sensor used to measure distance.
  • the ambient light sensor is used to sense the brightness of the ambient light.
  • the fingerprint sensor is used to collect fingerprints.
  • the temperature sensor is used to detect temperature.
  • the touch sensor, also called a "touch panel", may be provided on the display screen 194; the touch sensor and the display screen 194 together compose what is also called a "touch screen".
  • the touch sensor is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the first electronic device 100, which is different from the position of the display screen 194.
  • the button 190 includes a power button, a volume button, and so on.
  • the button 190 may be a mechanical button or a touch button.
  • the motor 191 can generate vibration prompts.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the first electronic device 100.
  • the first electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the first electronic device 100 and cannot be separated from the first electronic device 100.
  • users can display content from electronic devices such as mobile phones on electronic devices such as TVs, providing users with a better viewing experience.
  • with this projection method, however, the visual range of the projected content is limited to the TV.
  • if the user needs to operate the projected content, he still needs to lower his head and perform the corresponding operation on the phone, which is inconvenient to use and gives a poor user experience.
  • an embodiment of the present application provides a screen projection method in which the first electronic device detects a touch object suspended above its screen and determines whether the touch object meets a response condition, where:
  • the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition. The position of the touch object on the screen of the first electronic device is determined, and in response to determining that the touch object meets the response condition, the position information and the content to be projected are projected onto the second electronic device, triggering the second electronic device to display the content to be projected together with a projection point indicating the position, so that the user can look up at the second electronic device.
  • the user can determine the position of the touch object on the screen of the first electronic device from the projection point displayed on the second electronic device. If the user then needs to operate the projected content, he can accurately perform the corresponding operation on the first electronic device based on that position, without looking down at the screen of the first electronic device, which is convenient to use and improves the user experience.
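The decision flow above can be sketched as follows. This is an illustrative sketch only; the event type, the 50mm threshold, and the payload shape are assumptions, not values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class HoverEvent:
    x: float            # position of the touch object on the first device's screen
    y: float
    distance_mm: float  # hover height of the touch object above the screen

MAX_HOVER_MM = 50.0     # assumed preset distance condition

def handle_hover(event: HoverEvent, send_to_second_device) -> bool:
    """If the hovering touch object meets the response condition, project its
    position together with the content to be projected; otherwise do nothing."""
    if event.distance_mm > MAX_HOVER_MM:
        return False  # response condition not met; nothing is projected
    payload = {"position": (event.x, event.y), "content": "frame-data"}
    send_to_second_device(payload)  # second device displays content + projection point
    return True
```

In use, `send_to_second_device` would wrap the actual screen-projection transport; here it can be any callable that accepts the payload.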
  • Fig. 3 is a schematic diagram of a possible application scenario of an embodiment of the present application.
  • this preset distance range is preset and can be other values.
  • the projection point corresponding to the position may be displayed on the mobile phone screen, or the projection point may only be determined without being displayed on the mobile phone screen, depending on the projection mode.
  • in different projection modes, the mobile phone can display the same content as the big screen, display different content, or display nothing while only the big screen displays the content; this is prior art and will not be repeated here.
  • the location information and the content to be projected are projected onto the TV display screen to trigger the display screen to display the content to be projected and the projection point corresponding to the location information.
  • the information of this location may be only the location coordinates, or it may be location + indication information.
  • if only the position coordinates are sent, then after the TV receives them it needs to determine the position, determine the display information indicating that position, and display a visual element indicating the position at the determined location. For example, the TV determines the corresponding position on its display screen according to the position coordinates, and determines and displays a visual element according to its own preset display strategy, such as a cursor, arrow, or circle at the corresponding position, so that the user can intuitively see that a touch object is hovering over that position. If position + indication information is sent, the TV only needs to parse the information, determine the position and visual element it contains, and display them accordingly.
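The two message shapes can be handled on the receiving side roughly as below. The dictionary keys and the `"cursor"` default are hypothetical stand-ins for the TV's preset display strategy.

```python
def render_projection_point(message: dict) -> tuple:
    """Return (position, visual_element) for the second device to display.

    Two message shapes, per the description above:
      - coordinates only: the TV picks a visual element from its own preset strategy
      - position + indication: the TV uses the element named in the message
    """
    position = message["position"]
    # Assumed default strategy when no indication information is carried.
    element = message.get("indication", "cursor")
    return position, element
```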
  • the mobile phone can also acquire the touch operation of the user's finger on the screen of the mobile phone, perform a response operation corresponding to the touch operation, and project the execution result on the TV to trigger the TV to display the above execution result.
  • in a scenario in which a user plays a game on a mobile phone and the screen is projected to a TV, as shown in Figure 3, the user controls, on the mobile phone, the game displayed on the TV.
  • when the user hovers a finger over the mobile phone screen, he can see on the TV a cursor indicating the position of the finger, so that the user can accurately know whether his finger is at the position he wants without looking down at the phone, and can make adjustments and decisions quickly.
  • FIG. 3 only takes the user playing a game on a mobile phone as an example, and shows a screencasting scene of a mobile phone game. In essence, there are many other screencasting scenes, and any screencasting scene is within the scope of this application.
  • FIG. 4 is a schematic flowchart of a method for projecting a screen according to an embodiment of this application.
  • the execution subject of this embodiment may be the first electronic device 100 in the embodiment shown in FIG. 2.
  • the method may include:
  • S401 Detect a touch object suspended on the screen of the first electronic device.
  • the touch object is an object located above the screen of the first electronic device, such as a user's finger, a conductive object, and so on.
  • the screen of the first electronic device includes at least a self-capacitance touch sensor, and may further include a mutual-capacitance touch sensor.
  • mutual capacitance can realize direct-contact touch, while self-capacitance can realize floating (hover) touch on the touch screen, meeting the application requirements of different scenarios.
  • the method further includes: receiving a hover touch activation instruction, and activating the hover touch function through self-capacitance according to the instruction. Further, the first electronic device can sense touch objects within a certain range above the screen of the first electronic device through the hover touch function, and then perform subsequent operations based on the sensed touch objects.
  • the target identifier can be information capable of identifying the target identity, such as the target name and code.
  • the first electronic device receives the start instruction, and starts the corresponding target according to the target identifier in the instruction.
  • the user can send a start instruction to the mobile phone.
  • the start instruction carries a target identifier.
  • the mobile phone receives the start instruction and starts the corresponding target according to the target identifier.
  • multiple applications may be installed in the first electronic device, such as game applications, video applications, and social applications.
  • the game application can be used for users to play games
  • the video application can be used for users to watch videos, live broadcasts, novels and/or comics, etc.
  • the social application can be used for users to conduct video chats, voice chats, and/or text chats.
  • the game application, the video application, and the social application may be pre-installed when the first electronic device leaves the factory, or may be installed by the user after downloading.
  • the game application may be a game application (application, APP) developed by a manufacturer of the first electronic device, or a game APP developed by a third-party manufacturer.
  • the video application may be a video application developed by a manufacturer of the first electronic device, or a video APP developed by a third-party manufacturer.
  • the social application may be a social APP developed by a manufacturer of the first electronic device, or a social APP developed by a third-party manufacturer.
  • the aforementioned target can be any of the aforementioned applications (such as a game APP).
  • when the first electronic device receives a start instruction for a certain application, it starts the corresponding application according to the target identifier in the start instruction; after the corresponding application is started, the corresponding content is projected to the second electronic device for display, thereby realizing multi-screen interaction between the first electronic device and the second electronic device. The displayed content can be shared simultaneously on different platforms or devices, enriching the user's multimedia life.
  • the user can search for or select the content he wants in the main interface of the application; the first electronic device can search the cloud platform for the information the user entered in the search bar, and display icons of the content found.
  • once the user has found the desired content, he can click the icon of the content to enter the corresponding interface.
  • the application of the first electronic device obtains the resource of the content selected by the user from the cloud platform.
  • S402 Determine whether the touch object meets a response condition, where the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition.
  • the response condition can be set according to the actual situation.
  • it is set to determine that the distance between the touch object and the screen of the first electronic device meets a preset distance condition.
  • the above-mentioned preset distance condition can be set according to actual conditions, for example, 1mm-50mm.
  • This preset distance range is preset and can be other values to meet different requirements of various application scenarios.
  • the response condition further includes: the time that the touch object is suspended on the screen of the first electronic device is greater than a time threshold.
  • the subsequent operation is performed only when the time that the touch object is suspended on the screen of the first electronic device is greater than the time threshold, which reduces the probability of false touch and is suitable for practical applications.
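The combined distance-plus-time response condition can be sketched as a small debouncer. The 50mm range and the 0.3s threshold are assumed values; the patent only specifies "a preset distance condition" and "a time threshold".

```python
import time

MAX_HOVER_MM = 50.0          # assumed preset distance condition
HOVER_TIME_THRESHOLD_S = 0.3 # assumed time threshold

class HoverDebouncer:
    """Respond only when the touch object has hovered within the preset
    distance range for longer than the time threshold, reducing the
    probability of false touches as described above."""

    def __init__(self, now=time.monotonic):
        self._now = now      # injectable clock, useful for testing
        self._since = None   # when the object entered the distance range

    def update(self, distance_mm: float) -> bool:
        if distance_mm > MAX_HOVER_MM:
            self._since = None          # object left the preset distance range
            return False
        if self._since is None:
            self._since = self._now()   # object just entered the range
        return self._now() - self._since > HOVER_TIME_THRESHOLD_S
```

Each sensor reading is fed to `update`; it returns `True` only once the object has dwelt in range past the threshold.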
  • S403 Determine the position of the touch object on the screen of the first electronic device.
  • the action of determining the position of the touch object on the screen of the first electronic device may occur before, after, or at the same time when determining whether the touch object meets the response condition. Specifically, it can be determined according to actual application scenarios.
  • the above-mentioned position information may be only position coordinates, or position+indication information.
  • if only position coordinates are received, the second electronic device needs to determine the location, determine the display information indicating the location, and display a visual element indicating the location at the determined position.
  • the second electronic device determines the location based on the location coordinates.
  • the second electronic device only needs to analyze the information, determine the location and visual elements contained in the information, and display the information accordingly.
  • the above-mentioned content to be projected may be displayed on the first electronic device, or may not be displayed on the first electronic device. Specifically, it can be determined according to actual application scenarios. For example, taking the user in FIG. 3 playing a game on a mobile phone as an example, the content to be cast is displayed on both the first electronic device and the second electronic device.
  • the first electronic device can output alarm information, which carries the above-mentioned response conditions, so that the user can make corresponding adjustments in time according to the above-mentioned alarm information, and then realize subsequent screen projection operations.
  • the second electronic device before the location information and the content to be projected are projected on the second electronic device, it further includes:
  • when the first electronic device detects that a screen projection connection is not currently established, the first electronic device searches for screen projection devices, displays the devices found, prompts the user to select a screen projection device, receives the user's selection, determines the selected screen projection device to be the second electronic device, and establishes a screen projection connection with the second electronic device.
  • the first electronic device when the first electronic device detects that the screen projection connection is not currently established, the first electronic device searches for one or more screen projection devices connected to it.
  • the screen projection device involved in the embodiment of the present application is an electronic device with projection/reception capability.
  • the first electronic device displays a screen projection connection search box.
  • the first electronic device establishes a screen-casting connection with the device selected by the user. For example, if the screen projection device selected by the user is a TV in the living room, the first electronic device establishes a screen projection connection with the TV in the living room.
  • the second electronic device may return a screen projection connection response.
  • the first electronic device and the second electronic device exchange and negotiate performance parameters; that is, the first electronic device can obtain the image size parameters of the second electronic device, the data format parameters the second electronic device can receive, and so on.
  • the user can also send a screen projection device replacement request to the first electronic device, and the request carries the screen projection device identifier.
  • the first electronic device can first query the searched screen projection devices for the device corresponding to the screen projection device identifier carried in the above request; if it is found, the first electronic device compares the identifier of the current second electronic device with the identifier carried in the request, and if they differ, a new second electronic device is determined according to the identifier carried in the request and the subsequent steps are performed, meeting the need to replace the screen projection device in practical applications.
  • the method further includes:
  • Compression processing is performed on the location information and the content to be projected.
  • the amount of storage required to represent the original content to be projected is reduced, facilitating the transmission and storage of data.
  • video compression technology may be used to compress the content to be projected.
  • the video compression technology can include: MPEG1, MPEG2, etc.
  • audio compression technology can be used to compress the content to be projected.
  • audio compression technology can include: MP3, WMA, etc.
  • image compression technology can be used to compress the content to be projected.
  • the image compression technology may include: transform coding, entropy coding and so on.
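A minimal sketch of compressing the projection payload before transmission. The patent names MPEG1/MPEG2 for video, MP3/WMA for audio, and transform/entropy coding for images; here `zlib` stands in for all of them so the sketch stays self-contained, and the `kind` parameter is a hypothetical dispatch point.

```python
import zlib

def compress_payload(content: bytes, kind: str) -> bytes:
    """Compress the content to be projected, reducing the amount of storage
    needed to represent the original content before transmission.

    A real implementation would dispatch on `kind` to a video, audio, or
    image codec; zlib is used here purely as an illustrative stand-in.
    """
    assert kind in {"video", "audio", "image"}
    return zlib.compress(content)
```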
  • the projecting the location information and the content to be projected onto the second electronic device includes:
  • the 60GHz wireless communication spectrum can realize the transmission of wireless high-definition audio and video signals, bringing more complete high-definition video solutions for multimedia applications.
  • projecting the above-mentioned location information and the content to be projected to the second electronic device over the 60GHz wireless communication spectrum can bring a wireless-transmission experience with latency as low as wired transmission, meeting the needs of applications with strict latency requirements, such as game projection.
  • the method further includes:
  • a communication establishment request is received; for example, the user's finger slides down on the main interface of the first electronic device. When the first electronic device detects this operation on the main interface, it determines that a communication establishment request has been received and displays a communication management interface, which can include a mobile data icon, a 60GHz wireless communication spectrum icon, a Wi-Fi icon, and so on.
  • the first electronic device automatically searches, through the 60GHz wireless communication spectrum, for screen projection devices (electronic devices with projection/reception capability) connected to that spectrum.
  • the first electronic device displays the found screen projection devices on the above interface, prompting the user to select one with which to establish a screen projection connection. After the user selects the projection device he wants to project to, the first electronic device establishes a projection connection with it. In other feasible implementation manners, the user can also activate the 60GHz wireless communication spectrum projection function of the first electronic device through a settings icon on the main interface of the first electronic device.
  • the user can control, through the remote control of the second electronic device, the playback of the content to be projected on the second electronic device.
  • the second electronic device sends a pause playback instruction to the first electronic device.
  • when the first electronic device receives the pause playback instruction, it pauses the playback of the content to be projected.
  • the first electronic device will suspend the transmission of the above-mentioned location information and the content to be projected, and the second electronic device will also pause the playback because there is no content to be projected.
  • the second electronic device sends a continue playing instruction to the first electronic device.
  • when the first electronic device receives the continue playing instruction, it continues playing from the current playback progress of the content to be projected and resumes transmitting the above-mentioned location information and content to be projected, so the second electronic device continues to play the content to be projected as it receives it.
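The pause/continue exchange on the first-device side can be modeled with a small state machine. The instruction strings and the integer `progress` counter are illustrative simplifications.

```python
class ProjectionSource:
    """First-device side of the pause/continue exchange described above."""

    def __init__(self):
        self.playing = True
        self.progress = 0   # stand-in for the current playback progress

    def on_instruction(self, instruction: str) -> None:
        if instruction == "pause":
            self.playing = False   # stop transmitting position + content
        elif instruction == "continue":
            self.playing = True    # resume from the current playback progress

    def tick(self) -> None:
        if self.playing:
            self.progress += 1     # advance and transmit one unit of content
```

While paused, `tick` transmits nothing, so the second device also pauses simply because no content arrives, matching the behavior described above.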
  • the first electronic device projects the position information of the touch object and the content to be projected onto the second electronic device, and the second electronic device displays the content to be projected together with a projection point indicating the position, so that when the user looks up at the second electronic device, the projection point on the second electronic device can be used to determine the position of the touch object on the screen of the first electronic device.
  • if the user needs to operate the projected content, he can accurately perform the corresponding operation on the first electronic device based on that position, without looking down at the screen of the first electronic device, which is convenient to use and improves the user experience.
  • FIG. 5 is a schematic flowchart of another screen projection method proposed in an embodiment of this application.
  • the execution subject of this embodiment may be the first electronic device 100 in the embodiment shown in FIG. 2. As shown in FIG. 5, the method may include:
  • S501 Detect a touch object suspended on the screen of the first electronic device.
  • S502 Determine whether the touch object meets a response condition, where the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition.
  • S503 Determine the position of the touch object on the screen of the first electronic device.
  • steps S501-S504 are implemented in the same manner as the foregoing steps S401-S404, and will not be repeated here.
  • S505 Acquire a touch operation of the touch object on the screen of the first electronic device.
  • since the screen of the first electronic device includes a mutual-capacitance touch sensor, and mutual capacitance can realize direct-contact touch, the first electronic device can obtain the touch operation of the touch object on the screen of the first electronic device based on the mutual capacitance.
  • the screen of the first electronic device includes a self-capacitance touch sensor, and the self-capacitance can realize floating touch on the touch screen.
  • the first electronic device can also acquire the floating touch operation of the touch object on the screen of the first electronic device.
  • a capacitance change corresponding to the hovering touch operation will be generated on the screen of the first electronic device.
  • the floating touch operation will affect the capacitance arrays in the horizontal electrodes and the vertical electrodes of the screen of the first electronic device, so that the capacitance in the screen of the first electronic device changes.
  • the above floating touch operation is a single-point touch. That is, during single-point touch, only the capacitance of one touch point changes.
  • the above floating touch operation is a multi-point touch. That is, during multi-touch, the capacitance of multiple touch points will change.
  • the touch point array is arranged on the horizontal electrodes and the vertical electrodes of the screen of the first electronic device.
  • when the hovering touch operation triggers the touch point 21 on the screen of the first electronic device, the capacitance on the horizontal and vertical coordinates of the corresponding touch point 21 changes.
  • the hovering touch operation triggers the touch point 22 on the screen of the first electronic device, the capacitance corresponding to the horizontal and vertical coordinates of the touch point 22 changes.
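The single-point and multi-point cases above can be sketched as a scan over a grid of capacitance changes. This is a minimal sketch: the grid size and the noise threshold are illustrative assumptions, not values from the application, and a real panel would derive them from its electrode layout.

```python
# Sketch: find which touch points in a capacitance array report a change
# above a noise threshold. THRESHOLD and the grid are assumptions for
# illustration only.
THRESHOLD = 0.5

def touched_points(cap_delta):
    """Return (row, col) pairs whose capacitance change exceeds THRESHOLD."""
    return [(r, c)
            for r, row in enumerate(cap_delta)
            for c, delta in enumerate(row)
            if delta > THRESHOLD]

grid = [[0.0, 0.0, 0.0],
        [0.0, 0.9, 0.0],   # single hovering finger over point (1, 1)
        [0.0, 0.0, 0.0]]
assert touched_points(grid) == [(1, 1)]          # single-point touch

grid[2][2] = 0.8
assert touched_points(grid) == [(1, 1), (2, 2)]  # multi-point touch
```

With a single-point hover only one entry changes; a multi-point hover changes several, which is exactly the distinction the two embodiments above draw.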
  • S506 Determine the new content to be projected corresponding to the response result of the touch operation, and project the new content to be projected on the second electronic device for display.
  • the new content to be projected can also be compressed to reduce the amount of storage required to represent the original content and improve the efficiency of data transmission and storage.
  • projecting the new content to be projected onto the second electronic device for display may include:
  • Projecting the new content to be projected onto the second electronic device through the 60 GHz wireless communication spectrum. That is, file transmission and data synchronization between devices are performed through the 60GHz wireless communication spectrum, which reduces the delay of information transmission.
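The compress-then-transmit step described above can be sketched with a generic byte compressor. Here `zlib` is only a stand-in for whatever codec the devices would actually use, and the 60 GHz link itself is abstracted away; this is an illustration of the compression step, not the application's implementation.

```python
import zlib

def prepare_frame(content: bytes) -> bytes:
    """Compress the content to be projected before sending it over the link."""
    return zlib.compress(content)

def receive_frame(payload: bytes) -> bytes:
    """Inverse step on the second electronic device: decompress, then display."""
    return zlib.decompress(payload)

frame = b"screen pixels " * 1000        # highly redundant frame data
payload = prepare_frame(frame)
assert receive_frame(payload) == frame  # round-trips losslessly
assert len(payload) < len(frame)        # less data to store and transmit
```

Sending the smaller payload is what reduces the transmission cost; the second electronic device decompresses before display, matching the receive-decompress-display step described for S708 below.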
  • the first electronic device may also preset the corresponding relationship between the touch operation and the response result before determining the new content to be projected corresponding to the response result of the touch operation.
  • the correspondence relationship need not be one-to-one: the same touch operation may correspond to different response results depending on the interface currently displayed.
  • for example, on the short message sending interface the response result corresponding to the touch operation is sending a short message, while on the phone answering interface the corresponding response result is answering the phone.
  • based on the correspondence, the response result is determined, and then the new content to be projected corresponding to the response result of the above-mentioned touch operation is determined. If no response result corresponding to the above-mentioned touch operation and the currently displayed interface is found in the correspondence, a corresponding prompt, such as a response failure prompt, may be generated so that the user can view and handle it in time.
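The interface-dependent correspondence described above can be sketched as a lookup keyed on both the touch operation and the currently displayed interface. The operation and interface names here are illustrative assumptions, not identifiers from the application.

```python
# Sketch of the preset correspondence: the same touch operation maps to
# different response results on different interfaces. Keys are illustrative.
RESPONSES = {
    ("tap", "sms_compose"): "send_short_message",
    ("tap", "incoming_call"): "answer_phone",
}

def respond(operation, interface):
    result = RESPONSES.get((operation, interface))
    if result is None:
        # no matching entry: generate a prompt the user can view and handle
        return "response_failure_prompt"
    return result

assert respond("tap", "sms_compose") == "send_short_message"
assert respond("tap", "incoming_call") == "answer_phone"
assert respond("swipe", "sms_compose") == "response_failure_prompt"
```

The compound key makes the one-to-many behavior explicit: a single `"tap"` yields different response results purely because the interface differs.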
  • the first electronic device may also preset the correspondence between the movement track of the touch operation and the response result. That is, each movement track corresponds to one response result.
  • the response result corresponding to the movement trajectory from top to bottom is to move the interface up and down
  • the response result corresponding to the movement trajectory from front to back is to enlarge the interface.
  • based on the correspondence, the response result is determined, and then the new content to be projected corresponding to the response result of the aforementioned touch operation is determined.
  • if no matching response result is found, a corresponding prompt can be generated to prompt the user to deal with it in time.
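The trajectory-based variant can be sketched the same way, with one response result per movement track and a prompt as the fallback. The trajectory labels and result names are illustrative stand-ins.

```python
# Sketch: each movement trajectory corresponds to one response result; an
# unknown trajectory produces a prompt. Labels are assumptions, not the
# application's identifiers.
TRAJECTORY_RESPONSES = {
    "top_to_bottom": "move_interface_up_down",
    "front_to_back": "enlarge_interface",
}

def respond_to_trajectory(trajectory):
    return TRAJECTORY_RESPONSES.get(trajectory, "prompt_user")

assert respond_to_trajectory("top_to_bottom") == "move_interface_up_down"
assert respond_to_trajectory("front_to_back") == "enlarge_interface"
assert respond_to_trajectory("diagonal") == "prompt_user"
```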
  • the first electronic device projects the position of the touch object and the content to be projected onto the second electronic device to trigger the second electronic device to display the content to be projected and the projection point indicating the position; therefore, when the user looks up at the second electronic device, the location of the touch object can be determined from the projection point on the second electronic device.
  • based on the aforementioned projection point and touch object, the user can accurately operate on the projected content on the first electronic device without looking down at the screen of the first electronic device, which is convenient to use and improves user experience.
  • FIG. 7 is a schematic flowchart of another screen projection method proposed in an embodiment of this application.
  • the execution subject of this embodiment may be the first electronic device 100 in the embodiment shown in FIG. 2.
  • the method may include:
  • the target game application is any one or more game applications.
  • the first electronic device receives a start instruction, the start instruction carries a game identifier, and the corresponding game application is started according to the game identifier.
  • the aforementioned target game application may be started after the 60 GHz wireless communication spectrum is started, and the specific time to start the target game application may be determined according to actual conditions.
  • S703 Search for the screen projection device, and establish a screen projection connection with the searched second electronic device.
  • the first electronic device activates the 60GHz wireless communication spectrum projection function.
  • the first electronic device automatically searches for screen projection devices connected to the 60 GHz wireless communication spectrum through the 60 GHz wireless communication spectrum.
  • the first electronic device displays the searched screen projection devices on the above interface to prompt the user to select a screen projection device from the searched screen projection devices to establish a screen projection connection.
  • the first electronic device establishes a projection connection with the projection device selected by the user.
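The search-select-connect flow of S703 can be sketched with three hypothetical callbacks. `discover`, `choose`, and `connect` are illustrative stand-ins for the device search, the user's selection from the displayed list, and the connection setup; none of them are APIs named in this application.

```python
# Sketch of S703: search for projection devices, let the user pick one from
# the list, and establish the screen projection connection.
def establish_projection(discover, choose, connect):
    devices = discover()        # devices reachable over the wireless link
    if not devices:
        return None             # nothing found, no connection established
    selected = choose(devices)  # user selects from the displayed devices
    connect(selected)           # establish the screen projection connection
    return selected

found = establish_projection(
    discover=lambda: ["living-room TV", "meeting-room display"],
    choose=lambda devs: devs[0],
    connect=lambda dev: None,
)
assert found == "living-room TV"
```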
  • S704 Activate the floating touch function to detect the touch object floating on the screen of the first electronic device.
  • the screen of the first electronic device includes a self-capacitance touch sensor, where the self-capacitance can realize floating touch on the touch screen.
  • the first electronic device may receive a hovering touch activation instruction, activate the hovering touch function through self-capacitance according to the instruction, sense the touch object within a certain range above the screen of the first electronic device through the hovering touch function, and perform subsequent operations based on the sensed touch object.
  • S705 Determine whether the aforementioned touch object meets a response condition, where the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition.
  • S706 Determine the position of the touch object on the screen of the first electronic device.
  • the action of determining the position of the touch object on the screen of the first electronic device may occur before, after, or at the same time as determining whether the touch object meets the response condition.
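The response condition of S705, including the optional hover-time variant mentioned later in this document, can be sketched as a simple predicate. The concrete threshold values below are assumptions for illustration; the application only states that a preset distance condition and, optionally, a time threshold must be met.

```python
# Sketch of S705: the touch object meets the response condition when its
# distance from the screen satisfies a preset distance condition and it
# has hovered longer than a time threshold. Values are illustrative.
MAX_DISTANCE_MM = 30.0   # assumed preset distance condition
MIN_HOVER_S = 0.5        # assumed time threshold

def meets_response_condition(distance_mm, hover_seconds):
    return distance_mm <= MAX_DISTANCE_MM and hover_seconds > MIN_HOVER_S

assert meets_response_condition(10.0, 1.2)
assert not meets_response_condition(50.0, 1.2)   # too far from the screen
assert not meets_response_condition(10.0, 0.1)   # hover time too short
```

Gating projection on both conditions filters out transient passes of a finger over the screen, which is why the time threshold appears as an additional response condition in the later embodiment.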
  • the aforementioned content to be cast can be determined according to the aforementioned launched game application.
  • S708 Project the processed location information and the content to be projected onto the second electronic device through the 60GHz wireless communication spectrum to trigger the second electronic device to display the content to be projected and the projection point indicating the location.
  • the second electronic device receives the compressed content to be screened, decompresses the content, and displays the decompressed content.
  • S709 Acquire the touch operation of the touch object on the screen of the first electronic device.
  • the screen of the first electronic device includes a mutual capacitance touch sensor, the mutual capacitance can realize direct contact touch, and the first electronic device can acquire the touch operation of the touch object on the screen of the first electronic device based on the mutual capacitance.
  • S710 Determine the new content to be projected corresponding to the response result of the touch operation, and project the new content to be projected on the second electronic device for display.
  • the first electronic device projects the position information of the touch object hovering over the screen of the first electronic device, together with the content to be projected, onto the second electronic device, triggering the second electronic device to display the content to be projected and the projection point indicating the position.
  • the first electronic device then acquires the touch operation of the touch object on the screen of the first electronic device, determines the new content to be projected corresponding to the response result of the touch operation, and projects the new content onto the second electronic device for display.
  • as a result, when the user looks up at the second electronic device and needs to operate on the projected content, the user does not need to look down at the screen of the first electronic device: based on the projection point and the touch object, the user can accurately perform the corresponding operation on the projected content on the first electronic device, which is convenient to use and improves user experience.
  • FIG. 8 is a schematic structural diagram of a screen projection device provided by this application.
  • the device includes: a detection module 801, a judgment module 802, a determination module 803, and a screen projection module 804.
  • the detection module 801 is used to detect the touch object suspended on the screen of the first electronic device.
  • the judgment module 802 is configured to determine whether the touch object meets a response condition, and the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition.
  • the determination module 803 is configured to determine the position of the touch object on the screen of the first electronic device.
  • the projection module 804 is configured to, in response to determining that the touch object meets the response condition, project the position information and the content to be projected onto the second electronic device to trigger the second electronic device to display the content to be projected and the projection point indicating the position.
  • the device in this embodiment can correspondingly be used to execute the technical solutions in the foregoing method embodiments, and its implementation principles and technical effects are similar, and will not be repeated here.
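The four-module split of FIG. 8 can be sketched as a single class whose methods mirror modules 801-804. This is a structural sketch only: the sensor-reading format, the distance limit, and the method bodies are illustrative placeholders, not the device's actual implementation.

```python
# Sketch mirroring FIG. 8: detection (801), judgment (802), determination
# (803), and projection (804) as cooperating methods of one class.
class ScreenProjectionDevice:
    def __init__(self, distance_limit_mm=30.0):
        self.distance_limit_mm = distance_limit_mm  # assumed preset distance condition

    def detect(self, sensor_reading):               # detection module 801
        return sensor_reading.get("object_present", False)

    def meets_condition(self, distance_mm):         # judgment module 802
        return distance_mm <= self.distance_limit_mm

    def position(self, sensor_reading):             # determination module 803
        return sensor_reading["x"], sensor_reading["y"]

    def project(self, position, content):           # projection module 804
        return {"point": position, "content": content}

dev = ScreenProjectionDevice()
reading = {"object_present": True, "x": 120, "y": 340}
if dev.detect(reading) and dev.meets_condition(12.0):
    frame = dev.project(dev.position(reading), "home screen")
assert frame == {"point": (120, 340), "content": "home screen"}
```

Keeping the judgment and determination steps as separate methods reflects the note above that position determination may run before, after, or concurrently with the response-condition check.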
  • FIG. 9 is a schematic structural diagram of another screen projection device provided by this application. As shown in FIG. 9, this embodiment further includes a compression module 805 based on the embodiment in FIG. 8.
  • the screen projection module 804 is specifically used for:
  • the projection module 804 projects the location information and the content to be projected onto the second electronic device, it is further used to:
  • the new content to be cast corresponding to the response result of the touch operation is determined, and the new content to be cast is projected to the second electronic device for display.
  • the compression module 805 is configured to: before the projection module 804 projects the location information and the content to be projected onto the second electronic device, the location information and all the information The content to be projected is compressed.
  • the response condition further includes: the time that the touch object is suspended on the screen of the first electronic device is greater than a time threshold.
  • the screen of the first electronic device includes a self-capacitance touch sensor.
  • the device in this embodiment can correspondingly be used to execute the technical solutions in the foregoing method embodiments, and its implementation principles and technical effects are similar, and will not be repeated here.
  • FIG. 10 is a schematic structural diagram of another screen projection device provided by this application.
  • the device includes: an activation module 1001, an activation module 1002, a search module 1003, a detection module 1004, a judgment module 1005, a determination module 1006, a compression module 1007, and a projection module 1008.
  • the start module 1001 is used to start the target game application.
  • the activation module 1002 is used to activate the 60 GHz wireless communication spectrum.
  • the search module 1003 is used to search for screen projection devices and establish a screen projection connection with the searched second electronic device.
  • the detection module 1004 is used to activate the floating touch function and detect the touch object suspended on the screen of the first electronic device.
  • the judgment module 1005 is configured to determine whether the touch object meets a response condition, and the response condition includes that the distance between the touch object and the screen of the first electronic device meets a preset distance condition.
  • the determination module 1006 is used to determine the position of the touch object on the screen of the first electronic device.
  • the compression module 1007 is configured to, in response to determining that the touch object satisfies the response condition, compress the location information and the content to be cast, wherein the content to be cast is determined according to the target game application.
  • the projection module 1008 is used to project the processed position information and the content to be projected onto the second electronic device through the 60 GHz wireless communication spectrum, so as to trigger the second electronic device to display the content to be projected and the projection point indicating the position; to acquire the touch operation of the touch object on the screen of the first electronic device; and to determine the new content to be projected corresponding to the response result of the touch operation and project the new content to be projected onto the second electronic device for display.
  • the device in this embodiment can correspondingly be used to execute the technical solutions in the foregoing method embodiments, and its implementation principles and technical effects are similar, and will not be repeated here.
  • FIG. 11 schematically provides a possible basic hardware architecture of the computing device described in this application.
  • the computing device 1100 includes a processor 1101, a memory 1102, a communication interface 1103, and a bus 1104.
  • the computing device 1100 may be a computer or a server, which is not particularly limited in this application.
  • the number of processors 1101 may be one or more, and FIG. 11 only illustrates one of the processors 1101.
  • the processor 1101 may be a central processing unit (CPU). If the computing device 1100 has multiple processors 1101, the types of the multiple processors 1101 may be different or may be the same.
  • multiple processors 1101 of the computing device 1100 may also be integrated into a multi-core processor.
  • the memory 1102 stores computer instructions and data; the memory 1102 can store computer instructions and data required to implement the screen projection method provided in the present application.
  • the memory 1102 stores instructions for implementing the steps of the screen projection method.
  • the memory 1102 may be any one or any combination of the following storage media: non-volatile memory (for example, read-only memory (ROM), solid-state drive (SSD), hard disk drive (HDD), or optical disc) and volatile memory.
  • the communication interface 1103 may be any one or any combination of the following devices: a network interface (for example, an Ethernet interface), a wireless network card, and other devices with a network access function.
  • the communication interface 1103 is used for data communication between the computing device 1100 and other computing devices or electronic devices.
  • FIG. 11 shows the bus 1104 with a thick line.
  • the bus 1104 may connect the processor 1101 with the memory 1102 and the communication interface 1103. In this way, through the bus 1104, the processor 1101 can access the memory 1102, and can also use the communication interface 1103 to interact with other computing devices or electronic devices.
  • the computing device 1100 executes computer instructions in the memory 1102, so that the computing device 1100 implements the screen projection method provided in FIG. 4 or FIG. 5 of this application, or causes the computing device 1100 to deploy the screen projection device provided in FIG. 8 or FIG. 9 .
  • the screen projection device provided in FIG. 8 or FIG. 9 can be implemented by software as shown in FIG. 11, or in hardware as a hardware module or a circuit unit.
  • FIG. 12 schematically provides a possible basic hardware architecture of the computing device described in this application.
  • the computing device 1200 includes a processor 1201, a memory 1202, a communication interface 1203, and a bus 1204.
  • the computing device 1200 may be a computer or a server, which is not particularly limited in this application.
  • the number of processors 1201 may be one or more, and FIG. 12 only illustrates one of the processors 1201.
  • the processor 1201 may be a central processing unit. If the computing device 1200 has multiple processors 1201, the types of the multiple processors 1201 may be different or may be the same.
  • multiple processors 1201 of the computing device 1200 may also be integrated into a multi-core processor.
  • the memory 1202 stores computer instructions and data; the memory 1202 can store computer instructions and data required to implement the screen projection method provided in the present application.
  • the memory 1202 stores instructions for implementing the steps of the screen projection method.
  • the memory 1202 may be any one or any combination of the following storage media: non-volatile memory (for example, read-only memory, solid-state hard disk, hard disk, optical disk), and volatile memory.
  • the communication interface 1203 may be any one or any combination of the following devices: a network interface (for example, an Ethernet interface), a wireless network card, and other devices with a network access function.
  • the communication interface 1203 is used for data communication between the computing device 1200 and other computing devices or electronic devices.
  • FIG. 12 shows the bus 1204 with a thick line.
  • the bus 1204 may connect the processor 1201 with the memory 1202 and the communication interface 1203. In this way, through the bus 1204, the processor 1201 can access the memory 1202, and can also use the communication interface 1203 to interact with other computing devices or electronic devices.
  • the computing device 1200 executes computer instructions in the memory 1202 to enable the computing device 1200 to implement the screen projection method provided in FIG. 7 of this application, or to cause the computing device 1200 to deploy the screen projection device provided in FIG. 10.
  • in addition to being implemented by software as shown in FIG. 12 above, the screen projection device provided in FIG. 10 can also be implemented in hardware as a hardware module or a circuit unit.
  • the present application also provides a computer-readable storage medium.
  • the computer-readable storage medium stores computer instructions, and the computer instructions instruct a computing device to execute the screen projection method provided in FIG. 4 or FIG. 5 of this application.
  • the present application also provides a computer-readable storage medium.
  • the computer-readable storage medium stores computer instructions, and the computer instructions instruct a computing device to execute the screen projection method provided in FIG. 7 of the present application.
  • This application also provides a computer program product containing instructions, which when the computer program product runs on an electronic device, causes the electronic device to execute the screen projection method provided in FIG. 4 or FIG. 5 of this application.
  • This application also provides a computer program product containing instructions, which when the computer program product runs on an electronic device, causes the electronic device to execute the screen projection method provided in FIG. 7 of this application.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit may be implemented in the form of hardware, or may be implemented in the form of hardware plus software functional units.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention relates to a screen projection method and a computing device. The method comprises: detecting, by a first electronic device, a touch object hovering over a screen of the first electronic device (S401); determining whether the touch object meets a response condition, the response condition including that the distance between the touch object and the screen of the first electronic device meets a distance condition (S402); determining the position of the touch object on the screen of the first electronic device (S403); and, in response to determining that the touch object meets the response condition, projecting information of the aforementioned position and the content to be projected onto a second electronic device, so as to cause the second electronic device to display the content to be projected and a projection point indicating the aforementioned position (S404). Consequently, when a user looks at the second electronic device, the user can determine the position of the touch object on the screen of the first electronic device from the projection point on the second electronic device; at that moment, if the user needs to manipulate the projected on-screen content, the user can accurately perform an operation on the first electronic device based on the aforementioned position without lowering his or her head to check the screen of the first electronic device, so that use is convenient and user experience is improved.
PCT/CN2020/085499 2019-04-26 2020-04-20 Screen projection method and computing device WO2020216156A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201910345141.X 2019-04-26
CN201910345141 2019-04-26
CN201911076649.0A CN111061445A (zh) 2019-04-26 2019-11-06 Screen projection method and computing device
CN201911076649.0 2019-11-06

Publications (1)

Publication Number Publication Date
WO2020216156A1 true WO2020216156A1 (fr) 2020-10-29

Family

ID=70298466

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085499 WO2020216156A1 (fr) 2019-04-26 2020-04-20 Procédé de projection d'écran et dispositif informatique

Country Status (2)

Country Link
CN (1) CN111061445A (fr)
WO (1) WO2020216156A1 (fr)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112015508B (zh) * 2020-08-29 2024-01-09 努比亚技术有限公司 Screen projection interaction control method, device, and computer-readable storage medium
CN112069011B (zh) * 2020-09-08 2024-09-13 西安万像电子科技有限公司 Information acquisition method and apparatus, and electronic device
CN112153457A (zh) * 2020-09-10 2020-12-29 Oppo(重庆)智能科技有限公司 Wireless screen projection connection method and apparatus, computer storage medium, and electronic device
CN112261467B (zh) * 2020-10-09 2022-11-15 深圳市锐尔觅移动通信有限公司 Screen projection content display method and apparatus, electronic device, and readable storage medium
CN112486363B (zh) * 2020-10-30 2023-12-19 华为技术有限公司 Cross-device content sharing method, electronic device, and system
CN112306442B (zh) * 2020-11-20 2023-05-12 Oppo广东移动通信有限公司 Cross-device content projection method, apparatus, device, and storage medium
CN112306443A (zh) * 2020-11-23 2021-02-02 Oppo广东移动通信有限公司 Information display method and storage medium
CN112818163B (zh) * 2021-01-22 2024-06-21 山西亦加企业管理咨询有限责任公司 Song display processing method and apparatus based on a mobile terminal, terminal, and medium
CN112905136A (zh) * 2021-03-11 2021-06-04 北京小米移动软件有限公司 Screen projection control method and apparatus, and storage medium
CN113126874B (zh) * 2021-04-15 2022-07-15 杭州当贝网络科技有限公司 Control method and control system for a terminal device
CN117880577A (zh) * 2021-06-22 2024-04-12 荣耀终端有限公司 Screen mirroring method, apparatus, device, and storage medium
CN115686252B (zh) * 2021-09-24 2023-10-20 荣耀终端有限公司 Position information calculation method in a touchscreen, and electronic device
CN116450000A (zh) * 2022-01-07 2023-07-18 荣耀终端有限公司 Touchscreen display method and apparatus, and storage medium
CN115079981A (zh) * 2022-06-28 2022-09-20 Oppo广东移动通信有限公司 Device control method and related apparatus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10349673A1 (de) * 2003-10-24 2005-05-25 Bayerische Motoren Werke Ag Device and method for data input in a motor vehicle
CN103699326A (zh) * 2013-12-27 2014-04-02 深圳天珑无线科技有限公司 Touch operation processing method and terminal device
CN105263036A (zh) * 2015-10-13 2016-01-20 深圳创维数字技术有限公司 Method for controlling a digital television receiving terminal, and terminal
CN105979334A (zh) * 2016-07-06 2016-09-28 乐视控股(北京)有限公司 Screen transmission system, terminal, and screen transmission method
CN109177899A (zh) * 2018-08-30 2019-01-11 上海天马微电子有限公司 Interaction method for a vehicle-mounted display apparatus, and vehicle-mounted display apparatus
CN110362231A (zh) * 2019-07-12 2019-10-22 腾讯科技(深圳)有限公司 Head-up touch device, and image display method and apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103576906B (zh) * 2012-07-27 2018-04-24 深圳富泰宏精密工业有限公司 Mouse icon control method and system
CN105867596A (zh) * 2015-01-22 2016-08-17 联想(北京)有限公司 Display method and electronic device
CN108989879B (zh) * 2018-08-28 2021-06-15 广州视源电子科技股份有限公司 Screen projection control method, apparatus, and system


Also Published As

Publication number Publication date
CN111061445A (zh) 2020-04-24

Similar Documents

Publication Publication Date Title
WO2020216156A1 (fr) Screen projection method and computing device
WO2021213120A1 (fr) Screen projection method and apparatus, and electronic device
WO2020168965A1 (fr) Method for controlling a foldable-screen electronic device, and electronic device
WO2021078284A1 (fr) Content continuation method and electronic device
WO2021000807A1 (fr) Processing method and apparatus for a waiting scenario in an application
WO2022257977A1 (fr) Screen projection method for an electronic device, and electronic device
WO2021052214A1 (fr) Hand-gesture interaction method and apparatus, and terminal device
WO2020224449A1 (fr) Split-screen display operation method and electronic device
WO2020244623A1 (fr) Method for implementing a 3D mouse mode, and related device
WO2020143380A1 (fr) Data transmission method and electronic device
CN111316598A (zh) Multi-screen interaction method and device
WO2020173370A1 (fr) Method for moving application icons, and electronic device
WO2021023035A1 (fr) Lens switching method and apparatus
CN111132234A (zh) Data transmission method and corresponding terminal
WO2021036898A1 (fr) Application activation method for a foldable-screen apparatus, and related device
WO2021036830A1 (fr) Method for displaying an application on a foldable screen, and electronic device
WO2022001619A1 (fr) Screenshot method and electronic device
WO2022028537A1 (fr) Device recognition method and related apparatus
WO2022116930A1 (fr) Content sharing method, electronic device, and storage medium
WO2022135163A1 (fr) Screen projection display method and electronic device
WO2022042769A2 (fr) Multi-screen interaction system and method, apparatus, and storage medium
CN112130788A (zh) Content sharing method and apparatus
CN113923230A (zh) Data synchronization method, electronic device, and computer-readable storage medium
WO2020133006A1 (fr) Bluetooth Low Energy-based communication method and related apparatus
WO2022007944A1 (fr) Device control method and related apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20795128

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20795128

Country of ref document: EP

Kind code of ref document: A1