CN114915834A - Screen projection method and electronic equipment

Screen projection method and electronic equipment

Info

Publication number
CN114915834A
Authority
CN
China
Prior art keywords
instruction
screen
screen projection
multimedia content
gesture
Prior art date
Legal status
Pending
Application number
CN202110584296.6A
Other languages
Chinese (zh)
Inventor
陈兰昊
徐世坤
于飞
孟庆吉
杜奕全
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to PCT/CN2022/073202 (published as WO2022166618A1)
Publication of CN114915834A

Classifications

    • H04N21/43637: Adapting the video stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • G06F3/1454: Digital output to display device; copying display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • H04N21/4122: Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • H04N21/41407: Specialised client platforms embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N21/4222: Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
    • H04N21/4316: Generation of visual interfaces for content selection or interaction; displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/44218: Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application provides a screen projection method and an electronic device. When a user needs to project a screen, sensor data is acquired. When the sensor data is recognized as a first type of instruction, at least one of the following operations is performed: projecting a mirror image of the displayed content to the second device, and sending service data to the second device. When the sensor data is recognized as a second type of instruction, at least one of the following operations is performed: projecting a mirror image of the displayed content to the first device; sending service data to the first device; stopping projecting the content mirror image to the second device; and sending an instruction to the second device. The scheme provided by this application accommodates a variety of screen projection scenarios with uniformly implemented logic, providing users with a better and more convenient screen projection experience.

Description

Screen projection method and electronic equipment
Technical Field
Embodiments of the present application relate to the field of electronic technology, and in particular to a screen projection method and an electronic device.
Background
When a user owns multiple devices, screen content displayed on one device can be projected onto the screen of another device. For example, multimedia content or a game interface played on a small-screen device (e.g., a mobile phone) can be projected onto a large-screen device (e.g., a computer or a smart television) for playback, using the large-screen device's display and speakers to give the user a better experience.
In the prior art, screen projection is a complex operation: it covers a variety of screen projection scenarios and requires the user to perform multiple steps, resulting in an inconvenient operating experience.
Disclosure of Invention
Embodiments of the present application provide a screen projection method and an electronic device, so that when a user needs to project a screen, the projection can be performed conveniently and quickly, improving the user experience.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In one possible design, the first device or the second device acquires sensor data. When the first device or the second device parses the sensor data into a first type of instruction, at least one of the following operations is performed: the first device projects a mirror image of its displayed content to the second device, and the first device sends service data to the second device. When the sensor data is parsed into a second type of instruction, at least one of the following operations is performed: the first device projects the mirror image of the displayed content to the first device; the first device sends the service data to the first device; the first device stops projecting the mirror image to the second device; and the first device sends a control instruction to the second device.
In one possible design, the service data includes at least one of the following: the name of the multimedia content, an identifier of the multimedia content, a uniform resource locator of the multimedia content, the playing progress of the multimedia content, the playing volume of the multimedia content, and the type of the multimedia content. In this way, screen projection of multimedia content can be achieved: the second device can use this information to continue playing the multimedia content on the second device.
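As an illustration (not part of the original disclosure), the service data listed above could be modeled as a small record. The following Kotlin sketch is an assumption for exposition; the class and field names do not come from the patent.

```kotlin
// Hypothetical model of the "service data" payload; names are assumptions.
enum class MediaType { VIDEO, AUDIO }

data class ServiceData(
    val name: String,       // name of the multimedia content, e.g. "Run"
    val contentId: String,  // identifier of the content within the playing application
    val url: String,        // uniform resource locator of the content
    val positionMs: Long,   // playing progress in milliseconds
    val volume: Float,      // playing volume, e.g. in the range 0.0..1.0
    val type: MediaType     // type of the multimedia content
)
```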
In one possible design, the first device sending the service data to the second device includes: when the current application of the first device is a video playing application, calling an application program interface of the video playing application to acquire the service data and sending the service data to the second device, so that the second device continues playing the multimedia content according to the service data.
In one possible design, the first device projecting the content mirror image to the second device further includes: projecting the mirror image of the displayed content to the second device using the Miracast protocol, where Miracast is a mirror-mode screen projection protocol.
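For context (an editorial sketch, not the patent's implementation): on Android, a sender can check whether a wireless Miracast-style display route is active through the system MediaRouter, which is one plausible building block for the mirroring described here.

```kotlin
import android.content.Context
import android.media.MediaRouter
import android.view.Display

// Minimal sketch: query whether a wireless (Miracast-style) display is
// currently routed. The patent only names Miracast as the mirroring protocol.
fun presentationDisplayOrNull(context: Context): Display? {
    val router = context.getSystemService(Context.MEDIA_ROUTER_SERVICE) as MediaRouter
    val route = router.getSelectedRoute(MediaRouter.ROUTE_TYPE_LIVE_VIDEO)
    return route?.presentationDisplay // non-null while a remote display is connected
}
```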
In one possible design, after mirror-projecting the displayed content to the second device, the method further includes: the first device sets the foreground application associated with the displayed content to be displayed in a floating window or in picture-in-picture mode.
In one possible design, the gesture associated with the first type of instruction is a bottom-to-top air gesture, a three-finger upward slide, or a four-finger upward slide.
In one possible design, the gesture associated with the second type of instruction is a top-to-bottom air gesture, a three-finger downward slide, or a four-finger downward slide.
In one possible design, the first device mirror-projecting display content to the second device includes: the first device locates the screen projection control on the current interface through image analysis, triggers the application's built-in screen projection function by simulating a user operation, and projects the mirror image of the displayed content to the second device. In this way, when the application does not provide an application program interface for screen projection, the screen projection function inside the application can still be used.
In one possible design, the first device sending the service data to the second device includes: the first device locates the screen projection control on the current interface through image analysis, triggers the application's built-in screen projection function by simulating a user operation, and sends the service data to the second device. As above, this enables screen projection even when the application provides no application program interface for it.
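As a rough sketch of the simulated user operation in these two designs (an editorial illustration under assumptions, not the patent's code): on Android, an accessibility service can dispatch a tap at coordinates that a hypothetical image-analysis step reports for the in-app screen projection control.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path
import android.view.accessibility.AccessibilityEvent

// Hedged sketch: dispatch a simulated tap on the located screen projection
// control. The (x, y) position is assumed to come from image analysis.
class CastTapService : AccessibilityService() {
    override fun onAccessibilityEvent(event: AccessibilityEvent) { /* unused here */ }
    override fun onInterrupt() { /* unused here */ }

    fun tapAt(x: Float, y: Float) {
        val path = Path().apply { moveTo(x, y) }
        val stroke = GestureDescription.StrokeDescription(path, 0L, 50L)
        dispatchGesture(GestureDescription.Builder().addStroke(stroke).build(), null, null)
    }
}
```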
In one possible design, the first device or the second device stores a database of operation instructions, and parsing the sensor data includes: comparing the sensor data, or the result obtained after processing the sensor data, with the data in the database to determine whether the sensor data corresponds to the first type of instruction or the second type of instruction.
In one possible design, the first device is a mobile phone and the second device is a large-screen device.
In one possible design, the present application provides an electronic device comprising a memory and one or more processors, where the memory is configured to store computer program code comprising computer instructions; when the computer instructions are executed by the processor, the electronic device performs the screen projection method.
In one possible design, the present application provides a computer-readable storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the screen projection method.
In one possible design, the present application provides a computer program product that, when run on a computer, causes the computer to perform the screen projection method.
The scheme provided by this application accommodates a variety of screen projection scenarios, implements the logic uniformly while shielding the user from the underlying details, and provides a better and more convenient screen projection experience.
Drawings
Fig. 1 shows a hardware structure diagram of an electronic device provided in an embodiment of the present application;
Fig. 2A-2B show schematic diagrams of a usage scenario provided by an embodiment of the present application;
Fig. 3 shows a flowchart of a method provided by an embodiment of the present application;
Fig. 4 shows a flowchart of a method provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. In the description of these embodiments, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" describes an association between objects and indicates that three relationships are possible; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
The terms "first" and "second" below are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature.
Embodiments of the present application provide a screen projection method and apparatus, which can be applied to electronic devices such as mobile phones, tablet computers, wearable devices (e.g., watches, bracelets, helmets, earphones), in-vehicle devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), and smart home devices (e.g., smart televisions, smart speakers, smart cameras). It is understood that the embodiments of the present application place no limit on the specific type of electronic device.
Fig. 1 shows a schematic diagram of a hardware structure of the electronic device 100. As shown in fig. 1, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. Wherein the controller may be a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution. A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. The charging management module 140 is configured to receive charging input from a charger. The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The display screen 194 is used to display the interface of an application, such as the viewfinder interface of a camera application. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include one or more display screens 194.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or more cameras 193.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, software code of at least one application program (e.g., the Huawei Video application, Wallet, etc.), and the like. The data storage area may store data (e.g., captured images, recorded videos, etc.) generated during use of the electronic device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as pictures, videos, and the like are saved in an external memory card.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc. The SIM card interface 195 is used to connect a SIM card. The SIM card may be brought into and out of contact with the electronic device 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195.
It is to be understood that the components shown in fig. 1 do not constitute a specific limitation on the electronic device 100; the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. In addition, the combination/connection relationships between the components in fig. 1 may also be modified.
In some possible embodiments, a user may present content displayed on a first device on a second device, obtaining a better content playing experience through hardware such as the display screen and speakers of the second device. For example, the content displayed on the first device may be multimedia content (e.g., pictures, video, audio), or it may be a game, an application interface, and so on.
In some possible embodiments, the first device and the second device may each be one of the electronic devices 100; for example, the first device is a mobile phone or a tablet computer, and the second device is a smart television, a personal computer, a smart screen (referred to simply as a large screen), or the like.
Screen projection means that content on a first device is presented on a second device through a certain protocol. In some possible embodiments, the first device displays multimedia content (e.g., pictures, video, audio) on the second device by projecting the multimedia content onto it, using common protocols such as the DLNA (Digital Living Network Alliance) protocol or the Chromecast protocol. As shown in fig. 2A, which includes a mobile phone 201 and a smart television 202, the mobile phone 201 sends an identifier (e.g., a link) of the multimedia content "Run" being played, together with playing information (e.g., playing progress, playback speed, playing volume), to the smart television 202. The smart television 202 obtains the information related to "Run" from a network or a database according to the identifier and the playing information, and plays the multimedia content "Run", thereby transferring playback from one device to another; at this point, the user can control the playing progress, playback speed, volume, and so on of "Run" on the second device. In other possible embodiments, the first device presents its displayed content on the second device in a mirror image manner: for example, the displayed content is encoded and transmitted to the second device as a video stream, and the second device decodes and plays the stream. Common protocols here include the Miracast protocol and the AirPlay protocol. As shown in fig. 2B, which includes a mobile phone 203 and a smart television 204, the mobile phone projects the content displayed on it (the user interface of a video application) to the smart television 204 in a mirror image manner; that is, the content displayed on the smart television 204 is consistent with the content displayed on the mobile phone 203, and when the user interface on the mobile phone 203 changes, the display on the smart television 204 changes accordingly.
In some possible embodiments, common ways for a user to interact with the electronic device 100 include voice interaction, touch screen gestures, and air gestures. A touch screen gesture is a gesture made by the user in contact with the display screen of the electronic device 100; common touch screen gestures include single-finger operations, such as tap, press, pan, and double click, and multi-finger operations, such as pinch, three-finger slide, and rotate. An air gesture is made at a distance from the display screen of the electronic device 100: the shape of the user's hand is captured by a sensor (such as a camera or a distance sensor) of the electronic device 100 and compared with preset gestures in a database, and the corresponding operation is then executed. It can be understood that a gesture is only one way of triggering the function, and the gestures used in the present application are not limited.
In some possible embodiments, a user is watching a video in the living room on a smart television and needs to go to the balcony or another room to do something, but does not want playback to stop; that is, the video playing on the large screen should be transferred to the mobile phone for continued viewing. In some possible embodiments, after the user finishes and returns to the living room, it may be desirable to transfer the video playing on the mobile phone back to the large screen. Therefore, a method is needed to conveniently switch the playing video between the first device and the second device, so as to meet the user's video playing needs at different times and in different scenarios.
In some possible embodiments, the first device and the second device are capable of data interaction. The two devices may be located in the same local area network and interact through that network; they may use a point-to-point connection and interact directly through a P2P channel; or they may use data traffic to interact over a wide area network. The present application does not limit the method of data interaction between the first device and the second device.
In some possible embodiments, screen projection of multimedia content (which may also be referred to as streaming media) requires mutual authentication between the devices so that the projection takes place in a trusted environment; for example, the first device and the second device log in to the same account, such as a Huawei account, an Apple ID, or a Samsung account. In other possible embodiments, the first device and the second device may have the same application, or different versions of the same application, installed, and both devices may be logged in to the application, so that the second device can obtain the specified multimedia content from a database according to the received identifier (e.g., link) of the multimedia content.
In some possible embodiments, as shown in fig. 3, the embodiment includes step 301, step 302 and step 303. It is understood that these steps are optional, and their execution order can be adjusted.
The first device acquires sensor data (step 301). In some possible embodiments, when the gesture is a touch screen gesture, the first device acquires touch screen data reported by the display driver corresponding to the display screen 194. In other possible embodiments, when the gesture is an air gesture, the first device may acquire image data via a camera, or acquire millimeter-wave data via a radar sensor. In other possible embodiments, when the control command (screen projection command) is a voice command, the first device acquires the data collected by the microphone 170C.
The first device derives an operation instruction from the acquired sensor data (step 302). In some possible embodiments, the first device stores a database of sensor data and operation instructions, and compares the result of analyzing the sensor data with the data in the operation instruction database to obtain the operation instruction. For example, the first device detects touch data on the display screen 194 twice within a short time and, according to the operation instruction database, determines that the operation is a double-click; as another example, by analyzing multiple frames captured by the camera 193 in which the user's hand moves from a first position near the upper edge of the image to a second position near the lower edge, the first device determines, according to the operation instruction database, that the operation is a bottom-to-top air gesture.
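A toy version of this matching step might look as follows (an editorial sketch; the thresholds and label strings are assumptions, not patent terminology):

```kotlin
// Illustrative sketch of step 302: turn processed sensor data into gesture
// labels that can be looked up in the operation instruction database.
fun classifyTouch(downTimestampsMs: List<Long>): String? =
    // two touch-down events within 300 ms -> double-click operation
    if (downTimestampsMs.size == 2 && downTimestampsMs[1] - downTimestampsMs[0] < 300)
        "double_click" else null

fun classifyHandTrack(startY: Float, endY: Float, imageHeight: Int): String? =
    // the hand moves from near the upper edge of the camera image to near the
    // lower edge, interpreted here as a bottom-to-top air gesture
    if (startY < imageHeight * 0.2f && endY > imageHeight * 0.8f)
        "air_gesture_bottom_up" else null
```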
In the embodiments of the present application, there are two main classes of operation instructions. A first type of instruction is used to project multimedia content onto another device for playing, for example, projecting multimedia content played on the first device onto the second device. A second type of instruction is used to play, on the first device, multimedia content played on another device. The first type of instruction includes, but is not limited to: a voice instruction, touch screen gesture, or air gesture indicating that the multimedia content should play on the second device. The second type of instruction includes, but is not limited to: a voice instruction, touch screen gesture, or air gesture indicating that the multimedia content should play on the first device.
In some possible embodiments, the first type of instruction and the second type of instruction may each be associated with several voice instructions, touch screen gestures, and air gestures; when any one of them is captured, the corresponding operation is performed.
In some possible embodiments, in a scenario where a mobile phone interacts with a large screen, the user tends to hold the mobile phone while facing the large screen. The gesture associated with the first type of instruction can therefore be an open-palm, bottom-to-top air gesture, or a touch gesture sliding upward from the bottom of the screen; it may also be a multi-finger gesture that differs from the system's preset gestures, such as a three-finger or four-finger upward slide, to avoid conflicts. The gesture associated with the second type of instruction may be an open-palm, top-to-bottom air gesture, or a touch gesture sliding downward from the top of the screen; likewise, it may be a multi-finger gesture differing from the system's preset gestures, such as a three-finger or four-finger downward slide.
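Taken together, the association between gestures and the two instruction classes can be pictured as a small lookup table (an editorial sketch; the labels are assumptions):

```kotlin
// Illustrative mapping from recognized gesture labels to the two instruction
// classes: FIRST_TYPE casts to the second device, SECOND_TYPE pulls back.
enum class InstructionType { FIRST_TYPE, SECOND_TYPE }

val instructionTable = mapOf(
    "air_gesture_bottom_up"   to InstructionType.FIRST_TYPE,
    "three_finger_swipe_up"   to InstructionType.FIRST_TYPE,
    "four_finger_swipe_up"    to InstructionType.FIRST_TYPE,
    "air_gesture_top_down"    to InstructionType.SECOND_TYPE,
    "three_finger_swipe_down" to InstructionType.SECOND_TYPE,
    "four_finger_swipe_down"  to InstructionType.SECOND_TYPE
)

fun parse(gestureLabel: String): InstructionType? = instructionTable[gestureLabel]
```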
When the acquired operation instruction is the first type of instruction, the first device plays the multimedia content on the second device (step 303). When the acquired operation instruction is the second type of instruction, the first device plays the multimedia content on the first device (step 304).
Fig. 4 is a flowchart illustrating an example of a processing scheme of a first device for processing multimedia content and screen images after recognizing a first type of instruction and a second type of instruction.
The first device acquires sensor data (step 401), and identifies the operation instruction as a first type of instruction from the acquired sensor data (step 402). These steps are described under steps 301 and 302 and are not repeated here.
In some possible embodiments, the first device mirrors the displayed content to the second device (step 403). After recognizing the first type of instruction, the first device projects the displayed content to the second device in a mirror image manner, for example via the Miracast protocol.
In some possible embodiments, the first device sends the service data to the second device (step 404), and the second device parses the service data and plays the multimedia content. Since projecting the (multimedia) content is an operation within an application program, if the application provides an application program interface for screen projection, the first device calls that interface after recognizing the first type of instruction. For example, the first device identifies that the current foreground application is a video playing application that has a screen projection application program interface; the first device calls this interface, the application collects and returns service data to the first device, the first device sends the service data to the second device, and the second device continues playing the multimedia content according to the received service data.
In some possible embodiments, the service data includes one or more of the following: the name of the multimedia content, such as "Run" shown in fig. 2A; the identifier of the multimedia content, for example, the identity of "Run" within the video playing application is 12001; the uniform resource locator of the multimedia content, for example, "Run" corresponds to www.video.com/12001; the playing progress of the multimedia content; the playing volume of the multimedia content; and the type of the multimedia content, for example, the type corresponding to "Run" is video, and the type corresponding to music is audio.
In some possible embodiments, the first device sends the service data to the second device (step 404), and the second device parses the service data and plays the multimedia content. When the first device identifies the operation instruction as a first type of instruction, it locates the controls of the current interface through image analysis, triggers the application's built-in screen projection function by simulating a user operation, and sends the service data to the second device, so that the multimedia content is played on the second device.
In some possible embodiments, when the first device identifies the operation instruction as a first type of instruction, it first determines whether the foreground application has a screen projection application program interface. If so, that interface is called; if not, the interface of the first device is projected to the second device by screen mirroring.
In some possible embodiments, when the first device identifies the operation instruction as a first type of instruction, it first determines whether the foreground application has a screen projection application program interface. If such an interface exists, it is called to perform the projection. If not, the first device determines whether the current interface includes a screen projection control: if the control exists, the first device simulates a user operation and clicks the control to project the multimedia content to the second device, and if it does not, the interface of the first device is projected to the second device by screen mirroring.
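The fallback order in these designs (application program interface first, then the in-app control, then mirroring) can be summarized in a short sketch; every type and function below is a hypothetical stand-in, not a real API:

```kotlin
// Hypothetical stand-ins used only to show the fallback order of this design.
interface SecondDevice
interface ForegroundApp {
    fun hasCastApi(): Boolean
    fun invokeCastApi(device: SecondDevice)   // app-provided screen projection API
    fun hasCastControlUi(): Boolean
    fun tapCastControl(device: SecondDevice)  // simulated tap on the in-app control
}
fun mirrorScreenTo(device: SecondDevice) { /* e.g. Miracast mirror projection */ }

fun castToSecondDevice(app: ForegroundApp, device: SecondDevice) {
    when {
        app.hasCastApi()       -> app.invokeCastApi(device)  // preferred: call the API
        app.hasCastControlUi() -> app.tapCastControl(device) // else: simulate the user
        else                   -> mirrorScreenTo(device)     // fallback: mirror the screen
    }
}
```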
In some possible embodiments, when the first device uses DLNA screen projection, playback on the second device is not paused even if the first device receives a user operation that moves the projecting application to the background, because DLNA projection only needs to maintain a basic connection channel for transmitting the playing information.
In some possible embodiments, when the first device uses mirror projection and receives a user operation that moves the projecting application to the background, the mirror projection would capture whatever application is then in the foreground (e.g., the desktop or another application), producing content that was never meant to play on the second device. In this case, having determined that the first application is being mirror-projected to the second device, the first device may, when the operation leaving the first application is performed, retain the first application (the video playing application, or a part of it, or a cropped portion of it) in the form of a floating window, keep the floating window's lifecycle in the Resumed stage, and project the content of the floating window to the second device. Alternatively, the first device may retain the video played by the first application in picture-in-picture mode and project the picture-in-picture content to the second device.
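On Android, the floating window / picture-in-picture behavior described in this paragraph might be approximated as below (an editorial sketch assuming Android 8.0+ and android:supportsPictureInPicture="true" in the manifest; the mirroring flag is hypothetical):

```kotlin
import android.app.Activity
import android.app.PictureInPictureParams
import android.util.Rational

// Hedged sketch: keep the casting video activity alive in picture-in-picture
// when the user navigates away, so the mirror projection keeps showing the
// video rather than the launcher.
class PlayerActivity : Activity() {
    var mirroringToSecondDevice = false // hypothetical flag set when mirroring starts

    override fun onUserLeaveHint() {
        if (mirroringToSecondDevice) {
            enterPictureInPictureMode(
                PictureInPictureParams.Builder().setAspectRatio(Rational(16, 9)).build()
            )
        }
    }
}
```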
In some possible embodiments, when the first device uses DLNA screen projection, the first device can receive control commands from the user (e.g., fast forward, rewind, volume up, volume down) and transmit them to the second device through the connection channel between the two devices, thereby controlling the playing of the multimedia content on the second device. At the same time, the second device can also receive the user's control commands (e.g., fast forward, rewind, volume up, volume down) for the multimedia content.
In some possible embodiments, when the first device uses mirror projection and is currently playing the multimedia content, the first device can receive control commands from the user (e.g., fast forward, rewind, volume up, volume down), apply them to the multimedia content, and transmit the result to the second device, thereby changing how the streaming media is displayed on the second device. In this case, the user cannot control the playing of the streaming media on the second device.
In some possible embodiments, when the second device can sense user operations, for example when it has a camera, radar, or the like capable of capturing user gestures, the second device may store a gesture library for playback control in advance; after capturing and recognizing a user gesture, it transmits the recognized control result to the first device, and the first device controls the playing of the streaming media according to that result, thereby changing the display on the second device. In other embodiments, the second device may instead transmit the captured video of the user gesture, or the result of processing that video, to the first device, and the first device executes the recognized command on the streaming media according to its stored playback gesture library, likewise changing the display on the second device.
Optionally, before performing gesture recognition, the second device may determine whether the current screen projection is mirror projection or multimedia content (streaming) projection; if it is multimedia content projection, gesture recognition is performed, and if it is mirror projection, gesture recognition is not performed.
Optionally, before performing gesture recognition, the second device may instead determine through image analysis whether what is currently playing is a continuous media stream; if it is, gesture recognition is performed, and if it is not, gesture recognition is not performed.
The first device acquires sensor data (step 401), and identifies the operation instruction as a second type of instruction from the acquired sensor data (step 405). These steps are described under steps 301 and 302 and are not repeated here.
In some possible embodiments, the device playing the multimedia content is the second device. When the first device identifies the operation instruction as a second type of instruction, it sends a message to the second device requesting the multimedia content. After receiving the message, the second device may directly mirror-project the played content to the first device (step 406). Alternatively, after receiving the message, the second device determines whether the interface currently playing the multimedia content provides a screen projection application program interface: if it does, that interface is called to obtain the service data, which is sent to the first device (step 407), and if it does not, the played content is mirror-projected to the first device.
In some possible embodiments, the second device keeps playing the streaming media after completing the mirror projection or the streaming media projection; in other possible embodiments, after completing the mirror projection, the second device continues playing the streaming media in a floating window or picture-in-picture mode; in other possible embodiments, after completing the streaming media projection, the second device returns to the main interface or plays other multimedia content.
In some possible embodiments, the device playing the multimedia content is the first device. When the first device identifies the operation instruction as a second type of instruction and determines that it is currently mirror-projecting to the second device, it stops the mirror projection (step 408). When the first device identifies the operation instruction as a second type of instruction and determines that the current projection is via service data, it sends a message to the second device to stop playing the multimedia content, and the second device pauses or stops playback after receiving it. When the first device identifies the operation instruction as a second type of instruction and determines that there is a record of projecting to the second device within a certain time period, it sends a stop-playing message to the second device (step 409), and the second device pauses or stops playing the multimedia content after receiving it.
In some possible embodiments, when the first device mirror-projects to the second device, or the second device mirror-projects to the first device, the projecting device determines the current network quality: if the network quality is better than a threshold, mirroring is performed at a first resolution, and if the network quality is below the threshold, at a second resolution, where the first resolution is higher than the second.
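In sketch form, the network-quality rule reads as follows (an editorial illustration; the quality score, threshold, and resolutions are assumptions):

```kotlin
// Mirror at a higher resolution on a good link, otherwise drop to a lower one.
data class Resolution(val width: Int, val height: Int)

fun pickMirrorResolution(linkQuality: Double, threshold: Double = 0.7): Resolution =
    if (linkQuality > threshold) Resolution(1920, 1080) // first (higher) resolution
    else Resolution(1280, 720)                          // second (lower) resolution
```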
In some possible embodiments, the migrated media stream may be a video stream or an audio stream. When the migrated object is audio, the first device need not display the interface of the first application in a floating window or the multimedia content in picture-in-picture form. When the first application has background playback permission, no extra operation is needed; when it does not, the lifecycle of the first application is maintained when it falls back to the background.
In some possible embodiments, the first device and the second device in the above solutions may be used in audio call and video call scenarios. When the parties to the video call are the first device (a mobile phone) and a peer device (the device at the other end of the call), the first device can migrate the video call to the second device (a large screen) after receiving a gesture and recognizing it as a first type of instruction. When the parties to the video call are the second device and the peer device, the first device can migrate the video call to itself after receiving a gesture and recognizing it as a second type of instruction. This scheme gives the first device/second device video call capability even when no audio or video call software is installed on it, and enables migration of the audio/video call between hosts.
In some possible embodiments, when the first device/second device and the peer device use a VoIP call, and the first device receives a gesture parsed as a first type of instruction, the first device may pull the second device into the conference session by calling the second device's VoIP number; the session then includes the first device, the second device, and the peer device. Optionally, the first device exits the conference session, completing the migration from the first device to the second device. In some possible embodiments, the first device remains in the session in a muted state, enabling a quick response if the user wants to switch the call back. In some possible embodiments, the first device remains in the session in a muted state, counts its residence time in the session, and exits when the residence time exceeds a first time threshold; this allows quick switching of the call host while also reducing the power consumption of the first device when the user no longer needs it.
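A schematic of this hand-off (an editorial sketch; VoipSession and its members are hypothetical stand-ins, not a real API):

```kotlin
import kotlinx.coroutines.delay

// Hedged sketch of the VoIP hand-off: invite the second device into the
// conference session, stay muted for a quick switch-back, and leave once the
// residence time exceeds a threshold.
interface VoipSession {
    fun invite(voipNumber: String)
    fun setMuted(muted: Boolean)
    fun leave()
    val userSwitchedBack: Boolean
}

suspend fun migrateCallToSecondDevice(
    session: VoipSession,
    secondDeviceNumber: String,
    dwellTimeoutMs: Long = 60_000 // "first time threshold"; the value is assumed
) {
    session.invite(secondDeviceNumber) // pull the second device into the session
    session.setMuted(true)             // remain in the session, muted
    delay(dwellTimeoutMs)              // count the residence time
    if (!session.userSwitchedBack) session.leave() // reduce power consumption
}
```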
In some possible embodiments, when the first device/second device and the opposite-end device use a VoIP call, the first device, upon receiving a gesture and resolving it as a second type of instruction, optionally sends a call request to the second device, where the call request optionally includes a VoIP number, a UUID, or other fields or information identifying the first device. After receiving the call request, the second device pulls the first device into the session, which then includes the first device, the second device and the opposite-end device; optionally, upon determining that the first device has joined the session, the second device exits it, completing the migration from the second device to the first device.
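An illustrative shape for the call request and the second device's handling of it; the field names and the Session interface are assumptions, not the disclosure's actual API:

    data class CallRequest(
        val voipNumber: String? = null,  // VoIP number identifying the first device
        val uuid: String? = null         // or a UUID / other identifying field
    )

    interface Session {
        fun invite(id: String)
        fun contains(id: String): Boolean
        fun leave()
    }

    // On the second device: pull the first device in, then exit once it has joined.
    fun onCallRequest(request: CallRequest, session: Session) {
        val id = request.voipNumber ?: request.uuid ?: return
        session.invite(id)
        if (session.contains(id)) session.leave()
    }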
In some possible embodiments, when the first device/second device and the opposite-end device use an account-based call, the foregoing method may also be used, which is not described herein again.
In some possible embodiments, the first device optionally projects the video onto the second device by screen projection, while the video input source of the first device remains the picture captured by its own camera.
In some possible embodiments, a video transmission channel is established between the first device and the second device. When a gesture is received and resolved as a first type of instruction, the second device optionally starts its camera and transmits the captured picture to the first device, and the first device transmits the received video stream to the opposite-end device as its own captured picture.
In some possible embodiments, an audio transmission channel is established between the first device and the second device. When a gesture is received and resolved as a first type of instruction, the second device optionally starts its microphone, collects sound, and transmits it to the first device, and the first device transmits the received audio stream to the opposite-end device as its own collected audio. In this way the user gets a smoother migration experience: the video call can be completed without picking up the second device.
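A sketch of the relay described in the two preceding paragraphs; MediaChannel and PeerLink are assumed, illustrative types:

    interface MediaChannel { fun receiveChunk(): ByteArray? }  // second device -> first device
    interface PeerLink { fun send(chunk: ByteArray) }          // first device -> opposite end

    // The first device forwards whatever the second device captured (camera
    // frames or microphone samples) to the opposite-end device as its own input.
    fun relay(channel: MediaChannel, peer: PeerLink) {
        while (true) {
            val chunk = channel.receiveChunk() ?: break
            peer.send(chunk)
        }
    }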
In some possible embodiments, when the first device and the second device both have VoIP call capability or account-based call capability, and a gesture is received and resolved as a first type of instruction, mirror projection is used at first, and once a first time threshold is reached, the call subject is switched in the manner described above.
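A sketch of that staged strategy; the callbacks and the threshold value are assumptions:

    import java.util.Timer
    import kotlin.concurrent.schedule

    fun onFirstTypeInstruction(
        startMirroring: () -> Unit,     // immediate: mirror projection first
        switchCallSubject: () -> Unit,  // later: hand the call over as described above
        firstTimeThresholdMillis: Long = 3_000L
    ) {
        startMirroring()
        Timer().schedule(firstTimeThresholdMillis) { switchCallSubject() }
    }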
In some possible embodiments, when the parties to the call are the second device and the opposite-end device, the call subject can be switched from the second device to the first device when a gesture is received and resolved as a second type of instruction, which is not discussed further here.
It is to be understood that the video call in the above method may also be an audio call, which is not limited in this application.
In some possible embodiments, the solution of the present application may also be used for projecting audio, for example projecting audio played on the first device (music, FM, etc.) onto a sound box. Optionally, this may likewise be triggered by a directional gesture (e.g., a directional gesture on the screen or a directional air gesture).
An electronic device is provided that includes a memory and one or more processors; wherein the memory is configured to store computer program code, the computer program code comprising computer instructions; and the computer instructions, when executed by the processor, cause the electronic device to perform a method of screen projection.
The present application provides a computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform a method of screen projection.
The present application provides a computer program product for causing a computer to perform a method of screen projection when the computer program product is run on the computer.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in this invention may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any change or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A screen projection method, applied to a first device, characterized by comprising the following steps:
acquiring sensor data;
when the sensor data is analyzed as a first type of instruction, performing at least one of the following operations: projecting a display content mirror image to a second device; and sending service data to the second device;
when the sensor data is analyzed as a second type of instruction, performing at least one of the following operations: projecting the display content mirror image to the first device; sending the service data to the first device; and stopping projecting the display content mirror image to the second device and sending a control instruction to the second device.
2. The method of claim 1, wherein the service data comprises at least one of: the name of the multimedia content, the identifier of the multimedia content, the uniform resource locator of the multimedia content, the playing progress of the multimedia content, the playing volume of the multimedia content, and the type of the multimedia content.
3. The method of claim 1 or 2, wherein the sending the service data to the second device comprises:
when the foreground application is a video playing application, calling an application program interface of the video playing application to acquire the service data, and sending the service data to the second device, wherein the second device continues playing the multimedia content according to the service data.
4. The method of claim 1 or 2, wherein the projecting the display content mirror image to the second device comprises:
projecting the display content mirror image to the second device using the Miracast protocol.
5. The method of any one of claims 1 to 4, further comprising, after the projecting the display content mirror image to the second device:
setting the foreground application associated with the display content to be displayed in a floating window or in picture-in-picture mode.
6. The method according to any one of claims 1 to 5, wherein the gesture associated with the first type of instruction is a bottom-up air gesture, a three-finger slide-down, or a four-finger slide-down.
7. The method according to any one of claims 1 to 5, wherein the gesture associated with the second type of instruction is a top-down air gesture, a three-finger swipe, or a four-finger swipe.
8. The method of any one of claims 1 to 7, wherein the projecting the display content mirror image to the second device comprises:
obtaining, by the first device, the position of a screen projection control through image analysis of the current interface, executing a screen projection function built into an application by simulating a user operation, and projecting the display content mirror image to the second device.
9. The method according to any one of claims 1 to 7, wherein the sending the service data to the second device comprises:
obtaining, by the first device, the position of a screen projection control through image analysis of the current interface, executing a screen projection function built into the application by simulating a user operation, and sending the service data to the second device.
10. The method of any one of claims 1 to 9, wherein the first device or the second device stores a database of operation instructions, and wherein the parsing the sensor data comprises:
comparing the sensor data, or a result obtained after processing the sensor data, with the data in the database to determine whether the sensor data corresponds to the first type of instruction or the second type of instruction.
11. The method of any one of claims 1 to 10, wherein the first device is a mobile phone and the second device is a large screen.
12. An electronic device, comprising a memory and one or more processors; wherein the memory is configured to store computer program code comprising computer instructions; and the computer instructions, when executed by the processor, cause the electronic device to perform the screen projection method according to any one of claims 1 to 11 as performed by the first device or the second device.
13. A computer-readable storage medium, comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the screen projection method according to any one of claims 1 to 11 as performed by the first device or the second device.
14. A computer program product which, when run on a computer, causes the computer to perform the screen projection method according to any one of claims 1 to 11 as performed by the first device or the second device.
CN202110584296.6A 2021-02-08 2021-05-27 Screen projection method and electronic equipment Pending CN114915834A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/073202 WO2022166618A1 (en) 2021-02-08 2022-01-21 Screen projection method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110171010 2021-02-08
CN2021101710101 2021-02-08

Publications (1)

Publication Number Publication Date
CN114915834A (en) 2022-08-16

Family

ID=82761423

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110584296.6A Pending CN114915834A (en) 2021-02-08 2021-05-27 Screen projection method and electronic equipment

Country Status (1)

Country Link
CN (1) CN114915834A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090288132A1 (en) * 2008-05-14 2009-11-19 Samsung Electronics Co. Ltd. Method and communication system for controlling appliance device using a mobile device
CN102445985A (en) * 2010-11-26 2012-05-09 深圳市同洲电子股份有限公司 Digital television receiving terminal and mobile terminal interaction method, device and system
CN106502604A (en) * 2016-09-28 2017-03-15 北京小米移动软件有限公司 Throw screen changing method and device
CN107659712A (en) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 A kind of method, apparatus and storage medium for throwing screen
CN108063820A (en) * 2017-12-19 2018-05-22 广州敬信药草园信息科技有限公司 A kind of throwing screen synchronous method of cloud meeting
CN110147199A (en) * 2019-05-23 2019-08-20 北京硬壳科技有限公司 Display on the same screen method, apparatus and touch control display
CN110381195A (en) * 2019-06-05 2019-10-25 华为技术有限公司 A kind of throwing screen display methods and electronic equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116033209A (en) * 2022-08-29 2023-04-28 荣耀终端有限公司 Screen projection method and electronic equipment
CN116033209B (en) * 2022-08-29 2023-10-20 荣耀终端有限公司 Screen projection method and electronic equipment
CN116679895A (en) * 2022-10-26 2023-09-01 荣耀终端有限公司 Collaborative business scheduling method, electronic equipment and collaborative system
CN116679895B (en) * 2022-10-26 2024-06-07 荣耀终端有限公司 Collaborative business scheduling method, electronic equipment and collaborative system

Similar Documents

Publication Publication Date Title
CN109660842B (en) Method for playing multimedia data and electronic equipment
CN111316598B (en) Multi-screen interaction method and equipment
EP4030276B1 (en) Content continuation method and electronic device
EP4319169A1 (en) Screen projection method for electronic device, and electronic device
WO2021052214A1 (en) Hand gesture interaction method and apparatus, and terminal device
CN112394895B (en) Picture cross-device display method and device and electronic device
CN113923230B (en) Data synchronization method, electronic device, and computer-readable storage medium
CN111061445A (en) Screen projection method and computing equipment
WO2020173370A1 (en) Method for moving application icons, and electronic device
CN112398855B (en) Method and device for transferring application contents across devices and electronic device
WO2022166618A1 (en) Screen projection method and electronic device
JP7408784B2 (en) Callback stream processing methods and devices
CN111726678B (en) Method for continuously playing multimedia content between devices
WO2022100610A1 (en) Screen projection method and apparatus, and electronic device and computer-readable storage medium
CN114173193A (en) Multimedia stream playing method and electronic equipment
CN115756268A (en) Cross-device interaction method and device, screen projection system and terminal
CN113593567B (en) Method for converting video and sound into text and related equipment
CN113946302B (en) Method and device for opening file
CN114915834A (en) Screen projection method and electronic equipment
CN115883893A (en) Cross-device flow control method and device for large-screen service
CN115185441A (en) Control method, control device, electronic equipment and readable storage medium
CN114827098A (en) Method and device for close shooting, electronic equipment and readable storage medium
WO2023093778A1 (en) Screenshot capture method and related apparatus
CN114584817A (en) Screen projection method and system
CN117729420A (en) Continuous shooting method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20220816)