WO2022166618A1 - Screen projection method and electronic device - Google Patents


Info

Publication number
WO2022166618A1
WO2022166618A1 PCT/CN2022/073202
Authority
WO
WIPO (PCT)
Prior art keywords
instruction
screen projection
application
multimedia content
type
Prior art date
Application number
PCT/CN2022/073202
Other languages
English (en)
Chinese (zh)
Inventor
陈兰昊
徐世坤
于飞
孟庆吉
杜奕全
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Priority claimed from CN202110584296.6A (external priority: CN114915834A)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2022166618A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content

Definitions

  • the embodiments of the present application relate to the field of electronic technologies, and in particular, to a screen projection method and electronic device.
  • when the user has multiple devices, the screen content displayed by one device can be projected to the screen of another device; for example, the multimedia content or game interface played by a small-screen device (for example, a mobile phone) is projected to a large-screen device (for example, a computer or a smart TV) for playback, using the display screen and speakers of the large-screen device to provide the user with a better experience.
  • the screen projection operation is a complex operation, including multiple screen projection scenarios, requiring the user to perform multiple steps, which may bring the user an inconvenient operation experience.
  • the embodiments of the present application provide a screen projection method and an electronic device, when a user needs to perform a screen projection, a convenient screen projection operation is provided for the user, and the user experience is improved.
  • the first device or the second device obtains sensor data; when the first device or the second device parses the sensor data as the first type of instruction, at least one of the following operations is performed: the first device mirrors its display content to the second device, and the first device sends service data to the second device; when the sensor data is parsed as the second type of instruction, at least one of the following operations is performed: the display content is mirrored and projected to the first device; the service data is sent to the first device; the first device stops mirroring the display content to the second device; and the first device sends a control instruction to the second device.
  • the service data includes at least one of the following items: the name of the multimedia content, the identifier of the multimedia content, the uniform resource locator of the multimedia content, the playback progress of the multimedia content, the playback volume of the multimedia content, and the type of the multimedia content. In this way, screen projection of the multimedia content can be realized, that is, the second device can play the multimedia content according to the above information.
  • sending the service data by the first device to the second device includes: when the current application of the first device is a video playback application, calling an application program interface of the video playback application to obtain the service data, and sending the service data to the second device, so that the second device continues to play the multimedia content according to the service data.
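The service data items above can be sketched as a simple structure. This is an illustrative sketch only; the names (`ServiceData`, `to_payload`, the field names) are assumptions for the example and are not defined in the patent:

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class ServiceData:
    # Fields mirror the items listed in the claim; all are optional
    # because the claim requires only "at least one of" them.
    name: Optional[str] = None          # name of the multimedia content
    content_id: Optional[str] = None    # identifier of the multimedia content
    url: Optional[str] = None           # uniform resource locator
    progress_s: Optional[float] = None  # playback progress, in seconds
    volume: Optional[int] = None        # playback volume
    content_type: Optional[str] = None  # type of the multimedia content

def to_payload(sd: ServiceData) -> dict:
    """Keep only the populated fields for transmission to the second device."""
    return {k: v for k, v in asdict(sd).items() if v is not None}

payload = to_payload(ServiceData(name="Running",
                                 url="https://example.com/running.m3u8",
                                 progress_s=754.2))
```

With such a payload, the second device has enough information to fetch the content and resume playback at the recorded position.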
  • the first device mirrors and projects the content to the second device, and further includes: mirroring and projecting the display content to the second device using the Miracast protocol.
  • Miracast is a protocol for mirroring screens.
  • the method further includes: the first device sets the foreground application associated with the display content as a floating window or a picture-in-picture mode for display.
  • the associated gesture of the first type of instruction is a bottom-to-top air gesture, a three-finger swipe up, or a four-finger swipe up.
  • the associated gesture of the second type of instruction is a top-to-bottom air gesture, a three-finger swipe down, or a four-finger swipe down.
  • the first device mirroring its display content to the second device includes: the first device analyzes the current interface through image analysis, obtains the position of the mirroring control, and simulates a user operation to execute the application's built-in screen projection function, thereby mirroring the display content to the second device. In this way, even when the application does not provide an application program interface for screen projection, the screen projection function built into the application can still be used to realize screen projection.
  • the first device sending the service data to the second device includes: the first device analyzes the current interface through image analysis, obtains the position of the screen projection control, executes the application's built-in screen projection function by simulating a user operation, and sends the service data to the second device.
  • in this way, even when the application does not provide an application program interface for screen projection, the screen projection function built into the application can still be used to realize screen projection.
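The locate-and-simulate approach described above can be sketched as follows. Everything here is hypothetical: `find_control` stands in for real image analysis (a production system would use a computer-vision library), and `inject_tap` stands in for a platform-specific input-injection service.

```python
def find_control(screenshot, template):
    """Return the (x, y) centre of the best template match, or None.
    Both arguments are 2-D lists of pixel values; this brute-force
    matcher is only a stand-in for real image analysis."""
    th, tw = len(template), len(template[0])
    best, best_score = None, -1
    for y in range(len(screenshot) - th + 1):
        for x in range(len(screenshot[0]) - tw + 1):
            score = sum(
                1
                for dy in range(th)
                for dx in range(tw)
                if screenshot[y + dy][x + dx] == template[dy][dx]
            )
            if score > best_score:
                best_score, best = score, (x + tw // 2, y + th // 2)
    return best

taps = []

def inject_tap(x, y):
    """Placeholder for the OS call that simulates a user tap."""
    taps.append((x, y))

# A 6x6 "screenshot" in which the cast icon occupies two pixels.
screen = [[0] * 6 for _ in range(6)]
screen[2][3] = screen[2][4] = 1
pos = find_control(screen, [[1, 1]])
if pos:
    inject_tap(*pos)  # tap the built-in projection control
```

The design point is that the device never needs a screen-projection API from the application: it only needs to see the interface and reproduce the tap a user would make.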
  • the first device or the second device stores a database of operation instructions, and parsing the sensor data includes: comparing the sensor data, or a result obtained after processing the sensor data, with the data in the database, so as to determine whether the sensor data corresponds to the first type of instruction or the second type of instruction.
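The database lookup can be sketched as a simple mapping from a processed sensor feature to an instruction type; the feature strings and type names below are illustrative, not from the patent:

```python
from typing import Optional

INSTRUCTION_DB = {
    # processed sensor feature -> instruction type (names are illustrative)
    "swipe_up_3finger": "TYPE_1",        # project to the second device
    "air_palm_bottom_to_top": "TYPE_1",
    "swipe_down_3finger": "TYPE_2",      # bring playback back to the first device
    "air_palm_top_to_bottom": "TYPE_2",
}

def parse_sensor_data(feature: str) -> Optional[str]:
    """Compare the processed sensor result with the stored entries;
    return 'TYPE_1', 'TYPE_2', or None when nothing matches."""
    return INSTRUCTION_DB.get(feature)
```

A real implementation would compare feature vectors with a tolerance rather than exact strings, but the control flow is the same.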
  • for example, the first device is a mobile phone, and the second device is a large screen.
  • the present application provides an electronic device including a memory and one or more processors; the memory is used to store computer program code, and the computer program code includes computer instructions; when the computer instructions are executed by the processor, the electronic device is caused to perform the screen projection method.
  • the present application provides a computer-readable storage medium, including computer instructions, which, when run on an electronic device, cause the electronic device to perform the screen projection method.
  • the present application provides a computer program product, which, when run on a computer, causes the computer to execute the screen projection method.
  • the solution proposed in this application can take into account a variety of screen projection scenarios, shield the user from the underlying implementation, and unify the implementation logic, so as to provide users with a better and more convenient screen projection experience.
  • FIG. 1 shows a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application
  • FIGS. 2A-2B show schematic diagrams of usage scenarios provided by embodiments of the present application.
  • FIG. 3 shows a flowchart of a method provided by an embodiment of the present application
  • FIG. 4 shows a flowchart of a method provided by an embodiment of the present application.
  • first and second are only used for descriptive purposes, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as “first” or “second” may expressly or implicitly include one or more of that feature.
  • plural means two or more.
  • the embodiments of the present application provide a method and device for screen projection, which can be applied to mobile phones, tablet computers, wearable devices (for example, watches, wristbands, helmets, headphones, etc.), in-vehicle devices, and augmented reality (augmented reality, AR)/virtual reality (VR) devices, laptops, ultra-mobile personal computers (UMPCs), netbooks, personal digital assistants (PDAs), smart home devices (eg, smart TVs, smart speakers, smart cameras, etc.) and other electronic devices.
  • FIG. 1 shows a schematic diagram of the hardware structure of the electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 . The controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed into a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave and radiate it out through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify the signal, and then convert it into an electromagnetic wave for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the display screen 194 is used to display a display interface of an application, such as a viewfinder interface of a camera application.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • electronic device 100 may include one or more display screens 194 .
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted through the lens to the camera photosensitive element, which converts the light signal into an electrical signal and transmits it to the ISP for processing, converting it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • an optical image of the object is generated through the lens and projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 100 may include one or more cameras 193 .
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area. Wherein, the storage program area can store the operating system, and the software code of at least one application (for example, Huawei video application, wallet, etc.).
  • the storage data area may store data generated during the use of the electronic device 100 (eg, captured images, recorded videos, etc.) and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, such as saving pictures, videos, and other files in the external memory card.
  • the electronic device 100 may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the keys 190 include a power-on key, a volume key, and the like; the keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback. For example, touch operations acting on different applications (such as taking pictures, playing audio, etc.) can correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate a charging state, a change in power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card. The SIM card can be connected to and separated from the electronic device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the components shown in FIG. 1 do not constitute a specific limitation on the electronic device 100; the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or use a different arrangement of the components.
  • the combination/connection relationship between the components in FIG. 1 can also be adjusted and modified.
  • the user can display the content displayed on the first device on the second device, and use hardware such as a display screen and a speaker of the second device to obtain a better content playback experience.
  • the content displayed on the first device may be multimedia content (eg, pictures, videos, audio, etc.), and for example, the content displayed on the first device may be games, application interfaces, and the like.
  • the first device and the second device may each be an electronic device 100 of the type described above; for example, the first device is a mobile phone or a tablet computer, and the second device is a smart TV, a personal computer, a large-screen display (referred to as a large screen), or the like.
  • Screencasting means that the content on the first device can be displayed on the second device through a certain protocol.
  • the first device optionally projects the multimedia content (such as pictures, videos, audios, etc.) to the second device for display.
  • common protocols include the DLNA (Digital Living Network Alliance) protocol and the Chromecast protocol. As shown in FIG. 2A, which includes a mobile phone 201 and a smart TV 202, the mobile phone 201 sends the identifier (such as a link) of the multimedia content "Running" to be played and its playback information (such as the playback progress, playback speed, and playback volume) to the smart TV 202; the smart TV 202 obtains the content related to "Running" from the network or a database according to the identifier and the playback information, and plays the multimedia content "Running", thereby realizing the effect of transferring playback of multimedia content from one device to another.
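The handoff in FIG. 2A can be sketched as follows; the message shape and function names are illustrative and do not reflect the actual DLNA or Chromecast wire formats:

```python
def make_cast_message(link, progress_s, speed, volume):
    """The mobile phone packages the identifier (link) and playback
    state of the content it is currently playing."""
    return {"link": link, "progress_s": progress_s,
            "speed": speed, "volume": volume}

def resume_on_tv(msg, fetch):
    """The smart TV fetches the content by its link and resumes playback
    with the received state. `fetch` stands in for a network/database lookup."""
    content = fetch(msg["link"])
    return {"content": content, "position": msg["progress_s"],
            "speed": msg["speed"], "volume": msg["volume"]}

msg = make_cast_message("https://example.com/running", 1325.0, 1.0, 40)
state = resume_on_tv(msg, fetch=lambda link: f"stream for {link}")
```

Because only the identifier and playback state travel between the devices, the TV streams the content itself, which is what distinguishes this mode from mirroring.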
  • the user can control the playback progress, playback speed, and playback sound of "Running" on the second device.
  • the first device optionally displays the content displayed by the first device on the second device in a mirror image, for example, encodes the content displayed by the first device and transmits it to the second device through a video stream.
  • the second device decodes and plays the video stream; common protocols include the Miracast protocol and the AirPlay protocol. As shown in FIG. 2B, the mobile phone 203 mirrors the content displayed on the mobile phone (the user interface of a video application) to the smart TV 204; that is, the content displayed on the smart TV 204 is the same as the content displayed on the mobile phone 203, and when the content displayed on the mobile phone 203 changes, the content displayed on the smart TV 204 changes accordingly.
  • common user interactions with the electronic device 100 include voice interaction, touch screen gestures, and air gestures.
  • a touch screen gesture refers to a gesture generated by the user touching the display screen of the electronic device 100.
  • common touch screen gestures may include single-finger operations, such as tap, long press, pan, and double-click; they may also include multi-finger operations, such as pinch, three-finger swipe, and rotate.
  • an air gesture refers to a gesture performed at a certain distance from the display screen of the electronic device 100. A sensor of the electronic device 100 (such as a camera or a distance sensor) captures the shape of the user's hand and compares it with the preset gestures in a database, and the corresponding operation is then performed according to the matched preset gesture. It can be understood that a gesture is only one way of triggering a function, and the present application does not limit which gesture is used.
  • for example, when the user is watching a video on a smart TV in the living room and needs to go to the balcony to do something, the user may want the video playback not to be terminated, that is, to transfer the video played on the large screen to the mobile phone to continue watching.
  • similarly, after the user finishes his work and returns to the living room, he may also hope to transfer the video played on the mobile phone back to the large screen to continue watching. Therefore, there is a need for a method that can conveniently switch the played video between the first device and the second device, so as to meet the user's requirements for video playback at different times and in different scenarios.
  • the first device and the second device are capable of data interaction.
  • the first device and the second device are optionally in the same local area network and exchange data through the local area network; the first device and the second device optionally use a point-to-point connection to exchange data directly through a P2P channel; or the two devices optionally use data traffic to exchange data through a wide area network.
  • This application does not limit the manner of data interaction between the first device and the second device.
  • the screencasting of multimedia content requires the devices to perform mutual authentication, so that screencasting is performed in a trusted environment; for example, the first device and the second device log in to the same account, such as a Huawei account, an Apple ID, or a Samsung account.
  • the projection of the multimedia content requires that the applications on the two devices log in to the same account; for example, the first device and the second device have the same application installed (or different versions of the same application), and both devices are logged in on that application, so that the second device can obtain the specified multimedia content from the database according to the received identifier (e.g., a link) of the multimedia content.
  • the implementation includes step 301, step 302, and step 303. It can be understood that step 301, step 302, and step 303 are all optional steps, and their execution order can be adjusted.
  • the first device acquires sensor data (step 301).
  • when the gesture is a touch screen gesture, the first device acquires the touch screen data reported by the display driver corresponding to the display screen 194.
  • when the gesture is an air gesture, the first device optionally acquires image data through a camera, or optionally acquires millimeter-wave data through a radar sensor.
  • when the instruction is a voice instruction (for example, a spoken screen projection command), the first device acquires the data collected by the microphone 170C.
  • the first device acquires an operation instruction according to the acquired sensor data (step 302).
  • the first device stores a database of sensor data and operation instructions, compares the result obtained by parsing the sensor data with the data in the operation instruction database, and obtains the operation instruction. For example, when the first device detects touch data on the display screen 194 twice in a short period of time, the first device determines, according to the database of operation instructions, that the operation is a double-click operation. For another example, when the captured images show that the user's hand moves from a first position close to the upper edge of the image to a second position close to the lower edge of the image, the first device determines, according to the database of operation instructions, that the operation is a bottom-up gesture.
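The double-click example can be sketched as a small classifier over touch-down timestamps; the 300 ms window is an assumed threshold, not a value from the patent:

```python
# Two touch-down events within the window are classified as one double-click;
# isolated events are classified as single taps.
DOUBLE_CLICK_WINDOW_S = 0.300  # assumed threshold

def classify_taps(timestamps):
    """Given touch-down timestamps (seconds, ascending), return the
    detected operations in order."""
    ops, i = [], 0
    while i < len(timestamps):
        if (i + 1 < len(timestamps)
                and timestamps[i + 1] - timestamps[i] <= DOUBLE_CLICK_WINDOW_S):
            ops.append("double_click")
            i += 2  # consume both events of the pair
        else:
            ops.append("tap")
            i += 1
    return ops
```

The bottom-up air-gesture case works the same way, except the compared feature is hand position across camera frames rather than event timing.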
  • the first type of instructions is used to project multimedia content to other devices for playback, for example, to project the multimedia content played by a first device to a second device. Play on the device; wherein the second type of instruction is used to play the multimedia content played on other devices on the first device.
  • the first type of instruction includes, but is not limited to, voice commands, touch screen gestures and air gestures that instruct multimedia content to be played on the second device; the second type of instruction includes, but is not limited to, voice commands, touch screen gestures and air gestures that instruct multimedia content to be played on the first device.
  • the first type of instruction and the second type of instruction may each be associated with a plurality of voice commands, touch screen gestures and air gestures.
  • when any one of the associated voice commands, touch screen gestures or air gestures is captured, the corresponding action can be performed.
  • the gesture associated with the first type of instruction can be an air gesture of opening the palm and moving it from bottom to top, or a gesture of touching the touch screen of the mobile phone and sliding up from the bottom of the screen; it can also be a multi-finger gesture that differs from the gestures preset by the system, such as a three-finger or four-finger swipe up, so as to prevent conflicts.
  • the gesture associated with the second type of instruction can be an air gesture of opening the palm and moving it from top to bottom, or a gesture of touching the touch screen of the mobile phone and sliding down from the top of the screen; it can also be a multi-finger gesture that differs from the gestures preset by the system, such as a three-finger or four-finger swipe down.
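The association of several triggers with each instruction type amounts to a lookup table, sketched below (Python; the gesture labels and instruction names are hypothetical, chosen only to mirror the examples above):

```python
# Hypothetical mapping from recognized triggers (voice commands, touch screen
# gestures, air gestures) to the two instruction types described above.
FIRST_TYPE = "first_type"    # project content to the second device
SECOND_TYPE = "second_type"  # play content back on the first device

INSTRUCTION_MAP = {
    "air_palm_bottom_to_top": FIRST_TYPE,
    "swipe_up_from_bottom": FIRST_TYPE,
    "three_finger_swipe_up": FIRST_TYPE,
    "air_palm_top_to_bottom": SECOND_TYPE,
    "swipe_down_from_top": SECOND_TYPE,
    "three_finger_swipe_down": SECOND_TYPE,
}

def to_instruction(trigger):
    """Return the instruction type for a trigger, or None if unrecognized."""
    return INSTRUCTION_MAP.get(trigger)
```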
  • when the obtained operation instruction is the first type of instruction, the first device plays the multimedia content on the second device (step 303).
  • when the obtained operation instruction is the second type of instruction, the first device plays the multimedia content on the first device (step 304).
  • FIG. 4 is an embodiment of the present application that introduces how the first device processes multimedia content and screen mirroring after recognizing the first type of instruction and the second type of instruction.
  • the first device acquires sensor data (step 401), and the first device identifies the operation instruction as the first type of instruction according to the acquired sensor data (step 402).
  • the above steps have been described in step 301 and step 302, and will not be described in detail here.
  • the first device mirrors the content to the second device (step 403). After the first device recognizes the first type of instruction, it mirror-projects the displayed content to the second device, for example through the Miracast protocol.
  • the first device sends the service data to the second device (step 404), and after parsing the service data, the second device plays the multimedia content.
  • in some implementations, screen projection of content is an operation within an application program: the application provides an application program interface (API) for screen projection, and after recognizing the first type of instruction, the first device calls the API provided by the application to realize screen projection.
  • when the first device identifies the current foreground application as a video playback application and the application has a screen projection API, the first device calls the application's screen projection API, and the application collects business data and returns it to the first device.
  • the first device sends service data to a second device, and the second device implements continuous playback of multimedia content according to the received service data.
  • the service data includes one or more of the following: the name of the multimedia content, such as "Running" shown in FIG. 2A; the identifier of the multimedia content, for example, the ID corresponding to "Running" in a video playback application is 12001; the uniform resource locator of the multimedia content, for example, the uniform resource locator corresponding to "Running" is www.video.com/12001; the playback progress of the multimedia content; the playback volume of the multimedia content; and the type of the multimedia content, for example, the type corresponding to "Running" is video, and the type corresponding to music is audio.
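The service data enumerated above can be modeled as a simple record in which every field is optional (a Python sketch; the field names are assumptions, while the example values come from the description):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical container for the service data fields listed above; since the
# data includes "one or more" of the options, every field defaults to None.
@dataclass
class ServiceData:
    name: Optional[str] = None        # e.g. "Running"
    content_id: Optional[int] = None  # e.g. 12001 in the video playback app
    url: Optional[str] = None         # e.g. "www.video.com/12001"
    progress: Optional[float] = None  # playback progress
    volume: Optional[int] = None      # playback volume
    media_type: Optional[str] = None  # "video" or "audio"

sd = ServiceData(name="Running", content_id=12001,
                 url="www.video.com/12001", media_type="video")
```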
  • when the first device recognizes that the operation command is the first type of command, it analyzes the controls of the current interface through image analysis, triggers the application's built-in screen projection function by simulating user operations, and sends the service data to the second device, so as to realize playback of the multimedia content on the second device.
  • when the first device recognizes that the operation instruction is the first type of instruction, it first determines whether the foreground application has a screen projection API; if so, it calls the screen projection API, and if not, it projects the interface of the first device to the second device by screen mirroring.
  • when the first device recognizes that the operation instruction is the first type of instruction, it first determines whether the foreground application has a screen projection API. If so, it calls the API to project the screen. If not, it determines whether the current interface includes a screen projection control; if such a control exists, it simulates the user's operation of tapping the control to project the multimedia content to the second device. If there is no screen projection control, the interface of the first device is mirrored to the second device.
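The three-level fallback just described reduces to a short decision chain (Python sketch; the capability flags and result labels are hypothetical):

```python
# Hypothetical sketch of the fallback order described above: prefer the
# application's screen projection API, then a simulated tap on an in-app
# projection control, and finally full screen mirroring.
def choose_projection_method(has_projection_api, has_projection_control):
    if has_projection_api:
        return "call_projection_api"       # step 1: application API
    if has_projection_control:
        return "simulate_tap_on_control"   # step 2: simulated user operation
    return "mirror_screen"                 # step 3: screen mirroring fallback
```

In practice, a device would populate the two flags by inspecting the foreground application and, via image analysis, the controls on its current interface.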
  • when the first device uses DLNA screen projection, even if the first device receives a user operation that sends the projecting application to the background, playback on the second device does not pause, because DLNA screen projection only needs to maintain a basic connection channel for transmitting playback information.
  • when the first device uses mirror projection and receives a user operation that sends the projecting application to the background, the mirror projection will capture whatever application is in the foreground at that time (such as a desktop application or another application), resulting in content being played on the second device that was not originally expected.
  • if the first device determines that the first application is mirror-projecting to the second device, then when the first application is returned to the background, the first device optionally retains the first application (the video playback application, a part of it, or a part obtained by cropping it) in the form of a floating window, keeps the life cycle of the floating window in the Resumed state, and projects the content of the floating window to the second device; alternatively, the first device retains the video being played by the first application in the form of picture-in-picture and projects the picture-in-picture content to the second device.
  • when the first device uses DLNA screen projection, the first device can receive user control commands (for example, fast-forward, rewind, volume up, volume down) and transmit them over the connection channel between the first device and the second device, so as to control the playback of the multimedia content on the second device.
  • the second device may also accept the user's control commands on the multimedia content (for example, fast-forward, rewind, volume up, volume down).
  • when the first device uses mirror projection and is currently playing multimedia content, the first device can receive user control commands (for example, fast-forward, rewind, volume up, volume down), apply the control command to the multimedia content, and at the same time transmit the execution result to the second device, thereby changing the display of the streaming media on the second device. Meanwhile, the user cannot control playback of the streaming media on the second device.
  • when the second device can sense user operations, for example when it has a camera, a radar or the like that can capture user gestures, the second device optionally pre-stores a gesture library related to playback control; after acquiring and recognizing a user gesture, it transmits the recognized control result to the first device, and the first device controls playback of the streaming media according to that result, thereby changing the display of the streaming media on the second device.
  • the second device optionally transmits the captured video of the user gesture, or the processing result of that video, to the first device; the first device then executes the identified command on the streaming media according to its stored playback-related gesture library, thereby changing the display of the streaming media on the second device.
  • the second device determines whether multimedia content projection or mirror projection is currently in use; if it is multimedia content projection, gesture recognition is performed, and if it is mirror projection, gesture recognition is not performed.
  • alternatively, the second device determines whether multimedia content projection or mirror projection is currently in use; if it is multimedia content projection, gesture recognition is performed, and if it is mirror projection, image analysis is used to judge whether what is currently playing is a continuous media stream: if so, gesture recognition is performed, and if not, it is not.
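The second variant above can be sketched as a two-level check (Python; the predicate and label names are assumptions):

```python
# Hypothetical sketch of the second device's decision: gesture recognition
# always runs for multimedia-content projection; for mirror projection it runs
# only when image analysis judges the picture to be a continuous media stream.
def should_recognize_gestures(projection_kind, is_continuous_media_stream=False):
    if projection_kind == "content":
        return True
    if projection_kind == "mirror":
        return is_continuous_media_stream
    return False
```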
  • the first device acquires sensor data (step 401), and the first device identifies the operation instruction as a second type of instruction according to the acquired sensor data (step 405).
  • the above steps have been described in step 301 and step 302, and will not be described in detail here.
  • the playback device of the multimedia content is the second device.
  • when the first device recognizes that the operation instruction is the second type of instruction, it sends a message for acquiring the multimedia content to the second device.
  • the second device optionally directly mirrors and projects the played content to the first device (step 406).
  • the second device optionally determines whether the interface currently playing the multimedia content provides a screen projection API; if so, it calls the API to obtain service data and sends the service data to the first device (step 407); if not, it mirror-projects the played content to the first device.
  • in some possible implementations, the second device keeps playing the streaming media after completing the mirror projection or streaming media projection; in other possible implementations, after completing the mirror projection, the second device continues to play the streaming media in the form of a floating window or picture-in-picture; in still other possible implementations, the second device returns to the main interface or plays other multimedia content after completing the streaming media projection.
  • the playback device of the multimedia content is the first device.
  • the first device recognizes that the operation instruction is the second type of instruction, and determines that the first device is currently mirror-projecting to the second device; the first device then stops mirror-projecting to the second device (step 408).
  • when the first device recognizes that the operation command is the second type of command and judges that it is currently projecting through service data, it sends a message to the second device to terminate playback of the multimedia content; after receiving it, the second device pauses or stops playing the multimedia content.
  • when the first device recognizes that the operation instruction is the second type of instruction and determines that there is a record of projecting to the second device within a certain period of time, it sends a message to the second device to terminate playback (step 409); after receiving it, the second device pauses or stops playing the multimedia content.
  • the first device or the second device determines the current network quality; if the network quality is better than a threshold, mirror projection is performed at the first resolution, and if the network quality is below the threshold, projection is performed at the second resolution, where the first resolution is higher than the second resolution.
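The threshold rule above is a one-line policy (Python sketch; the quality metric, the threshold, and the concrete resolutions are illustrative assumptions):

```python
# Hypothetical resolution selection for mirror projection based on measured
# network quality, as described above.
FIRST_RESOLUTION = (1920, 1080)  # assumed value for the higher first resolution
SECOND_RESOLUTION = (1280, 720)  # assumed value for the lower second resolution

def pick_mirror_resolution(network_quality, threshold=0.5):
    """Use the first (higher) resolution only when quality beats the threshold."""
    return FIRST_RESOLUTION if network_quality > threshold else SECOND_RESOLUTION
```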
  • the above-mentioned media stream can also be an audio stream.
  • the first device does not need to display the interface of the first application in the form of a floating window, nor display the multimedia content in the form of picture-in-picture.
  • when the first application has permission to play in the background, no additional operations are required; when the first application does not have permission to play in the background, the life cycle of the first application is maintained when it returns to the background.
  • the first device and the second device in the above solution may be used in scenarios of audio calls and video calls.
  • when the main devices of the video call are the first device and the peer device, the first device can migrate the video call to the second device (a large screen) after receiving the gesture and recognizing it as the first type of instruction;
  • when the main devices of the video call are the second device and the peer device, the first device can transfer the video call to the first device after receiving the gesture and recognizing it as the second type of instruction.
  • the above solution enables the first device/second device to have video call capability even when no audio call software/video call software is installed on it, and enables migration of the subject of the audio call/video call.
  • when the first device/second device and the peer device are in a VoIP call and the first device receives a gesture parsed as the first type of instruction, the first device can optionally call the VoIP number of the second device to pull the second device into a conference call (session).
  • the call session includes the first device, the second device and the peer device.
  • the first device then exits the session, implementing the transfer of the call from the first device to the second device.
  • alternatively, the first device remains in the session in a muted state, so that it can respond quickly when the user needs to switch the subject of the call again.
  • alternatively, the first device remains in the session in a muted state and counts the time it resides in the session; when the retention time exceeds a first time threshold, the first device exits the session. This both allows the user to quickly switch the subject of the call and reduces the power consumption of the first device when the user no longer needs it.
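The muted-retention policy above can be sketched as a small state object (Python; the class and method names are hypothetical):

```python
# Hypothetical sketch of the policy described above: after handing the call to
# the second device, the first device stays in the session muted and leaves
# once its residence time exceeds the first time threshold.
class MutedResident:
    def __init__(self, first_time_threshold_s):
        self.threshold_s = first_time_threshold_s
        self.resident_s = 0.0
        self.in_session = True  # muted but still present, for fast switch-back

    def tick(self, elapsed_s):
        """Advance the residence timer; exit the session past the threshold."""
        self.resident_s += elapsed_s
        if self.in_session and self.resident_s > self.threshold_s:
            self.in_session = False  # leave to save power
        return self.in_session
```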
  • when the first device/second device and the peer device are in a VoIP call and the first device receives a gesture parsed as the second type of instruction, the first device can optionally send a call request to the second device, where the call request optionally includes fields or information identifying the first device, such as a VoIP number, a UUID, and the like.
  • the second device pulls the first device into a conference call (session).
  • the call session includes the first device, the second device and the peer device.
  • the second device then exits the session, implementing the transfer from the second device to the first device.
  • when the first device/second device and the peer device use an account-based call, the above method may also be used, which is not repeated in this application.
  • the first device optionally projects the video to the second device by way of screen projection.
  • the video input source of the first device is still the image captured by the camera of the first device, so a close-range portrait of the user can still be captured.
  • a video transmission channel is established between the first device and the second device; when the gesture is received and parsed as the first type of instruction, the second device can optionally turn on its camera and transmit the captured image to the first device, and the first device transmits the received video stream to the peer device as the captured image.
  • an audio transmission channel is established between the first device and the second device; when the gesture is received and parsed as the first type of instruction, the second device can optionally turn on its microphone to collect sound and transmit the captured audio to the first device, and the first device transmits the received audio stream to the peer device as the collected audio. In this way, the user enjoys a better migration experience: the video call can be completed without picking up the second device.
  • optionally, screen mirroring is used first, and when the first time threshold is reached, the above method of switching the subject of the call is adopted.
  • when the subject of the call is the second device and the peer device, and the gesture is received and parsed as the second type of instruction, the subject of the call can be switched from the second device to the first device in a similar manner, which is not discussed further.
  • the video call in the above method may also be an audio call, which is not limited in this application.
  • the solution of the present application can also be used for audio projection, for example, projecting the audio (music, FM, etc.) played on the first device to a speaker. It can also optionally be implemented with directional gestures (for example, directional gestures on the screen or directional air gestures).
  • the present application provides an electronic device, including a memory and one or more processors, wherein the memory is used to store computer program code, and the computer program code includes computer instructions; when the computer instructions are executed by the processor, the electronic device is caused to perform the screen projection method.
  • the present application provides a computer-readable storage medium, including computer instructions, which, when executed on an electronic device, cause the electronic device to perform the screen projection method.
  • the present application provides a computer program product which, when run on a computer, enables the computer to execute the screen projection method.
  • the functions described in the present invention may be implemented in hardware, software, firmware, or any combination thereof.
  • the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
  • Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage medium can be any available medium that can be accessed by a general purpose or special purpose computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to embodiments, the present application relates to a screen projection method and an electronic device. The method includes: when a user needs to perform screen projection, obtaining sensor data; when the sensor data is identified as an instruction of a first type, mirror-projecting content to a second device, and/or sending service data to the second device; when the sensor data is identified as an instruction of a second type, mirror-projecting content to a first device, and/or sending service data to the first device, and/or stopping the mirror projection of content to the second device, and/or sending an instruction to the second device. The solution proposed in the present application is applicable to various screen projection scenarios with unified implementation logic, providing the user with an improved and more convenient screen projection experience.
PCT/CN2022/073202 2021-02-08 2022-01-21 Procédé de projection d'écran et dispositif électronique WO2022166618A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202110171010 2021-02-08
CN202110171010.1 2021-02-08
CN202110584296.6A CN114915834A (zh) 2021-02-08 2021-05-27 一种投屏的方法和电子设备
CN202110584296.6 2021-05-27

Publications (1)

Publication Number Publication Date
WO2022166618A1 true WO2022166618A1 (fr) 2022-08-11

Family

ID=82741936

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/073202 WO2022166618A1 (fr) 2021-02-08 2022-01-21 Procédé de projection d'écran et dispositif électronique

Country Status (1)

Country Link
WO (1) WO2022166618A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115484484A (zh) * 2022-08-30 2022-12-16 深圳市思为软件技术有限公司 一种智能设备投屏控制方法、装置、电子设备及存储介质
CN115802083A (zh) * 2022-11-22 2023-03-14 深圳创维-Rgb电子有限公司 控制方法、装置、分体电视及可读存储介质
CN117119615A (zh) * 2023-10-25 2023-11-24 吉林藤兴科技有限公司 一种智能移动终端在车辆中控大屏的投屏方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102445985A (zh) * 2010-11-26 2012-05-09 深圳市同洲电子股份有限公司 数字电视接收终端与移动终端交互方法、装置和系统
US20170085960A1 (en) * 2014-05-30 2017-03-23 Tencent Technology (Shenzhen) Company Limited Video-based interaction method, terminal, server and system
CN107659712A (zh) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 一种投屏的方法、装置及存储介质
CN111327769A (zh) * 2020-02-25 2020-06-23 北京小米移动软件有限公司 多屏互动方法及装置、存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102445985A (zh) * 2010-11-26 2012-05-09 深圳市同洲电子股份有限公司 数字电视接收终端与移动终端交互方法、装置和系统
US20170085960A1 (en) * 2014-05-30 2017-03-23 Tencent Technology (Shenzhen) Company Limited Video-based interaction method, terminal, server and system
CN107659712A (zh) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 一种投屏的方法、装置及存储介质
CN111327769A (zh) * 2020-02-25 2020-06-23 北京小米移动软件有限公司 多屏互动方法及装置、存储介质

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115484484A (zh) * 2022-08-30 2022-12-16 深圳市思为软件技术有限公司 一种智能设备投屏控制方法、装置、电子设备及存储介质
CN115484484B (zh) * 2022-08-30 2024-05-14 深圳市思为软件技术有限公司 一种智能设备投屏控制方法、装置、电子设备及存储介质
CN115802083A (zh) * 2022-11-22 2023-03-14 深圳创维-Rgb电子有限公司 控制方法、装置、分体电视及可读存储介质
CN117119615A (zh) * 2023-10-25 2023-11-24 吉林藤兴科技有限公司 一种智能移动终端在车辆中控大屏的投屏方法

Similar Documents

Publication Publication Date Title
WO2021078284A1 (fr) Procédé de continuation de contenu et dispositif électronique
WO2020098437A1 (fr) Procédé de lecture de données multimédia et dispositif électronique
WO2020014880A1 (fr) Procédé et dispositif d'interaction multi-écran
WO2020216156A1 (fr) Procédé de projection d'écran et dispositif informatique
WO2021052214A1 (fr) Procédé et appareil d'interaction par geste de la main et dispositif terminal
WO2022257977A1 (fr) Procédé de projection d'écran pour dispositif électronique, et dispositif électronique
WO2020224449A1 (fr) Procédé de manœuvre d'affichage à écran partagé et dispositif électronique
WO2022166618A1 (fr) Procédé de projection d'écran et dispositif électronique
WO2022100304A1 (fr) Procédé et appareil de transfert d'un contenu d'application à travers des dispositifs, et dispositif électronique
WO2021052282A1 (fr) Procédé de traitement de données, module bluetooth, dispositif électronique et support d'enregistrement lisible
WO2020228645A1 (fr) Procédé pour la réalisation de lecture de données audio et vidéo, terminal et dispositif
WO2020143380A1 (fr) Procédé de transmission de données et dispositif électronique
WO2020173370A1 (fr) Procédé pour déplacer des icônes d'applications et dispositif électronique
WO2022048474A1 (fr) Procédé destiné à de multiples applications afin de partager une caméra, et dispositif électronique
WO2022121775A1 (fr) Procédé de projection sur écran, et dispositif
WO2021047567A1 (fr) Procédé et dispositif de traitement de flux de rappel
WO2022100610A1 (fr) Procédé et appareil de projection d'écran, ainsi que dispositif électronique et support de stockage lisible par ordinateur
WO2021258809A1 (fr) Procédé de synchronisation de données, dispositif électronique et support de stockage lisible par ordinateur
WO2022052791A1 (fr) Procédé de lecture de flux multimédia et dispositif électronique
WO2022028537A1 (fr) Procédé de reconnaissance de dispositif et appareil associé
WO2022007944A1 (fr) Procédé de commande de dispositif et appareil associé
WO2022042769A2 (fr) Système et procédé d'interaction multi-écrans, appareil, et support de stockage
WO2022135163A1 (fr) Procédé d'affichage de projection d'écran et dispositif électronique
JP2023528384A (ja) コンテンツ共有方法、装置、およびシステム
CN114915834A (zh) 一种投屏的方法和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22748897

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22748897

Country of ref document: EP

Kind code of ref document: A1