CN117156189A - Screen projection display method and electronic device - Google Patents

Screen projection display method and electronic device

Info

Publication number
CN117156189A
Authority
CN
China
Prior art keywords
electronic device
display
screen
video streams
service
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310212384.2A
Other languages
Chinese (zh)
Inventor
Xu Hui (徐辉)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310212384.2A
Publication of CN117156189A
Legal status: Pending

Classifications

    • H ELECTRICITY; H04 ELECTRIC COMMUNICATION TECHNIQUE; H04N PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]; H04N21/40 Client devices specifically adapted for the reception of or interaction with content
    • H04N21/43072 Synchronising the rendering of multiple content streams or additional data on the same device
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/4312 Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4347 Demultiplexing of several video streams
    • H04N21/4363 Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637 Adapting the video stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)

Abstract

The application provides a screen projection display method and an electronic device, which help provide a multi-view viewing experience for a user on a second electronic device in a scenario where a first electronic device casts its screen to the second electronic device, meeting the user's need to watch a global picture and a local picture at the same time. The method includes the following steps: the first electronic device determines that the second electronic device supports simultaneous display of N video streams; the first electronic device acquires N virtual display screens; the first electronic device determines that a target application supports simultaneous display of M video streams, wherein the target application includes an application installed on the first electronic device; the first electronic device draws Z video streams on Z of the N virtual display screens, respectively, wherein Z is less than or equal to a first value, and the first value is the minimum of M and N; and the first electronic device casts the Z video streams for display on the second electronic device.

Description

Screen projection display method and electronic device
Technical Field
The application relates to the field of terminal technologies, and in particular to a screen projection display method and an electronic device.
Background
With the rapid development of the automobile industry, and of automotive hardware in particular, in-car media functions play an increasingly important role in people's lives, and the time people spend using them in vehicles is gradually increasing. For example, during a long trip, a passenger in the front passenger seat or the rear row may watch a media program such as a sports event or a game live stream in the vehicle.
Typically, a user connects a mobile phone to the head unit (the in-vehicle infotainment device, also called the car machine) and then watches a media program on the head unit by screen casting. Taking a football match as an example, for such a sports event the head unit presents the picture of the global view (referred to as the global picture) and the picture of a local view (referred to as the local picture) by switching serially between the two views. For example, the global picture is shown first, and after a highlight occurs, the local picture is replayed. This approach, however, makes it impossible for the user to see, at the same time, the overall movement of the ball on the pitch and the highlight of a particular star's shot.
In summary, in a scenario where the mobile phone casts its screen to the head unit, the head unit cannot meet the user's need to watch the global picture and the local picture at the same time.
Disclosure of Invention
The application provides a screen projection display method and an electronic device, which help provide a multi-view viewing experience for a user on a second electronic device in a scenario where a first electronic device (for example, a mobile phone) casts its screen to the second electronic device (for example, a head unit), thereby meeting the user's need to watch a global picture and a local picture at the same time.
In a first aspect, a screen projection display method is provided, the method comprising: the first electronic device determines that the second electronic device supports simultaneous display of N video streams; the first electronic device acquires N virtual display screens; the first electronic device determines that a target application supports simultaneous display of M video streams, wherein the target application includes an application installed on the first electronic device; the first electronic device draws Z video streams on Z of the N virtual display screens, respectively, wherein Z is less than or equal to a first value, and the first value is the minimum of M and N; and the first electronic device casts the Z video streams for display on the second electronic device.
In the application, after the first electronic device is connected to the second electronic device, the first electronic device can query whether the second electronic device supports simultaneous display of multiple video streams. If the second electronic device supports simultaneous display of N video streams, N virtual display screens are configured for simultaneously displaying multiple video streams on the second electronic device, and the N virtual display screens can be used to draw N video streams.
After the user opens the target application on the second electronic device (the target application shown on the second electronic device is screen-cast; the actual target application runs on the first electronic device and need not be installed on the second electronic device), the first electronic device can also determine whether the target application supports simultaneous display of multiple video streams. When the second electronic device supports simultaneous display of N video streams and the target application supports simultaneous display of M video streams, the first electronic device and the second electronic device can, based on a screen-casting protocol, display Z video streams simultaneously on the second electronic device, where different video streams correspond to pictures of the same media program in the target application from different viewing angles. In this way, the user can watch pictures of the same media program from different viewing angles at the same time on the second electronic device, which improves the user experience.
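As an illustration of this flow only, the following Java sketch shows how a first electronic device running Android might acquire the N virtual display screens and derive Z = min(M, N). The class and method names are hypothetical; DisplayManager and VirtualDisplay are real Android APIs, but nothing here is taken from the patent's actual implementation.

    import android.hardware.display.DisplayManager;
    import android.hardware.display.VirtualDisplay;
    import java.util.ArrayList;
    import java.util.List;

    public class MultiViewCaster {
        private final DisplayManager displayManager;            // from context.getSystemService(DisplayManager.class)
        private final List<VirtualDisplay> virtualDisplays = new ArrayList<>();

        public MultiViewCaster(DisplayManager displayManager) {
            this.displayManager = displayManager;
        }

        /** Steps 1-2: the head unit reports N, so N virtual display screens are created. */
        public void init(int n, int widthPx, int heightPx, int dpi) {
            for (int i = 0; i < n; i++) {
                virtualDisplays.add(displayManager.createVirtualDisplay(
                        "multi-view-" + i, widthPx, heightPx, dpi,
                        /* surface= */ null,                      // encoder surface attached later
                        DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION));
            }
        }

        /** Steps 3-5: the target application supports M streams, so Z = min(M, N) streams are drawn and cast. */
        public int startCasting(int m) {
            int z = Math.min(m, virtualDisplays.size());          // first value = min(M, N)
            for (int i = 0; i < z; i++) {
                VirtualDisplay vd = virtualDisplays.get(i);
                // The target application renders view i (e.g. global or local angle) into vd,
                // and the screen-casting service encodes and sends that stream to the head unit.
            }
            return z;
        }
    }

In this sketch the rendering and transport steps are only indicated by comments, since the text leaves them to the screen-casting service described later.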
With reference to the first aspect, in certain implementations of the first aspect, the first electronic device determining that the second electronic device supports simultaneous display of N video streams includes: the first electronic device sends a first query request to the second electronic device, the first query request requesting to query whether the second electronic device supports simultaneous display of multiple video streams; the first electronic device receives first query feedback from the second electronic device, the first query feedback indicating that the second electronic device supports simultaneous display of N video streams; and the first electronic device determines, based on the first query feedback, that the second electronic device supports simultaneous display of N video streams. The first electronic device acquiring N virtual display screens includes: the first electronic device acquires the N virtual display screens based on the first query feedback.
With reference to the first aspect, in certain implementations of the first aspect, the first electronic device determining that the target application supports simultaneous display of M video streams includes: upon receiving first indication information from the second electronic device, the first electronic device determines that the target application supports simultaneous display of M video streams, the first indication information indicating that the user has selected the target application on the second electronic device.
With reference to the first aspect, in certain implementations of the first aspect, before the first electronic device determines that the second electronic device supports simultaneous display of N video streams, the method further includes: the first electronic device establishes a connection with the second electronic device based on a communication connection technology.
With reference to the first aspect, in certain implementations of the first aspect, the communication connection technology includes a wireless fidelity (Wi-Fi) connection, a Bluetooth connection, or a universal serial bus (USB) connection.
With reference to the first aspect, in certain implementations of the first aspect, the first electronic device casting the Z video streams for display on the second electronic device includes: the first electronic device sends the Z video streams to the second electronic device over the Wi-Fi connection or the USB connection.
In a second aspect, a screen projection display method is provided, applied to a screen projection display system comprising a first electronic device and a second electronic device, the method comprising: the first electronic device sends a first query request to the second electronic device, the first query request requesting to query whether the second electronic device supports simultaneous display of multiple video streams; the first electronic device receives first query feedback from the second electronic device, the first query feedback indicating that the second electronic device supports simultaneous display of N video streams; the first electronic device determines, based on the first query feedback, that the second electronic device supports simultaneous display of N video streams; the first electronic device acquires N virtual display screens based on the first query feedback; the second electronic device sends first indication information to the first electronic device, the first indication information indicating that the user has selected a target application on the second electronic device; based on the first indication information, the first electronic device determines that the target application supports simultaneous display of M video streams; the first electronic device draws Z video streams on Z of the N virtual display screens, respectively, wherein Z is less than or equal to a first value, and the first value is the minimum of M and N; the first electronic device sends the Z video streams to the second electronic device; and the second electronic device displays the Z video streams.
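Purely as an illustration of the message exchange in the second aspect, the sketch below models the first query request, the first query feedback and the first indication information as plain Java classes; the class and field names are hypothetical and are not part of any published screen-casting protocol.

    // Hypothetical message types for the capability negotiation described above.
    public final class CastMessages {

        /** First electronic device -> second electronic device: query multi-stream support. */
        public static final class FirstQueryRequest { }

        /** Second electronic device -> first electronic device: "I can display N streams simultaneously". */
        public static final class FirstQueryFeedback {
            public final int supportedStreams;                   // N
            public FirstQueryFeedback(int supportedStreams) { this.supportedStreams = supportedStreams; }
        }

        /** Second electronic device -> first electronic device: the user selected the target application. */
        public static final class FirstIndication {
            public final String targetApplication;               // e.g. a sports APP identifier
            public FirstIndication(String targetApplication) { this.targetApplication = targetApplication; }
        }
    }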
In a third aspect, the application provides an electronic device, comprising: a processor and a memory; the memory stores computer-executable instructions; and the processor executes the computer-executable instructions stored in the memory to cause the electronic device to perform the method of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program. The computer program, when executed by a processor, implements a method as in the first aspect.
In a fifth aspect, the application provides a computer program product comprising a computer program which, when run, causes a computer to perform the method as in the first aspect.
In a sixth aspect, the application provides a chip comprising a processor for invoking a computer program in memory to perform the method according to the first aspect.
It should be understood that the second to sixth aspects of the application correspond to the technical solution of the first aspect; the benefits obtained by each aspect and by its corresponding possible implementations are similar and are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device to which an embodiment of the present application is applicable;
Fig. 2 is a block diagram of the software architecture of an electronic device to which embodiments of the present application are applicable;
Fig. 3 is a schematic diagram of an interaction framework between a mobile phone and a head unit according to an embodiment of the present application;
Fig. 4 is a schematic flow chart of a screen projection display method according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a head unit interface according to an embodiment of the present application;
Fig. 6 is a schematic flow chart of another screen projection display method according to an embodiment of the present application.
Detailed Description
The technical solutions of the application are described below with reference to the accompanying drawings.
To describe the technical solutions of the embodiments of the application clearly, the terms used in the application are first explained.
In the embodiments of the application, the words "first", "second" and so on are used to distinguish between identical or similar items whose functions and effects are substantially the same; they do not limit number or order of execution, and do not imply that the items are necessarily different.
In the application, the words "exemplary" or "for example" are used to mean serving as an example, instance or illustration. Any embodiment or design described as "exemplary" or "for example" should not be construed as being preferred over, or more advantageous than, other embodiments or designs; rather, such words are intended to present the related concepts in a concrete fashion.
Furthermore, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or a similar expression means any combination of these items, including a single item or any combination of multiple items. For example, at least one of a, b and c may represent: a, or b, or c, or a and b, or a and c, or b and c, or a, b and c, where each of a, b and c may be singular or plural.
Fig. 1 is a schematic structural diagram of an electronic device to which an embodiment of the present application is applicable. As shown in Fig. 1, the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like.
It is to be understood that the structure illustrated in the present embodiment does not constitute a specific limitation on the electronic apparatus 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, a display processing unit (display process unit, DPU), and/or a neural-network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110. The processor may be a neural hub and a command center of the electronic device 100, among others. The processor can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution. A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 uses or recycles. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. This avoids repeated accesses and reduces the latency of the processor 110, thereby improving the efficiency of the electronic device 100.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (IR), etc. applied on the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a Beidou satellite navigation system (bei dou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, or quantum dot light-emitting diodes (QLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, data files such as music, photos, videos, etc. are stored in an external memory card.
The internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the electronic device 100 to execute various functional applications, data processing, and the like by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area can store an operating system; the storage area may also store one or more applications (e.g., gallery, contacts, etc.), and so forth. The storage data area may store data created during use of the electronic device 100 (e.g., photos, contacts, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. In some embodiments, the processor 110 may cause the electronic device 100 to perform various functional applications and data processing by executing instructions stored in the internal memory 121, and/or instructions stored in a memory provided in the processor 110.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiment of the application takes an Android (Android) system with a layered architecture as an example, and illustrates a software structure of the electronic device 100.
Fig. 2 is a block diagram of the software architecture of an electronic device to which embodiments of the present application are applicable. The layered architecture divides the software system of the electronic device 100 into several layers, each with a clear role and division of labour. The layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided into an application layer (APP), an application framework layer (application framework), the Android runtime and system libraries, a hardware abstraction layer (HAL), and a kernel layer (kernel). In some embodiments, the electronic device 100 also includes hardware, such as a display screen, a Bluetooth chip, a Wi-Fi chip, a USB chip, and the like.
The application layer may include a series of application packages, which run by calling the application programming interfaces (APIs) provided by the application framework layer. As shown in Fig. 2, the application packages may include applications such as camera, calendar, map, phone, music, WLAN, Bluetooth, video, social, gallery, navigation and XX sports.
The application framework layer provides APIs and programming frameworks for the applications of the application layer and includes a number of predefined functions. As shown in Fig. 2, the application framework layer may include a window manager, a content provider, a resource manager, a notification manager, a view system, a phone manager, a travel service, a screen-casting service, a display service, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and so on.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is presented in a status bar, a presentation sound is emitted, the electronic device 100 vibrates, and an indicator light blinks.
The travel service provides functions such as phone-head-unit interconnection and caller identification in a driving scenario. It can also provide multi-camera / multi-view capability queries and requests to a multi-view-capable APP, as well as security-compliance detection for watching media programs while parked.
The screen-casting service is used to initialize the multi-view viewing mode and to provide protocol processing of the screen-casting data stream. The display service is used for image rendering and composition.
The Android runtime includes a core library and a virtual machine, and is responsible for the scheduling and management of the Android system. The core library consists of two parts: one part contains the functions that the Java language needs to call, and the other part is the Android core library. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), two-dimensional graphics engine (e.g., SGL), interconnect transport services, and the like.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The interconnection transmission service provides standardized connection capabilities for the interconnection of the mobile phone and the head unit. Through the interconnection transmission service, a user can cast an application or a media program that complies with driving-safety requirements from one electronic device (such as a mobile phone) to another electronic device (such as a head unit) for display, which provides the user with a safer and richer infotainment experience.
The hardware abstraction layer is an abstraction interface over the device kernel drivers and provides the higher-level Java API framework with application programming interfaces for accessing the underlying hardware. The hardware abstraction layer may include multiple library modules, for example a display module, a USB module, a Bluetooth module and a Wi-Fi module, each of which implements an interface for a particular type of hardware component. When a framework API requires access to device hardware, the Android system loads the library module for that hardware component.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving the hardware so that the hardware works. The kernel layer at least includes a display driver, a USB driver, a bluetooth driver, a Wi-Fi driver, etc., which is not limited in the embodiment of the present application.
It should be understood that, in the embodiments of the application, the apparatus that implements the functions of the electronic device may be the electronic device itself, or may be an apparatus capable of supporting the electronic device in implementing those functions, for example a chip system, which may be installed in the electronic device. In the embodiments of the application, the chip system may consist of chips, or may include chips and other discrete components.
The electronic device in the embodiment of the present application may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), and the like. The electronic device may be a mobile phone, a personal computer (personal computer, PC), a smart television, a wearable device, a tablet (Pad), a computer with wireless transceiving functionality, a Virtual Reality (VR) electronic device, an augmented reality (augmented reality, AR) electronic device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in teleoperation (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation security (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), etc.
By way of example and not limitation, in embodiments of the application the electronic device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a piece of hardware; it achieves powerful functions through software support, data exchange and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can implement all or some functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on a particular type of application function and need to be used together with other devices such as smartphones, for example various smart bracelets and smart jewellery for vital-sign monitoring.
Furthermore, the electronic device may also be an electronic device in an internet of things (IoT) system. IoT is an important component of future information technology development; its main technical feature is to connect things to a network through communication technologies, thereby achieving an intelligent network of human-machine interconnection and interconnection of things. The application does not limit the specific form of the electronic device. As noted above, the apparatus that implements the functions of the electronic device may be the electronic device itself or an apparatus, such as a chip system, that supports the electronic device in implementing those functions and can be installed in the terminal; the chip system may consist of chips or include chips and other discrete components.
Currently, after a first electronic device (for example, a mobile phone) is interconnected with a second electronic device (for example, a head unit) through an interconnection protocol, a media program on the first electronic device can be cast onto the screen of the second electronic device for viewing, enriching the user's entertainment experience on the second electronic device.
Taking the first electronic device being a mobile phone and the second electronic device being a head unit as an example, in scenarios including but not limited to the following, a user may need to see the global picture and a local picture of a media program on the head unit at the same time:
In one scenario, the user watches a live sports event on the head unit, for example a multi-player team event such as a basketball, football or tennis match. Taking a football match as an example, the global picture may present the overall movement of the ball on the pitch, the positions of the players and their runs, while the local picture may present the highlight moments of a particular star dribbling or shooting. At present, however, with only a single view available, the global picture and a local picture of a multi-player team event cannot be presented on the head-unit screen at the same time.
In another scenario, the user watches a live e-sports event on the head unit; the global picture may present the overall battle scene of the match, while the local picture may present a player's micro-operations and facial expressions. At present, however, with only a single view available, the global picture and a local picture of the live e-sports event cannot be presented on the head-unit screen at the same time.
In yet another scenario, the user watches a live-commerce stream on the head unit; the global picture may present the host explaining the goods, and the local picture may present a promotional close-up of the goods. At present, however, with only a single view available, the global picture and a local picture of the live-commerce stream cannot be presented on the head-unit screen at the same time.
To present both a global picture and a local picture, the common practice at present is to switch between them serially. For example, the global picture of the global view is shown first, and after a highlight occurs, the local picture from another view is replayed once. This approach, however, has the following drawbacks:
(1) At any given moment only the global picture or the local picture can be played, and the other picture can only be watched via playback, so real-time viewing is poor.
(2) While a highlight is being replayed, the latest picture from the field is missed, so the approach is unsuitable for highly real-time events and live programs such as table tennis, basketball and badminton, which limits the types of media program it can serve.
In another approach, the user watches the media program on the head unit, and the global picture and the local picture are displayed in parallel in multiple display areas of the head unit by means of the head unit's own capabilities. This approach, however, has the following drawbacks:
(1) Poor universality: taking live sports as an example, the head unit must be equipped with an APP that can play the event, and that APP must be adapted to the head-unit system; however, not every APP has an adapted head-unit version.
(2) Data-traffic limits: high-definition display in multiple display areas consumes a large amount of data; a live sports event may require gigabytes of traffic, the head unit usually has a fixed monthly data allowance, and once the free allowance on the head-unit side is used up, the media program can no longer be watched.
(3) Cumbersome account management and double payment: a user who wants to watch on both the mobile phone and the head unit will very likely have to manage accounts for both the phone version and the head-unit version of the same APP, and may even have to pay twice for the same pay-per-view broadcast or live stream, which is especially common for globally broadcast or relayed events. Moreover, when media programs are relayed through different APPs, the number of accounts the user has to manage and the viewing fees to be paid can grow geometrically.
In view of the user's desire to see the global picture and a local picture of a media program on the head unit at the same time, an embodiment of the application provides a screen projection display method in which the first electronic device can query the multi-view capability of the second electronic device, that is, whether the second electronic device supports simultaneous display of multiple video streams. When it is determined that the second electronic device supports simultaneous display of multiple video streams and the target application also supports simultaneous display of multiple video streams, the first electronic device may provide the second electronic device with Z video streams of a media program in the target application and cast the Z video streams onto the second electronic device. The Z video streams correspond to pictures of the media program from Z viewing angles; for example, with Z = 2 the two views comprise the picture of the global view (the global picture) and the picture of a local view (the local picture). This satisfies the user's need to watch the global picture and a local picture of the media program on the head unit at the same time.
For ease of understanding, the following description takes the first electronic device being a mobile phone and the second electronic device being a head unit as an example. Building on the description of Fig. 2, Fig. 3 is a schematic diagram of the interaction framework between a mobile phone and a head unit according to an embodiment of the application.
The mobile phone includes a sports APP (which may equally be a live-streaming APP, a game APP, or the like), the travel service, the display service, the screen-casting service (transmitting end), the interconnection transmission service, a Wi-Fi driver, a Bluetooth driver, a USB driver, a Wi-Fi chip, a Bluetooth chip, a USB chip and display hardware. The display service may include a main display screen, a virtual display screen 1 and a virtual display screen 2, where virtual display screen 1 and virtual display screen 2 carry the global picture and the local picture, respectively. The screen-casting service may include a media video encoding service, which can be regarded as a sub-service of the screen-casting service and encodes the data streams that need to be cast to the head-unit side for display. The interconnection transmission service may include a discovery and connection module, the user datagram protocol (UDP), the transmission control protocol (TCP) and the internet protocol (IP); after the discovery and connection module establishes a connection with the head unit, data can be transmitted over any of UDP, TCP or IP.
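The following sketch illustrates one way the media video encoding service could bind an H.264 encoder to a virtual display screen, so that whatever is drawn on that virtual display is encoded and can be handed to the interconnection transmission service. MediaCodec, MediaFormat and DisplayManager are standard Android APIs; the bit rate, frame rate and method name are assumptions for illustration rather than details from the patent.

    import android.hardware.display.DisplayManager;
    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;

    public final class CastPipeline {

        /** Creates one H.264 encoder whose input surface backs one virtual display screen. */
        static MediaCodec bindEncoderToVirtualDisplay(DisplayManager dm, String name,
                                                      int width, int height, int dpi) throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);   // assumed bit rate
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

            MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface input = encoder.createInputSurface();             // frames drawn here are encoded

            // The virtual display (e.g. "global picture" or "local picture") renders straight into the encoder.
            dm.createVirtualDisplay(name, width, height, dpi, input,
                    DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
            encoder.start();
            return encoder;   // encoded buffers are drained elsewhere and sent over TCP/UDP
        }
    }

Calling this once for virtual display screen 1 and once for virtual display screen 2 would yield one encoded stream per viewing angle.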
The head unit includes a display service, a screen-casting service (receiving end), an interconnection transmission service, a Wi-Fi driver, a Wi-Fi chip, a display screen 1 and a display screen 2. Optionally, the head unit further includes a Bluetooth driver, a USB driver, a Bluetooth chip and a USB chip.
For convenience of description, the display service of the mobile phone is called the first display service, the screen-casting service of the mobile phone the first screen-casting service, the interconnection transmission service of the mobile phone the first interconnection transmission service, the Wi-Fi driver of the mobile phone the first Wi-Fi driver, the Bluetooth driver of the mobile phone the first Bluetooth driver, the USB driver of the mobile phone the first USB driver, the Wi-Fi chip of the mobile phone the first Wi-Fi chip, the Bluetooth chip of the mobile phone the first Bluetooth chip, the USB chip of the mobile phone the first USB chip, and the display hardware of the mobile phone the first display screen. The display service of the head unit is called the second display service, the screen-casting service of the head unit the second screen-casting service, the interconnection transmission service of the head unit the second interconnection transmission service, the Wi-Fi driver of the head unit the second Wi-Fi driver, the Wi-Fi chip of the head unit the second Wi-Fi chip, display screen 1 of the head unit the second display screen, and display screen 2 of the head unit the third display screen.
The mobile phone and the head unit can establish a connection through the first interconnection transmission service and the second interconnection transmission service. The connection may be wireless or wired: a wireless connection may be Bluetooth-based, based on the mobile phone's Wi-Fi hotspot, or based on mobile phone Wi-Fi peer-to-peer (Wi-Fi P2P); a wired connection is made to the head unit through a USB cable. A connection based on the mobile phone's Wi-Fi hotspot or on Wi-Fi P2P requires the support of the Wi-Fi driver and the Wi-Fi chip, and a wired connection requires the support of the USB driver and the USB chip. After the mobile phone and the head unit are connected, the first interconnection transmission service sends a connection notification to the travel service, and the travel service then interacts with the first screen-casting service to detect the multi-view capability of the head-unit side and initialize the multi-view capability of the first screen-casting service.
At a live sports venue, the staff can capture pictures of the live scene from multiple viewing angles through multiple camera positions and store them in the cloud. After the user starts the sports APP, the sports APP can obtain the global picture and a local picture of the live scene from the cloud. The sports APP interacts with the travel service to request starting the multi-view working mode, and the global picture and the local picture are transmitted through the established transmission channel to the second display screen and the third display screen on the head-unit side.
Based on the above description of Fig. 3, the internal interaction flow through which the mobile phone and the end-to-end full link between the mobile phone and the head unit implement the multi-view concurrent display capability in an embodiment of the application is described below with reference to Fig. 4.
It should be understood that the multi-view mode described in the embodiments of the application means that the mobile phone displays the content of an APP in multiple display areas of the head unit, where each display area shows a picture from a different viewing angle; for example, one display area shows the global picture and another shows a local picture. The multiple display areas may be display areas in a logical sense; for example, the second display screen and the third display screen of the head unit may in fact be logical divisions of the same piece of display hardware. The multiple display areas may also be display areas in a physical sense; for example, the second display screen and the third display screen of the head unit may be two physically separate pieces of display hardware.
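As a sketch of the "logical display areas" case, and assuming the head unit runs Android, one physical screen can be split into two side-by-side SurfaceViews, each of which would receive one decoded video stream; the class name and the decoder wiring are hypothetical.

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.SurfaceView;
    import android.widget.LinearLayout;

    public class MultiViewActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            LinearLayout row = new LinearLayout(this);
            row.setOrientation(LinearLayout.HORIZONTAL);

            SurfaceView globalView = new SurfaceView(this);   // logical "second display screen": global picture
            SurfaceView localView = new SurfaceView(this);    // logical "third display screen": local picture

            row.addView(globalView, new LinearLayout.LayoutParams(
                    0, LinearLayout.LayoutParams.MATCH_PARENT, 1f));
            row.addView(localView, new LinearLayout.LayoutParams(
                    0, LinearLayout.LayoutParams.MATCH_PARENT, 1f));
            setContentView(row);

            // Each SurfaceView's Surface would be handed to one video decoder of the
            // second screen-casting service (receiving end).
        }
    }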
Fig. 4 is a schematic flow chart of a screen projection display method 400 according to an embodiment of the application. The mobile phone includes the travel service, the first interconnection transmission service, the first screen-casting service, the media video encoding service, the first display service and the sports APP. The head unit includes the second interconnection transmission service, the second screen-casting service, the second display service and display hardware. The display hardware of the head unit may include the second display screen and the third display screen.
The screen projection display method 400 provided by the embodiment of the application may include an initialization phase and a service flow phase. The purpose of the initialization phase is to obtain multiple logically distinct virtual display screens (virtual displays); the purpose of the service flow phase is to draw pictures of different viewing angles on the virtual display screens obtained in the initialization phase, transmit those pictures to the head unit, and display, on the head unit's display hardware, multiple pictures of different viewing angles from the same APP. The phases are described below with reference to S401 to S446.
The initialization stage includes S401 to S417, and the specific steps are as follows:
s401, the first interconnection transmission service and the second interconnection transmission service establish connection.
Alternatively, the cell phone and the car phone may establish a connection based on a communication connection technology. The communication connection technology may include Wi-Fi, bluetooth, or USB.
In this step, the process of establishing a connection may include device discovery and device authentication, and the connection may include a wireless connection and a wired connection. The connection mode based on Wi-Fi or Bluetooth is a wireless connection mode, and the connection mode based on USB is a wired connection mode.
The wireless connection mode of the wireless Wi-Fi can also comprise a connection mode based on a mobile phone Wi-Fi hot spot or a connection mode based on a mobile phone Wi-Fi P2P.
Taking a connection mode based on bluetooth as an example, when the travel service of the mobile phone monitors some user behaviors, the travel service can send a bluetooth search instruction to the bluetooth module to instruct the bluetooth module to find nearby bluetooth devices, identify a target vehicle-mounted bluetooth which the user desires to connect from at least one nearby possible bluetooth device, confirm the connection target vehicle-mounted bluetooth according to the prompt, complete bluetooth connection, and establish a bottom physical communication channel between the first interconnection transmission service and the second interconnection transmission service. The user behavior may include an operation of enabling the vehicle-to-machine interconnection by a user in an interface of the travel service, or an operation of connecting the vehicle-to-machine by a user voice instruction. The specific monitored user behavior can be set by the trip service in a self-defined manner, and the embodiment of the application is not limited to the specific monitored user behavior.
Taking the connection mode based on Wi-Fi P2P of the mobile phone as an example, the travel service sends an instruction for starting the Wi-Fi P2P connection mode to the first interconnection transmission service. The first interconnection transmission service waits for and receives Wi-Fi P2P connection information sent by the second interconnection transmission service through the underlying physical communication channel established in the Bluetooth connection stage. If the first interconnection transmission service receives the Wi-Fi P2P connection information of the second interconnection transmission service, a data channel can be established between the first interconnection transmission service and the second interconnection transmission service, and the data channel can be used for subsequently transmitting service data such as screen projection data and audio data.
Taking the connection mode based on a Wi-Fi hotspot of the mobile phone as an example, the travel service instructs the Wi-Fi module of the mobile phone to start a Wi-Fi hotspot, and the Wi-Fi module of the mobile phone returns to the travel service a message indicating that the Wi-Fi hotspot is started successfully, where the message carries a hotspot name and a password. The travel service displays the hotspot name and the password to the user in a user interface and prompts the user to connect the Wi-Fi module of the vehicle machine to the hotspot. Based on the prompt of the mobile phone, the user can connect the Wi-Fi module of the vehicle machine to the Wi-Fi hotspot of the mobile phone. The Wi-Fi module of the mobile phone returns a Wi-Fi hotspot connection result to the first interconnection transmission service. When the Wi-Fi hotspot connection is successful, the travel service sends an instruction for starting the service-layer service to the first interconnection transmission service, and the first interconnection transmission service starts the service-layer service based on the instruction. After the Wi-Fi module of the vehicle machine confirms that the Wi-Fi hotspot connection is successful, it sends a message indicating the successful Wi-Fi hotspot connection to the second interconnection transmission service. The second interconnection transmission service also starts the service-layer service after receiving the message. In this way, the data channel between the first interconnection transmission service and the second interconnection transmission service is successfully established and can be used for subsequently transmitting service data such as screen projection data and audio data.
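For illustration only: the embodiments above do not disclose how the data channel between the first interconnection transmission service and the second interconnection transmission service is realized once Wi-Fi connectivity exists. The following minimal Kotlin sketch assumes the channel is a plain TCP socket on an arbitrarily chosen port, with loopback standing in for the Wi-Fi hotspot or Wi-Fi P2P link; the port number and the handshake string are hypothetical.
```kotlin
import java.net.ServerSocket
import java.net.Socket
import kotlin.concurrent.thread

const val SERVICE_PORT = 54321 // assumed port, for illustration only

fun main() {
    // Vehicle side (second interconnection transmission service): accept the data channel.
    val server = ServerSocket(SERVICE_PORT)
    val vehicleSide = thread {
        server.accept().use { socket ->
            val greeting = socket.getInputStream().bufferedReader().readLine()
            println("vehicle machine received: $greeting")
        }
    }
    // Phone side (first interconnection transmission service): open the data channel and
    // announce that the service-layer service has started.
    Socket("127.0.0.1", SERVICE_PORT).use { socket ->
        socket.getOutputStream().apply {
            write("SERVICE_LAYER_READY\n".toByteArray())
            flush()
        }
    }
    vehicleSide.join()
    server.close()
}
```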
S402, the first interconnection transmission service sends a connection state notification to the travel service. Accordingly, the travel service receives the connection status notification.
After the first interconnection transmission service and the second interconnection transmission service successfully establish the connection, the first interconnection transmission service may send a connection status notification to the travel service to notify the travel service that the connection has been established.
S403, the travel service detects whether the peer device is a vehicle machine.
The travel service detects whether the connected device is a vehicle machine, and if so, optionally, the travel service further detects whether the vehicle machine is currently in a screen projection mode. If yes, execution continues with S404. If not, for example, if the vehicle machine is currently in an application connection mode, the screen projection display procedure of the present application is not applicable.
S404, the travel service sends a request for a multi-view capability query (hereinafter also referred to as a first query request) to the first screen projection service. Accordingly, the first screen projection service receives the request for the multi-view capability query.
The multi-view capability query is used to request whether the vehicle side supports the multi-view concurrent display capability, i.e., the capability of simultaneously displaying a plurality of pictures of different view angles, such as a global picture and a local picture, in different display areas. Of course, the pictures of the multiple different view angles may all be local pictures or all be global pictures, but the content captured in the pictures is different. For example, the picture of a first view angle is a global picture of the football pitch, and the picture of a second view angle is a global picture of the spectator stands; for another example, the picture of the first view angle is a local picture of football star A on the football field, and the picture of the second view angle is a local picture of football star B on the football field.
S405, the first screen projection service sends the request for the multi-view capability query to the second screen projection service. Accordingly, the second screen projection service receives the request for the multi-view capability query.
The first screen projection service may transmit the multi-view capability query via a screen projection protocol negotiated with the vehicle side.
S406, the second screen projection service sends the request for the multi-view capability query to the second display service. Accordingly, the second display service receives the request for the multi-view capability query.
S407, the second display service sends a request for a multi-view capability query to the display hardware. Accordingly, the display hardware receives a request for the multi-view capability query.
S408, the display hardware sends multi-view capability feedback (hereinafter also referred to as first query feedback) to the second display service. Accordingly, the second display service receives the multi-view capability feedback.
The display hardware sends the multi-view capability feedback to the second display service according to its own hardware capability, where the multi-view capability feedback carries the number N of supported view angles, and N is a positive integer greater than or equal to 2.
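As a minimal sketch of the capability negotiation in S404 to S411, the query and feedback could be modeled as the following messages; the field names are assumptions, since the actual message format of the screen projection protocol is not described in this document.
```kotlin
// Hypothetical message model for the multi-view capability negotiation.
data class MultiViewCapabilityQuery(val requestId: Int)

data class MultiViewCapabilityFeedback(
    val requestId: Int,
    val supportedViews: Int // the number N carried in the first query feedback, N >= 2
)

fun main() {
    // Vehicle side: the display hardware reports that it can drive two display areas.
    val query = MultiViewCapabilityQuery(requestId = 1)
    val feedback = MultiViewCapabilityFeedback(requestId = query.requestId, supportedViews = 2)
    check(feedback.supportedViews >= 2) { "multi-view concurrent display requires N >= 2" }
    println("vehicle machine supports N = ${feedback.supportedViews} concurrent view angles")
}
```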
S409, the second display service sends the multi-view capability feedback to the second screen projection service. Accordingly, the second screen projection service receives the multi-view capability feedback.
S410, the second screen projection service sends the multi-view capability feedback to the first screen projection service. Accordingly, the first screen projection service receives the multi-view capability feedback.
S411, the first screen projection service sends the multi-view capability feedback to the travel service. Accordingly, the travel service receives the multi-view capability feedback.
S412, the travel service sends a request for initializing the multi-view capability to the first screen projection service. Accordingly, the first screen projection service receives the request for initializing the multi-view capability.
When the travel service determines that the vehicle machine supports the multi-view capability, the travel service may instruct the first screen projection service to initialize the multi-view capability and provide N virtual display screens for the pictures of the N view angles, where the N virtual display screens can be used for drawing N paths of video streams.
S413, the first screen projection service sends a request for initializing N paths of video encoding to the media video encoding service. Accordingly, the media video encoding service receives the request for initializing the N paths of video encoding.
S414, the media video encoding service sends N handles to the first screen projection service. Accordingly, the first screen projection service receives the N handles.
The N handles are used for acquiring the N virtual display screens. The media video encoding service provides N handles for the N view angles, that is, N image buffers (buffers) for the N view angles, where the N view angles correspond one-to-one to the N handles.
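The document does not state how the handles are realized. One plausible Android-style reading, shown below as a hedged sketch only, is that the media video encoding service creates N hardware encoders and returns their input surfaces, each surface acting as the image buffer for one view angle; the codec type, resolution, and bitrate values are placeholders, not values from this document.
```kotlin
import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat
import android.view.Surface

// Hedged sketch: create N H.264 encoders and hand back their input surfaces,
// each of which can later back one virtual display screen.
fun createEncoderSurfaces(n: Int, width: Int = 1280, height: Int = 720): List<Pair<MediaCodec, Surface>> =
    List(n) {
        val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height).apply {
            setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface)
            setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000)   // placeholder bitrate
            setInteger(MediaFormat.KEY_FRAME_RATE, 30)
            setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1)
        }
        val codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        val handle = codec.createInputSurface() // the per-view "handle"/image buffer
        codec.start()
        codec to handle
    }
```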
S415, the first screen projection service requests a virtual display service from the first display service based on the N handles.
That the first screen projection service requests the virtual display service may include: the first screen projection service requests the first display service to provide N virtual display screens to carry the pictures of the N view angles.
S416, the first display service sends the identifiers of the N virtual display screens to the first screen projection service. Accordingly, the first screen projection service receives the identifiers of the N virtual display screens.
There may be more than one virtual display screen (also referred to as a secondary display) on the mobile phone, and each virtual display screen has its own unique identifier. A virtual display screen is used for carrying the picture drawn by the mobile phone when the mobile phone is to perform multi-screen display. However, a virtual display screen is not a real hardware display screen, such as a liquid crystal display (liquid crystal display, LCD); therefore, the mobile phone needs to transmit the image data displayed on the virtual display screen to a remote screen (for example, a screen of a vehicle machine or a screen of a television) for display.
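On Android, for example, a virtual display screen backed by an encoder input surface can be created through the DisplayManager API; the sketch below is illustrative only, and the display name, size, density, and flags are assumptions (a system-level implementation may use different flags and permissions).
```kotlin
import android.content.Context
import android.hardware.display.DisplayManager
import android.hardware.display.VirtualDisplay
import android.view.Surface

// Hedged sketch: create one app-owned virtual display per encoder surface. The unique
// identifier mentioned above corresponds to VirtualDisplay.getDisplay().getDisplayId().
fun createVirtualDisplays(context: Context, surfaces: List<Surface>): List<VirtualDisplay> {
    val displayManager = context.getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
    return surfaces.mapIndexed { index, surface ->
        displayManager.createVirtualDisplay(
            "multi_view_$index",                                  // name (hypothetical)
            1280, 720, 240,                                       // width, height, densityDpi (placeholders)
            surface,                                              // the encoder input surface / "handle"
            DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY  // private, app-owned display
        )
    }
}
```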
S417, the first screen projection service waits for an APP request.
After the first screen projection service obtains the identifiers of the N virtual display screens, the initialization of the multi-view capability is completed, and the first screen projection service begins to wait for the user to select the sports APP so as to start the multi-view working mode.
The above S401 to S417 introduce the procedure of initializing the multi-view capability. The procedure of starting the multi-view capability in the service flow stage is described below in connection with S418 to S446.
S418, in response to an operation of the user selecting the sports APP, the sports APP performs a multi-view capability self-check.
The multi-view capability self-check includes the sports APP detecting whether it supports the multi-view capability and, if so, how many view angles it supports. In the embodiments of the present application, the number of view angles supported by the sports APP is denoted by M, where M is an integer greater than or equal to 2.
After the mobile phone is connected with the vehicle machine, the mobile phone can display an interface of the mobile phone on the vehicle machine by screen projection, so that the user can operate an application on the mobile phone from the vehicle machine. For example, the user selects the sports APP on the vehicle machine, and in response to the operation of the user selecting the sports APP, the vehicle machine transmits first indication information to the mobile phone, where the first indication information is used to indicate that the user has selected the sports APP on the vehicle machine (which can be understood as opening the sports APP). Alternatively, the user may select the sports APP on the mobile phone, which is not limited in the embodiments of the present application.
S419, the sports APP sends a request for multi-view capability query to the travel service. Accordingly, the travel service receives the request for the multi-view capability query.
The request for the multi-view capability query in this step is similar to the request for the multi-view capability query described in S404, and is used for querying whether the vehicle side supports the multi-view concurrent display capability.
S420, the travel service sends multi-view capability feedback to the sports APP. Accordingly, the sports APP receives the multi-view capability feedback.
In the initialization stage, the travel service has queried the multi-view capability of the vehicle machine and determined that the vehicle machine supports the multi-view capability and that the number of supported view angles is N. Therefore, the travel service can send multi-view capability feedback to the sports APP, indicating that the vehicle machine supports the multi-view capability and that the number of supported view angles is N.
S421, the sports APP performs multi-view capability matching.
In connection with the description in S418, the number of view angles supported by the sports APP is M. In connection with the description in S420, the number of view angles supported by the vehicle machine is N. The sports APP can determine, based on M and N, the number Z of view angles ultimately displayed on the vehicle machine. Optionally, Z is less than or equal to a first value, where the first value is the minimum of M and N.
S422, the sports APP requests the travel service to start the multi-view working mode.
When requesting to start the multi-view working mode, the sports APP may carry the number Z of view angles that can actually be provided for the vehicle machine. The following steps are described by taking Z=2 as an example.
S423, the travel service sends a request result to the sports APP. Accordingly, the sports APP receives the request result.
The request result includes the correspondence between handle numbers and view angles. For example, handle 1 corresponds to the global view and handle 2 corresponds to the local view.
S424, the sports APP draws a first picture on the first virtual display screen.
Because there is a correspondence between the handles and the virtual display screens, and the handle numbers correspond to the view angles, after determining the correspondence between the handle numbers and the view angles, the sports APP can determine the correspondence between the view angles and the virtual display screens. For example, when Z=2, the first picture is drawn on the first virtual display screen, and the second picture is drawn on the second virtual display screen. The first picture may be a picture of a first view angle, for example, a global picture; the second picture may be a picture of a second view angle, for example, a local picture. The first view angle and the second view angle are different.
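A minimal sketch of the capability matching in S421 and the handle/view assignment in S423 and S424 is given below; the enum and the rule that handle 1 carries the global picture follow the example above, and everything else is an assumption.
```kotlin
// Hypothetical model of view assignment after capability matching.
enum class ViewType { GLOBAL, LOCAL }

data class ViewAssignment(val handleNumber: Int, val view: ViewType)

fun matchViewCount(m: Int, n: Int): Int = minOf(m, n) // Z <= min(M, N)

fun assignViews(z: Int): List<ViewAssignment> =
    (1..z).map { handle ->
        // Following the example: handle 1 carries the global picture,
        // every further handle carries a local picture.
        ViewAssignment(handle, if (handle == 1) ViewType.GLOBAL else ViewType.LOCAL)
    }

fun main() {
    val z = matchViewCount(m = 4, n = 2) // e.g. the APP supports 4 views, the vehicle machine supports 2
    println(assignViews(z))              // handle 1 -> GLOBAL, handle 2 -> LOCAL
}
```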
S425, the sports APP sends the first picture to the first display service. Accordingly, the first display service receives the first picture.
S426, the first display service renders the first picture into a first image.
S427, the first display service transmits the first image to the media video encoding service. Accordingly, the media video encoding service receives the first image.
S428, the media video encoding service encodes the first image to obtain a first video stream.
In general, the data volume of the first image is large, and the first image needs to be encoded before being transmitted to the vehicle machine for display, so that the first image is compressed and the data transmission efficiency can be improved.
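Continuing the hedged Android-style sketch from S414, the encoded access units that make up the first video stream could be drained from one of the encoders as follows; the timeout and the callback are simplified and not taken from this document.
```kotlin
import android.media.MediaCodec
import java.nio.ByteBuffer

// Hedged sketch: drain one encoder and hand each encoded frame to the screen projection service.
fun drainEncoder(codec: MediaCodec, onEncodedFrame: (ByteBuffer, MediaCodec.BufferInfo) -> Unit) {
    val info = MediaCodec.BufferInfo()
    while (true) {
        val index = codec.dequeueOutputBuffer(info, 10_000 /* timeoutUs */)
        if (index == MediaCodec.INFO_TRY_AGAIN_LATER) return // nothing ready yet; caller retries later
        if (index < 0) continue // e.g. INFO_OUTPUT_FORMAT_CHANGED, ignored in this sketch
        codec.getOutputBuffer(index)?.let { buffer -> onEncodedFrame(buffer, info) }
        codec.releaseOutputBuffer(index, false /* render */)
        if ((info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) return
    }
}
```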
S429, the media video encoding service sends the first video stream to the first screen projection service. Accordingly, the first screen projection service receives the first video stream.
S430, the first screen projection service sends the first video stream to the travel service. Accordingly, the travel service receives the first video stream.
S431, the travel service performs protocol processing on the first video stream to obtain a second video stream.
Both ends, i.e., the mobile phone and the vehicle machine, need to support the same vehicle-machine interconnection protocol. In this step, the protocol processing performed by the travel service on the first video stream can be understood as converting the format of the first video stream, so that the converted second video stream conforms to the data transmission format specified by the vehicle-machine interconnection protocol, thereby enabling the second video stream to be transmitted from the first interconnection transmission service to the second interconnection transmission service.
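The vehicle-machine interconnection protocol itself is proprietary and not described here. As an illustration only of what "protocol processing" can mean, the sketch below wraps each encoded frame in a small header (stream identifier plus payload length) so that the receiving side can demultiplex the second and fourth video streams from one data channel; the header layout is an assumption.
```kotlin
import java.nio.ByteBuffer

// Hypothetical framing: 4-byte stream id + 4-byte payload length + payload.
fun frame(streamId: Int, payload: ByteArray): ByteArray =
    ByteBuffer.allocate(8 + payload.size)
        .putInt(streamId)     // which view / virtual display screen this frame belongs to
        .putInt(payload.size) // payload length in bytes
        .put(payload)
        .array()

fun parse(packet: ByteArray): Pair<Int, ByteArray> {
    val buffer = ByteBuffer.wrap(packet)
    val streamId = buffer.int
    val payload = ByteArray(buffer.int).also { buffer.get(it) }
    return streamId to payload
}

fun main() {
    val packet = frame(streamId = 1, payload = byteArrayOf(0, 0, 0, 1)) // fake H.264 start code
    val (id, payload) = parse(packet)
    println("stream $id, ${payload.size} bytes") // stream 1, 4 bytes
}
```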
S432, the travel service sends the second video stream to the first interconnection transmission service. Accordingly, the first interconnection transmission service receives the second video stream.
S433, the sports APP draws a second picture on the second virtual display screen.
S434, the sports APP sends the second picture to the first display service. Accordingly, the first display service receives the second picture.
S435, the first display service renders the second picture into a second image.
S436, the first display service sends the second image to the media video encoding service. Accordingly, the media video encoding service receives the second image.
S437, the media video encoding service encodes the second image to obtain a third video stream.
S438, the media video encoding service sends the third video stream to the first screen projection service. Accordingly, the first screen projection service receives the third video stream.
S439, the first screen projection service sends the third video stream to the travel service. Accordingly, the travel service receives the third video stream.
S440, the travel service performs protocol processing on the third video stream to obtain a fourth video stream.
S441, the travel service sends the fourth video stream to the first interconnection transmission service. Accordingly, the first interconnection transmission service receives the fourth video stream.
The descriptions of S433 to S441 and S424 to S432 are similar, and are not repeated here.
S442, the first interconnection transmission service sends the second video stream and the fourth video stream to the second interconnection transmission service. Accordingly, the second interconnection transmission service receives the second video stream and the fourth video stream.
S443, the second interconnection transmission service sends the second video stream and the fourth video stream to the second screen projection service. Accordingly, the second screen projection service receives the second video stream and the fourth video stream.
S444, the second screen projection service performs protocol parsing and decoding on the second video stream and the fourth video stream.
The second video stream can be restored to the first video stream and the fourth video stream can be restored to the third video stream through protocol parsing and decoding.
S445, the second screen projection service transmits the first video stream and the third video stream to the second display service. Accordingly, the second display service receives the first video stream and the third video stream.
S446, the second display service transmits the first picture and the second picture to the display hardware.
The first picture may be carried and displayed on the second display screen of the display hardware of the vehicle machine, and the second picture may be carried and displayed on the third display screen of the display hardware of the vehicle machine.
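On the vehicle side, S444 to S446 can be pictured, again only as a hedged Android-style sketch, as feeding each restored video stream into an H.264 decoder configured with the Surface of the display area that should show that view angle; the resolution values are placeholders.
```kotlin
import android.media.MediaCodec
import android.media.MediaFormat
import android.view.Surface

// Hedged sketch: one decoder per restored video stream, rendering directly to the
// Surface of the second or third display screen of the vehicle machine.
fun createDecoderFor(displaySurface: Surface, width: Int = 1280, height: Int = 720): MediaCodec {
    val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height)
    return MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC).apply {
        configure(format, displaySurface, null, 0) // decoded frames go straight to this display area
        start()
    }
}
```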
The screen projection display method provided in the embodiments of the present application has good universality: it is not affected by whether the APP has a corresponding vehicle-machine version or whether such a version is installed on the vehicle side; it is only required that an APP supporting the multi-view capability be installed on the mobile phone. Moreover, only the account and password of the mobile-phone version of the APP need to be managed, and no account or password of a vehicle-mounted version needs to be managed, so that APP management is simpler. In addition, the problem of paying at both ends for the same APP on the mobile phone and the vehicle machine can be avoided, which reduces the usage cost for the user.
In the embodiments of the present application, the system services of the mobile phone (comprising the travel service and the first screen projection service) can, in the upward direction, provide multi-channel/multi-view virtual display screens for the same APP and, in the downward direction, display the pictures of multiple view angles in multiple display areas of the vehicle machine through the private screen projection and transmission protocols and the channel management protocol. This end-to-end, full-link multi-view capability enables multiple pictures of a live event, obtained through multiple camera positions and other means, to be displayed concurrently in multiple display areas of the vehicle machine, solves the problem that a user cannot simultaneously watch the global picture and the local picture of a media program on the screen of the vehicle machine, and provides the user with a richer, more comprehensive viewing experience that better restores the real scene.
It should be noted that the above description takes screen projection from the mobile phone to the vehicle machine as an example. In addition, the screen projection display method provided in the embodiments of the present application may also be applicable to a scenario in which the mobile phone projects to a large screen supporting multi-screen display, which is not limited herein.
Taking, as an example, a user watching a football match on the vehicle machine by screen projection through a sports APP on the mobile phone, the picture finally presented to the user on the vehicle machine based on the screen projection display method of the embodiments of the present application is shown in Fig. 5.
Fig. 5 is an interface schematic diagram of a vehicle machine according to an embodiment of the present application. As shown in Fig. 5, the screen of the vehicle machine has two display areas, which respectively display the global picture and the local picture of the football field, so as to meet the requirement of the user for watching the global picture and the local picture of a media program at the same time. The two display areas may be two physically independent pieces of display hardware, or may be two logically divided display screens of the same piece of display hardware.
In connection with the method 400 above, Fig. 6 is a schematic flow chart of another screen projection display method 600 provided by an embodiment of the present application. The method 600 includes steps S601 to S605, which are specifically as follows:
S601, the first electronic device determines that the second electronic device supports simultaneous display of N paths of video streams.
Optionally, S601 includes: the method comprises the steps that a first electronic device sends a first query request to a second electronic device, wherein the first query request is used for requesting to query whether the second electronic device supports simultaneous display of multiple video streams; the first electronic device receives first query feedback from the second electronic device, wherein the first query feedback indicates that the second electronic device supports the simultaneous display of N paths of video streams; the first electronic device determines that the second electronic device supports simultaneous display of N paths of video streams based on the first query feedback. The module interaction for determining that the second electronic device supports simultaneous display of N video streams in the first electronic device may refer to the description of S404 to S411 in the method 400, which is not repeated herein.
S602, the first electronic device acquires N virtual display screens.
Optionally, the first electronic device obtains N virtual display screens based on the first query feedback. Because the first query feedback indicates that the second electronic device supports simultaneous display of N paths of video streams, the first electronic device acquires N virtual display screens with the same number as the video streams supported by the second electronic device, and the N virtual display screens can be used for drawing the N paths of video streams. The module interaction for acquiring N virtual display screens in the first electronic device may refer to descriptions of S412 to S416 in the method 400, which are not described herein.
S603, the first electronic device determines that the target application supports simultaneous display of M paths of video streams, and the target application comprises an application loaded in the first electronic device.
Optionally, S603 includes: and under the condition that the first indication information from the second electronic equipment is received, the first electronic equipment determines that the target application supports simultaneous display of M paths of video streams, and the first indication information is used for indicating a user to select the target application on the second electronic equipment.
Based on the descriptions in method 400 for S417 and S418, after the user opens the target application on the second electronic device (e.g., sports APP in method 400), the second electronic device may send first indication information to the first electronic device indicating that the user has opened the target application on the vehicle, i.e., indicating that the user currently has a need to view the media program on the target application. Therefore, after receiving the first indication information, the first electronic device can query whether the target APP supports simultaneous display of multiple video streams. The module interaction for querying whether the target APP supports simultaneous display of multiple video streams in the first electronic device may refer to the description of S419 to S420 in the method 400, which is not repeated here.
And S604, the first electronic device draws Z paths of video streams in Z virtual display screens in the N virtual display screens respectively, wherein Z is smaller than or equal to a first value, and the first value is the minimum value in M and N.
Optionally, before S604, the first electronic device first determines the number Z of video streams to be drawn, and then selects Z virtual display screens from the N virtual display screens to respectively draw Z paths of video streams. The process of determining the number Z of video streams to be drawn in the first electronic device may refer to the description of S421, and will not be described herein. The module interaction of drawing the Z-path video stream in the Z virtual display screens in the N virtual display screens in the first electronic device may refer to descriptions of the method 400 for S424 to S431 and S433 to S440, which are not repeated herein.
S605, the first electronic device projects a screen to display the Z-path video stream in the second electronic device.
Optionally, S605 includes: the first electronic device sends the Z-path video stream to the second electronic device through a Wi-Fi connection or a USB connection. Specifically, the first electronic device sends the Z-path video stream to the second electronic device, and after receiving the Z-path video stream, the second electronic device may display the Z-path video stream in Z different display areas of the second electronic device.
The module interaction of the first electronic device in the second electronic device for displaying the Z-path video stream in a screen may be referred to in the method 400 for descriptions of S442 to S446, which are not repeated herein.
In the embodiment of the application, after the first electronic device is connected with the second electronic device, the first electronic device can inquire whether the second electronic device supports simultaneous display of multiple paths of video streams, and under the condition that the second electronic device supports simultaneous display of N paths of video streams, N virtual display screens are configured for simultaneously displaying the multiple paths of video streams on the second electronic device so as to bear the N paths of video streams.
After the user opens the target application on the second electronic device (the target application displayed on the second electronic device is a projected picture; the actual target application is loaded in the first electronic device and may not be installed in the second electronic device), the first electronic device may further determine whether the target application supports simultaneous display of multiple video streams. When the second electronic device supports simultaneous display of N paths of video streams and the target application supports simultaneous display of M paths of video streams, the first electronic device and the second electronic device can achieve, based on a screen projection protocol, the purpose of simultaneously displaying Z paths of video streams on the second electronic device, where different video streams correspond to pictures of different view angles of a media program in the target application. According to the technical solution, the requirement of a user for watching pictures of the same media program from different view angles on the second electronic device can be met, and the use experience of the user is improved.
The embodiment of the application provides electronic equipment, which comprises: a processor and a memory; the memory stores computer-executable instructions; the processor executes the computer-executable instructions stored in the memory to cause the electronic device to perform the method described above.
The embodiment of the application provides a chip. The chip comprises a processor for invoking a computer program in a memory to perform the technical solutions in the above embodiments. The principle and technical effects of the present application are similar to those of the above-described related embodiments, and will not be described in detail herein.
The embodiment of the application also provides a computer readable storage medium. The computer-readable storage medium stores a computer program. The computer program realizes the above method when being executed by a processor. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
In one possible implementation, the computer readable medium may include random access memory (random access memory, RAM), read-only memory (ROM), compact disk (compact disc read-only memory, CD-ROM) or other optical disk memory, magnetic disk memory or other magnetic storage device, or any other medium that can be used to carry or store the desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (digital subscriber line, DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (digital versatile disc, DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Embodiments of the present application provide a computer program product comprising a computer program which, when executed, causes a computer to perform the above-described method.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing detailed description of the application has been presented for purposes of illustration and description, and it should be understood that the foregoing is by way of illustration and description only, and is not intended to limit the scope of the application.

Claims (10)

1. The screen projection display method is characterized by comprising the following steps of:
the first electronic device determines that the second electronic device supports simultaneous display of N paths of video streams;
the first electronic device acquires N virtual display screens;
the first electronic equipment determines that the target application supports simultaneous display of M paths of video streams; the target application comprises an application loaded in the first electronic device;
the first electronic equipment draws Z paths of video streams in Z virtual display screens in the N virtual display screens respectively, wherein Z is smaller than or equal to a first value, and the first value is the minimum value in M and N;
and the first electronic equipment displays the Z paths of video streams in a screen-casting mode in the second electronic equipment.
2. The method of claim 1, wherein the first electronic device determining that the second electronic device supports simultaneous display of N video streams comprises:
the first electronic device sends a first query request to the second electronic device, wherein the first query request is used for requesting to query whether the second electronic device supports simultaneous display of multiple video streams;
the first electronic device receives first query feedback from the second electronic device, wherein the first query feedback indicates that the second electronic device supports simultaneous display of N paths of video streams;
The first electronic device determines that the second electronic device supports simultaneous display of N paths of video streams based on the first query feedback;
the first electronic device obtains N virtual display screens, including:
and the first electronic equipment acquires the N virtual display screens based on the first query feedback.
3. The method according to claim 1 or 2, wherein the first electronic device determining that the target application supports simultaneous display of M video streams comprises:
and under the condition that the first electronic equipment receives first indication information from the second electronic equipment, the first electronic equipment determines that the target application supports simultaneous display of M paths of video streams, and the first indication information is used for indicating a user to select the target application on the second electronic equipment.
4. A method according to any one of claims 1 to 3, wherein before the first electronic device determines that the second electronic device supports simultaneous display of N video streams, the method further comprises:
the first electronic device establishes a connection with the second electronic device based on a communication connection technique.
5. The method of claim 4, wherein the communication connection technology comprises a wireless fidelity Wi-Fi connection, a bluetooth connection, or a universal serial bus USB connection.
6. The method of claim 5, wherein the first electronic device projecting the Z-path video stream in the second electronic device comprises:
and the first electronic device sends the Z-path video stream to the second electronic device through Wi-Fi connection or USB connection.
7. A method of on-screen display, characterized in that it is applied to an on-screen display system comprising a first electronic device and a second electronic device, said method comprising:
the first electronic device sends a first query request to the second electronic device, wherein the first query request is used for requesting to query whether the second electronic device supports simultaneous display of multiple video streams;
the first electronic device receives first query feedback from the second electronic device, wherein the first query feedback is used for indicating that the second electronic device supports simultaneous display of N paths of video streams;
the first electronic device determines that the second electronic device supports simultaneous display of N paths of video streams based on the first query feedback;
the first electronic device obtains N virtual display screens based on the first query feedback;
the second electronic device sends first indication information to the first electronic device, wherein the first indication information is used for indicating a user to select a target application on the second electronic device, and the target application comprises an application loaded in the first electronic device;
The first electronic device determines that the target application supports simultaneous display of M paths of video streams based on the first indication information;
the first electronic equipment draws Z paths of video streams in Z virtual display screens in the N virtual display screens respectively, wherein Z is smaller than or equal to a first value, and the first value is the minimum value in M and N;
the first electronic device sends the Z-path video stream to the second electronic device;
and the second electronic equipment displays the Z-path video stream.
8. An electronic device, comprising: a processor and a memory, wherein,
the memory is used for storing a computer program;
the processor is configured to invoke and execute the computer program to cause the electronic device to perform the method of any of claims 1 to 6.
9. A computer readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 6.
10. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any one of claims 1 to 6.
CN202310212384.2A 2023-02-27 2023-02-27 Screen-throwing display method and electronic equipment Pending CN117156189A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310212384.2A CN117156189A (en) 2023-02-27 2023-02-27 Screen-throwing display method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310212384.2A CN117156189A (en) 2023-02-27 2023-02-27 Screen-throwing display method and electronic equipment

Publications (1)

Publication Number Publication Date
CN117156189A true CN117156189A (en) 2023-12-01

Family

ID=88882985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310212384.2A Pending CN117156189A (en) 2023-02-27 2023-02-27 Screen-throwing display method and electronic equipment

Country Status (1)

Country Link
CN (1) CN117156189A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105872610A (en) * 2015-12-10 2016-08-17 乐视体育文化产业发展(北京)有限公司 Method, equipment and system for playing multiple paths of video data
CN110139063A (en) * 2018-02-09 2019-08-16 杭州海康威视数字技术股份有限公司 A kind of determining equipment supports the method, device and equipment of video flowing number
CN111324327A (en) * 2020-02-20 2020-06-23 华为技术有限公司 Screen projection method and terminal equipment
CN113301388A (en) * 2021-05-20 2021-08-24 杭州海康威视数字技术股份有限公司 Video stream processing system, device and method
CN113573098A (en) * 2021-07-06 2021-10-29 杭州海康威视数字技术股份有限公司 Image transmission method and device and electronic equipment
CN113687803A (en) * 2020-05-19 2021-11-23 华为技术有限公司 Screen projection method, screen projection source end, screen projection destination end, screen projection system and storage medium
CN114040242A (en) * 2021-09-30 2022-02-11 荣耀终端有限公司 Screen projection method and electronic equipment
CN114356258A (en) * 2020-09-30 2022-04-15 华为技术有限公司 Electronic device, screen projection method thereof and medium
WO2022226736A1 (en) * 2021-04-26 2022-11-03 华为技术有限公司 Multi-screen interaction method and apparatus, and terminal device and vehicle
WO2022257977A1 (en) * 2021-06-09 2022-12-15 荣耀终端有限公司 Screen projection method for electronic device, and electronic device

Similar Documents

Publication Publication Date Title
JP7463647B2 (en) Notification processing system, method and electronic device
CN110109636B (en) Screen projection method, electronic device and system
CN112286618A (en) Device cooperation method, device, system, electronic device and storage medium
WO2022089271A1 (en) Wireless screen-casting method, mobile device, and computer-readable storage medium
CN114040242B (en) Screen projection method, electronic equipment and storage medium
US20220368792A1 (en) Device Capability Scheduling Method and Electronic Device
CN114741008B (en) Distributed cross-device cooperation method, electronic device and communication system
WO2023284650A1 (en) Communication method and electronic device
CN114845035B (en) Distributed shooting method, electronic equipment and medium
WO2023005711A1 (en) Service recommendation method and electronic device
CN117156189A (en) Screen-throwing display method and electronic equipment
CN116033342B (en) Geofence processing method, equipment and storage medium
CN114679752B (en) Method for sharing wireless communication capability by double systems and terminal equipment
CN115918108B (en) Method for determining function switching entrance and electronic equipment
CN112423008B (en) Live broadcast method, device, terminal, server and storage medium
CN113918110A (en) Screen projection interaction method, device, system, storage medium and product
CN113835802A (en) Device interaction method, system, device and computer readable storage medium
US20230385097A1 (en) Distributed device capability virtualization method, medium, and electronic device
CN116709584B (en) Method for connecting car machine and terminal equipment
CN115460445B (en) Screen projection method of electronic equipment and electronic equipment
CN116679895B (en) Collaborative business scheduling method, electronic equipment and collaborative system
CN118057798A (en) Application sharing method and electronic equipment
WO2023040848A9 (en) Device control method and apparatus
CN117640717A (en) Equipment connection method and equipment
CN117632534A (en) Inter-process communication method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination