CN118075528A - Multi-screen collaborative display method, device and storage medium - Google Patents

Multi-screen collaborative display method, device and storage medium Download PDF

Info

Publication number
CN118075528A
Authority
CN
China
Prior art keywords
screen
terminal device
terminal equipment
content
contrast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211485787.6A
Other languages
Chinese (zh)
Inventor
熊磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211485787.6A priority Critical patent/CN118075528A/en
Publication of CN118075528A publication Critical patent/CN118075528A/en
Pending legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a multi-screen collaborative display method, device, and storage medium. In the technical solution of the application, after a first terminal device establishes a multi-screen cooperative connection with a second terminal device, the first terminal device obtains its screen casting content and the first screen display parameters of the second terminal device; it then determines target screen casting information corresponding to the screen casting content, where the target screen casting information includes target screen display parameters determined according to the first screen display parameters and the current second screen display parameters of the first terminal device; finally, it sends the target screen casting information to the second terminal device, so that the second terminal device displays the frame corresponding to the screen casting content according to the visual characteristics corresponding to the second screen display parameters. With the multi-screen collaborative display method, different terminal devices can present the same or a similar dynamic range when displaying the same frame in a multi-screen collaboration scenario, improving the user experience.

Description

Multi-screen collaborative display method, device and storage medium
Technical Field
The present application relates to the field of computer application technologies, and in particular, to a multi-screen collaborative display method, device, and storage medium.
Background
With the development of intelligent terminal technology, a user or a household often owns multiple electronic devices that can communicate with one another. Different terminal devices typically have their own device characteristics; for example, a mobile phone is more portable, while a television screen offers a better display effect. To fully exploit the advantages of different terminal devices, a terminal device can switch and display data across multiple devices by screen casting, that is, realize multi-screen collaborative display among terminal devices.
In the related art, when multi-screen collaboration is performed between terminal devices, the screen is mostly shared either by directly transmitting the current display frame of one terminal device to another terminal device, or by transmitting rendering instructions from one terminal device to another terminal device for re-rendering. However, because the screen hardware capabilities of different terminal devices may differ, the shared content may present different dynamic ranges to users on different terminal devices, resulting in a poor user experience.
Disclosure of Invention
The application provides a multi-screen collaborative display method, device, and storage medium, which enable different terminal devices to display the same frame with the same or a similar dynamic range in a multi-screen collaboration scenario, improving the user experience.
In a first aspect, the present application provides a multi-screen collaborative display method applied to a first terminal device. The method includes: after the first terminal device establishes a multi-screen cooperative connection with a second terminal device, obtaining screen casting content of the first terminal device and first screen display parameters of the second terminal device; determining, by the first terminal device, target screen casting information corresponding to the screen casting content, where the target screen casting information includes target screen display parameters determined according to the first screen display parameters and current second screen display parameters of the first terminal device; and sending the target screen casting information to the second terminal device, so that the second terminal device displays the frame corresponding to the screen casting content according to the visual characteristics corresponding to the second screen display parameters.
In this embodiment of the application, the first terminal device determines the target screen casting information according to its own current screen display parameters and the screen display parameters of the second terminal device, so that in a multi-screen collaboration scenario the first and second terminal devices can display the same frame with the same or a similar dynamic range, improving the user experience.
In a second aspect, the present application provides a multi-screen collaborative display method applied to a second terminal device. The method includes: after the second terminal device establishes a multi-screen cooperative connection with a first terminal device, obtaining screen casting content of the first terminal device and current first screen display parameters of the first terminal device; determining, by the second terminal device, target screen casting information corresponding to the screen casting content, where the target screen casting information includes target screen display parameters determined according to the first screen display parameters and second screen display parameters of the second terminal device; and displaying, by the second terminal device, the frame corresponding to the screen casting content according to the target screen casting information and the visual characteristics corresponding to the first screen display parameters.
In this embodiment of the application, the second terminal device determines the target screen casting information according to the current screen display parameters of the first terminal device and its own screen display parameters, so that in a multi-screen collaboration scenario the first and second terminal devices can display the same frame with the same or a similar dynamic range, improving the user experience.
In a third aspect, the present application provides a multi-screen collaborative display device, including: an obtaining module, configured to obtain screen casting content of a first terminal device and first screen display parameters of a second terminal device after the first terminal device establishes a multi-screen cooperative connection with the second terminal device; a determining module, configured to determine target screen casting information corresponding to the screen casting content, where the target screen casting information includes target screen display parameters determined according to the first screen display parameters and current second screen display parameters of the first terminal device; and a sending module, configured to send the target screen casting information to the second terminal device, so that the second terminal device displays the frame corresponding to the screen casting content according to the visual characteristics corresponding to the second screen display parameters.
In a fourth aspect, the present application provides a multi-screen collaborative display device, including: an obtaining module, configured to obtain screen casting content of a first terminal device and current first screen display parameters of the first terminal device after a second terminal device establishes a multi-screen cooperative connection with the first terminal device; a determining module, configured to determine target screen casting information corresponding to the screen casting content, where the target screen casting information includes target screen display parameters determined according to the first screen display parameters and second screen display parameters of the second terminal device; and a display module, configured to display the frame corresponding to the screen casting content according to the target screen casting information and the visual characteristics corresponding to the first screen display parameters.
In a fifth aspect, the present application provides a terminal device, including a processor and a memory, where the memory is configured to store code instructions and the processor is configured to execute the code instructions to implement the method of the first aspect or the second aspect.
In a sixth aspect, the present application provides a computer-readable storage medium storing a computer program (which may also be referred to as code or instructions) that, when run on a computer, causes the computer to perform the method of the first aspect or the second aspect described above.
In a seventh aspect, the present application provides a computer program product including a computer program (which may also be referred to as code or instructions) that, when executed, causes a computer to perform the method of the first aspect or the second aspect described above.
Drawings
FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present application;
fig. 2 is a schematic diagram of a system architecture of a terminal device according to an embodiment of the present application;
FIG. 3 is a flowchart of a multi-screen collaborative display method according to an embodiment of the present application;
FIG. 4 is a flowchart of a multi-screen collaborative display method according to another embodiment of the present application;
FIG. 5 is a flowchart of a multi-screen collaborative display method according to another embodiment of the present application;
FIG. 6 is a flowchart of a multi-screen collaborative display method according to another embodiment of the present application;
FIG. 7 is a schematic structural diagram of a multi-screen collaborative display device according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a multi-screen collaborative display device according to another embodiment of the present application;
Fig. 9 is a schematic structural view of an apparatus according to another embodiment of the present application.
Detailed Description
The technical solutions of the application are described below with reference to the accompanying drawings.
In order to clearly describe the technical solutions of the embodiments of the present application, the words "first", "second", and the like are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same function and effect. For example, a first instruction and a second instruction merely distinguish different user instructions without limiting their order. Those skilled in the art will appreciate that words such as "first" and "second" limit neither the quantity nor the execution order, nor do they require that the items be different.
In the present application, the words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Furthermore, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of" a set of items means any combination of those items, including any combination of single items or plural items. For example, "at least one of a, b, and c" may represent: a, b, c, a and b, a and c, b and c, or a, b, and c, where each of a, b, and c may be singular or plural.
Fig. 1 is a schematic diagram of an application scenario provided in an embodiment of the present application. As shown in fig. 1, the application scenario 100 may include a first terminal device 101 and a second terminal device 102. The first terminal device 101 may cast its displayed content to the second terminal device 102 through a wireless network and/or Bluetooth, and the second terminal device 102 may be operated from the first terminal device 101, or the first terminal device 101 operated from the second terminal device 102, to realize multi-screen collaborative display.
It should be understood that the first terminal device 101 may also cast its screen to other terminal devices, and the second terminal device 102 may cast its screen to the first terminal device 101 or to other terminal devices; the number of terminal devices implementing multi-screen collaborative display is not limited. The application mainly takes the case where the first terminal device 101 casts its screen to the second terminal device 102 as an example.
It should also be understood that the first terminal device 101 and the second terminal device 102 may be the same type of terminal device, or may be different types of terminal devices, which is not limited in this disclosure.
In the above scenario, owing to factors such as the network environment and the screen hardware capabilities of the terminal devices, the dynamic ranges of the frames displayed by the first terminal device 101 and the second terminal device 102 during multi-screen collaboration may be inconsistent. For example, even when the shared frames of the first terminal device 101 and the second terminal device 102 have identical pixel values, if the display brightness of the first terminal device 101 is 100 nit while that of the second terminal device 102 is 500 nit, the two frames differ in brightness and present different visual effects to the user, resulting in a poor user experience.
In view of this, the embodiments of the present application provide a multi-screen collaborative display method, device, and storage medium. In the technical solution of the application, after a first terminal device establishes a multi-screen cooperative connection with a second terminal device, it obtains its screen casting content and the first screen display parameters of the second terminal device; it then determines target screen casting information corresponding to the screen casting content, where the target screen casting information includes target screen display parameters determined according to the first screen display parameters and the current second screen display parameters of the first terminal device; finally, it sends the target screen casting information to the second terminal device, so that the second terminal device displays the frame corresponding to the screen casting content according to the visual characteristics corresponding to the second screen display parameters. The multi-screen collaborative display method provided by the application enables different terminal devices to display the same frame with the same or a similar dynamic range, improving the user experience.
It should be understood that the terminal device involved in the embodiments of the present application may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a smart screen, an artificial intelligence (AI) speaker, a headset, a terminal in industrial control, a terminal in self-driving, a terminal in remote medical surgery, a terminal in a smart grid, a terminal in transportation safety, a terminal in a smart city, a terminal in a smart home, a personal digital assistant (PDA), and the like, which is not limited by the embodiments of the present application.
Fig. 2 is a schematic diagram of a system architecture of a terminal device according to an embodiment of the present application. As shown in fig. 2, the terminal device includes a processor 210, a memory 220, a transceiver 230, a display unit 240, an input unit 250, a sensor 260, an audio circuit 270, and a power module 280.
The processor 210 is the control center of the terminal device; it connects the various parts of the terminal device using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 220 and calling data stored in the memory 220, thereby monitoring the terminal device as a whole. Optionally, the processor 210 may include one or more processing units. Alternatively, the processor 210 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like; other processors may also be included, which are not enumerated here.
The memory 220 may be used to store software programs and modules, and the processor 210 executes the various functional applications and data processing of the terminal device by running the software programs and modules stored in the memory 220. The memory 220 mainly includes a program storage area and a data storage area: the program storage area may store the operating system and application programs required by at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the terminal device (such as audio data or a phonebook). In addition, the memory 220 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The transceiver 230 may provide wireless communication solutions applied to the terminal device, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The transceiver 230 may be one or more devices integrating at least one communication processing module; for example, an antenna and a baseband processor may be integrated into the transceiver 230, or an antenna and a modem processor may be integrated into the transceiver 230, which is not limited here.
The display unit 240 may be used to display information input by the user or information provided to the user, as well as the various menus of the terminal device. The display unit 240 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like, which is not limited here.
Illustratively, in the embodiment of the present application, the content displayed by the first terminal device 101 is displayed on the second terminal device 102 through the display unit 240.
The input unit 250 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the input unit 250 may collect the user's touch operations on or near it and drive the corresponding connection device according to a preset program. The input unit 250 may include a touch panel, which may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel, the input unit 250 may include other input devices, which may include, but are not limited to, one or more of function keys (such as volume control keys and switch keys), a trackball, a joystick, and the like.
The terminal device may also include at least one sensor 260, such as a gyroscope sensor, a motion sensor, and other sensors. The motion sensor may include an acceleration sensor, which detects the magnitude of acceleration in each direction and can detect the magnitude and direction of gravity when stationary; it can be used in applications that identify the posture of the terminal device, such as switching between landscape and portrait, related games, and magnetometer posture calibration. Other sensors that may further be configured in the terminal device, such as a pressure gauge, barometer, hygrometer, thermometer, infrared sensor, and fingerprint sensor, are not described here.
The audio circuit 270 may include a speaker and a microphone, providing an audio interface between the user and the terminal device. On one hand, the audio circuit 270 may transmit the electrical signal converted from received audio data to the speaker, where it is converted into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 270 and converted into audio data. The audio data is then output to the processor 210 for processing and transmitted, for example, to another terminal device, or output to the memory 220 for further processing.
The terminal device further includes a power module 280 for supplying power to the respective components. Optionally, the power module 280 may be logically connected to the processor 210 through a power management device, so as to implement functions such as charging management, discharging management, and power consumption management through the power management device.
Although not shown, the terminal device may further include a camera. Optionally, the position of the camera on the terminal device may be front-mounted or rear-mounted, which is not limited by the embodiment of the present application.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or fewer components than illustrated, or certain components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
In order to make the purpose and the technical scheme of the application clearer and more intuitive, the multi-screen collaborative display method, the device and the storage medium provided by the embodiment of the application are described in detail below with reference to the accompanying drawings and the embodiment. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
Referring to fig. 3, which is a flowchart of a multi-screen collaborative display method according to an embodiment of the application. The method may be applied to the application scenario shown in fig. 1, and may also be applied to other scenarios, which is not limited by the embodiment of the present application. For ease of explanation, the application of the method to the scenario shown in fig. 1 is taken as an example; accordingly, the first terminal device below is the first terminal device 101 shown in fig. 1, and the second terminal device below is the second terminal device 102 shown in fig. 1. The steps of the method shown in fig. 3 are described in detail below; the flow includes:
S301: After the first terminal device establishes a multi-screen cooperative connection with the second terminal device, the first terminal device obtains its screen casting content.
It should be understood that the first terminal device may establish the multi-screen cooperative connection with the second terminal device through a wireless network and/or Bluetooth, and both the first terminal device and the second terminal device may operate on the shared frame after casting.
S302: The second terminal device sends its first screen display parameters to the first terminal device.
Accordingly, the first terminal device receives the first screen display parameter.
It should be understood that the first screen display parameters include information such as the brightness and pixel values of the current frame of the second terminal device, as well as the maximum contrast ratio achievable by the hardware of the second terminal device, where the maximum contrast ratio is the ratio of the maximum screen brightness to the minimum screen brightness of the second terminal device. Of course, the first screen display parameters may be those listed or other parameters, which is not limited by the application.
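As a rough illustration of the parameters described above, the screen display parameters and the maximum contrast ratio could be modeled as follows. This is a sketch only; the field names are assumptions, since the patent does not define a concrete data structure.

```python
from dataclasses import dataclass

@dataclass
class ScreenDisplayParams:
    """Illustrative container for the screen display parameters exchanged
    during multi-screen collaboration (field names are assumptions)."""
    brightness_nit: float      # brightness of the currently displayed frame, in nit
    pixel_width: int
    pixel_height: int
    max_brightness_nit: float  # brightest level the panel hardware can reach
    min_brightness_nit: float  # darkest non-zero level the panel can reach

    @property
    def max_contrast(self) -> float:
        # Per the description: maximum contrast ratio =
        # maximum screen brightness / minimum screen brightness
        return self.max_brightness_nit / self.min_brightness_nit

# Example: a panel spanning 0.5-500 nit has a 1000:1 maximum contrast ratio
tv = ScreenDisplayParams(300.0, 3840, 2160, 500.0, 0.5)
print(tv.max_contrast)  # → 1000.0
```

A device sending S302 would serialize such a structure and transmit it over the cooperative connection.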
It should be noted that the embodiment of the application does not limit the number of terminal devices implementing multi-screen collaborative display.
S303: The first terminal device determines target screen casting information corresponding to the screen casting content, where the target screen casting information includes target screen display parameters determined according to the first screen display parameters and the current second screen display parameters of the first terminal device.
It should be understood that the current second screen display parameters of the first terminal device include information such as the brightness and pixel values of the frame currently displayed on the screen of the first terminal device. The target screen casting information corresponding to the screen casting content, determined by the first terminal device according to the second screen display parameters and the first screen display parameters, includes the original screen casting content of the first terminal device, or updated screen casting content of the first terminal device, together with the target screen display parameters.
The target screen display parameters include the current second screen display parameters of the first terminal device, or the screen display parameters associated with the updated screen casting content, such as screen brightness and pixel values.
It should be understood that the updated screen casting content of the first terminal device may be content obtained by adjusting parameters such as the pixel values and brightness of the original screen casting content, which is not limited by the application.
S304: The first terminal device sends the target screen casting information to the second terminal device, so that the second terminal device displays the frame corresponding to the screen casting content according to the visual characteristics corresponding to the second screen display parameters.
It should be understood that after determining the target screen casting information, the first terminal device encodes and packages it before sending it to the second terminal device.
Correspondingly, the second terminal device receives and decodes the target screen casting information, and then displays the frame corresponding to the screen casting content according to the visual characteristics corresponding to the second screen display parameters.
It should be understood that the visual characteristic corresponding to the second screen display parameter indicates the brightness of the first screen and the display characteristic of its pixels after the mapping process. Because human eyes have different sensitivities to different contrast and brightness ranges, mapping the chromaticity, brightness, dynamic range, and so on of an image into a standard range allows the image to preserve the human eye's visual perception of the real scene.
For example, suppose the mobile phone and the computer both currently show a white picture with the same pixel values, the brightness of the mobile phone is 500 nits, and the brightness of the computer is 100 nits. If the pixel values on the mobile phone are reduced by a factor of 5, the visual perception of the two screens becomes consistent, that is, the human eye perceives the content displayed on the mobile phone screen and the computer screen as essentially the same.
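The luminance-matching arithmetic in this example can be sketched as follows. This is a simplified linear model in Python; the function name and the linearity assumption are illustrative and not part of the patent:

```python
def pixel_scale_factor(source_nits: float, target_nits: float) -> float:
    """Factor to apply to pixel values on the brighter screen so its
    emitted luminance matches the dimmer screen.

    Assumes a linear model (emitted luminance ~ pixel value x peak
    brightness); real panels apply a gamma/EOTF curve on top of this.
    """
    if source_nits <= 0 or target_nits <= 0:
        raise ValueError("brightness must be positive")
    return target_nits / source_nits

# Phone at 500 nits matching a computer at 100 nits:
scale = pixel_scale_factor(500.0, 100.0)  # 0.2, i.e. pixel values reduced 5x
```

Under this model, dividing the phone's pixel values by 5 exactly cancels its 5x brightness advantage, which is the equivalence the example describes.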
In this technical scheme, the first terminal device determines the target screen-casting information according to its own current screen display parameter and the screen display parameter of the second terminal device, so that in a multi-screen collaborative scene the two devices have the same or a similar dynamic range when displaying the same picture, which improves the user experience.
Based on the above embodiment, fig. 4 is a flowchart of a multi-screen collaborative display method according to another embodiment of the present application. In the embodiment shown in fig. 4, the second screen display parameter includes the current screen contrast of the first terminal device and the first screen display parameter includes the maximum screen contrast of the second terminal device. Each step of the method shown in fig. 4 is detailed below, where the flow includes:
S401, after the first terminal device establishes a multi-screen cooperative connection with the second terminal device, the screen-casting content of the first terminal device is obtained.
This step is similar to step S301 in the embodiment shown in fig. 3, and will not be described here again.
S402, the second terminal device sends its first screen display parameter to the first terminal device.
Accordingly, the first terminal device receives the first screen display parameter.
This step is similar to step S302 in the embodiment shown in fig. 3, and will not be described here again.
S403, the first terminal device compares the current screen contrast of the first terminal device with the maximum screen contrast of the second terminal device.
It should be understood that the current screen contrast of the first terminal device is the ratio of the current display brightness of the first terminal device to its minimum brightness, and the maximum screen contrast of the second terminal device is the ratio of the maximum brightness to the minimum brightness that the screen of the second terminal device can display.
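The contrast definitions above can be sketched directly. The nit values below are hypothetical and the names are illustrative, not from the patent:

```python
def contrast_ratio(max_luminance_nits: float, min_luminance_nits: float) -> float:
    """Contrast as the brightest-to-darkest displayable luminance ratio."""
    if min_luminance_nits <= 0:
        raise ValueError("minimum luminance must be positive")
    return max_luminance_nits / min_luminance_nits

# Hypothetical panels: the casting device currently spans 500:0.5 nits,
# while the receiving device can span at most 300:0.5 nits.
current_contrast = contrast_ratio(500.0, 0.5)       # 1000:1
receiver_max_contrast = contrast_ratio(300.0, 0.5)  # 600:1
needs_preprocessing = current_contrast > receiver_max_contrast  # True
```

The comparison in the last line is exactly the check performed in step S403.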
S404, if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, the first terminal device preprocesses its screen-casting content and determines the target screen-casting information from the preprocessed content, where the screen contrast corresponding to the preprocessed content is less than or equal to the maximum screen contrast of the second terminal device.
It should be understood that if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, the screen hardware of the second terminal device cannot directly display the screen-casting content with the current screen display parameter of the first terminal device. In this case, the first terminal device needs to preprocess the screen-casting content according to the maximum screen contrast of the second terminal device and then determine the corresponding target screen display parameter to obtain the target screen-casting information, so that, within its hardware capability, the second terminal device displays with the visual characteristic corresponding to the current screen display parameter of the first terminal device.
The preprocessing comprises tone mapping processing on screen projection content of the first terminal equipment and/or adjusting screen brightness of the first terminal equipment.
It should be appreciated that tone mapping maps colors from an original tone, typically high dynamic range (HDR), to a target tone, typically low dynamic range (LDR); the result of the mapping is then displayed through a medium so as to restore the original scene as closely as possible under the visual characteristics of the human eye. The effect of tone mapping depends on the mapping from the original tone to the target tone, which may be a simple scaling or truncation, or an approximation of relationships established in photography.
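The three mapping choices just named, scaling, truncation, and a photography-inspired curve, can be sketched minimally as below. The Reinhard operator is used here only as one well-known example of the third kind; the patent does not prescribe any specific operator:

```python
def tone_map_scale(pixel: float, scale: float) -> float:
    """Simple scaling: multiply each value by a fixed factor."""
    return pixel * scale

def tone_map_clip(pixel: float, ceiling: float) -> float:
    """Simple truncation: clamp values above what the target can show."""
    return min(pixel, ceiling)

def tone_map_reinhard(luminance: float) -> float:
    """Reinhard-style global operator L / (1 + L): compresses an
    unbounded HDR luminance smoothly into the range [0, 1)."""
    return luminance / (1.0 + luminance)
```

Scaling preserves relative contrast but dims everything; truncation preserves mid-tones but loses highlight detail; curve-based operators trade between the two.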
For example, suppose the mobile phone and the computer both currently show a white picture with the same pixel values, the brightness of the mobile phone is 500 nits, and the brightness and maximum brightness of the computer are both 100 nits. Before casting to the computer, the mobile phone may tone map the current picture, for example reduce the pixel values by a factor of 5, then generate the target display parameter for the computer, namely display at a screen brightness of 100 nits, and finally obtain the target screen-casting information from the tone-mapped picture and the target display parameter. Alternatively, before casting, the mobile phone may adjust its current screen brightness, for example set it directly to 100 nits, or set it to 400 nits and reduce the pixel values by a factor of 4, then generate the target display parameter for the computer, namely display at a screen brightness of 100 nits, and finally obtain the target screen-casting information from the adjusted picture and the target display parameter.
S405, the first terminal device sends the target screen-casting information to the second terminal device, so that the second terminal device displays the picture corresponding to the screen-casting content with the visual characteristic corresponding to the second screen display parameter.
This step is similar to step S304 in the embodiment shown in fig. 3, and will not be described here again.
In one possible implementation, if the current screen contrast of the first terminal device is less than or equal to the maximum screen contrast of the second terminal device, the second terminal device can directly display the screen-casting content with the current screen display parameter of the first terminal device. In this case the target screen display parameter is determined to be the current second screen display parameter of the first terminal device, and the target screen-casting information includes the screen-casting content of the first terminal device and the second screen display parameter.
In this embodiment, the first terminal device preprocesses the screen-casting content when the screen hardware capability of the second terminal device is limited, so that the second terminal device can, within its hardware capability, display the screen-casting content with the visual characteristic corresponding to the current screen display parameter of the first terminal device, which improves the user experience.
It should also be understood that, in the foregoing embodiment, the screen-casting content of the first terminal device may include the picture currently displayed on the screen of the first terminal device or the rendering instructions corresponding to that picture; the form of the screen-casting content is not limited by the present application.
The embodiments shown in fig. 3 and fig. 4 take as an example the case where the first terminal device has the processing capability to determine the target screen-casting information. In a specific implementation, the second terminal device may also have this processing capability. The multi-screen collaborative display method of the present application is described below taking as an example the case where the second terminal device has the processing capability to determine the target screen-casting information.
Referring to fig. 5, fig. 5 is a flowchart of a multi-screen collaborative display method according to another embodiment of the present application. In the embodiment shown in fig. 5, the second terminal device has the processing capability to determine the target screen-casting information. Each step of the method shown in fig. 5 is detailed below, where the flow includes:
S501, after the first terminal device establishes a multi-screen cooperative connection with the second terminal device, the first terminal device sends its screen-casting content and its current first screen display parameter to the second terminal device.
Correspondingly, the second terminal device receives the screen-casting content of the first terminal device and the current first screen display parameter of the first terminal device.
It should be understood that the second terminal device may establish a multi-screen cooperative connection with the first terminal device through a wireless network and/or bluetooth, etc., and both the first terminal device and the second terminal device may operate on a shared screen.
It should also be understood that the current first screen display parameter of the first terminal device includes information such as brightness and pixels of the current screen of the first terminal device.
It should be noted that, the embodiment of the application does not limit the number of terminal devices for realizing multi-screen collaborative display.
S502, the second terminal equipment determines target screen projection information corresponding to the screen projection content, wherein the target screen projection information comprises target screen display parameters, and the target screen display parameters are determined according to the first screen display parameters and the second screen display parameters of the second terminal equipment.
It should be understood that the second screen display parameter of the second terminal device includes information such as the brightness and pixel values of the current screen of the second terminal device, as well as the maximum contrast its hardware can achieve, where the maximum contrast is the maximum screen brightness of the second terminal device divided by its minimum screen brightness. Of course, the second screen display parameter may include those listed or other parameters, which is not limited in the present application. The target screen-casting information that the second terminal device determines for the screen-casting content of the first terminal device according to the first and second screen display parameters includes either the original screen-casting content of the first terminal device or updated screen-casting content, together with the target screen display parameter.
The target screen display parameter includes the current first screen display parameter of the first terminal device, or the screen display parameters associated with the updated screen-casting content, such as screen brightness and pixel values.
It should be understood that the updated screen-casting content may be obtained by adjusting parameters such as the pixel values and brightness of the original screen-casting content of the first terminal device, which is not limited in the present application.
And S503, the second terminal equipment displays the picture corresponding to the screen throwing content according to the target screen throwing information and the visual characteristic corresponding to the first screen display parameter.
It is understood that the second terminal device displays the screen throwing content according to the target screen display parameters in the target screen throwing information, so as to realize multi-screen collaborative display.
In this embodiment, the second terminal device determines the target screen-casting information according to the current screen display parameter of the first terminal device and its own screen display parameter, so that in a multi-screen collaborative scene the two devices have the same or a similar dynamic range when displaying the same picture, which improves the user experience.
Based on the embodiment shown in fig. 5, fig. 6 is a flowchart of a multi-screen collaborative display method according to another embodiment of the present application. In the embodiment shown in fig. 6, the first screen display parameter includes the current screen contrast of the first terminal device and the second screen display parameter includes the maximum screen contrast of the second terminal device. Each step of the method shown in fig. 6 is detailed below, where the flow includes:
S601, after the first terminal device establishes a multi-screen cooperative connection with the second terminal device, the first terminal device sends its screen-casting content and its current first screen display parameter to the second terminal device.
Correspondingly, the second terminal device receives the screen-casting content of the first terminal device and the current first screen display parameter of the first terminal device.
This step is similar to step S501 in the embodiment shown in fig. 5, and will not be described here again.
S602, the second terminal device compares the current screen contrast of the first terminal device with the maximum screen contrast of the second terminal device.
It should be understood that the current screen contrast of the first terminal device is the ratio of the current display brightness of the first terminal device to its minimum brightness, and the maximum screen contrast of the second terminal device is the ratio of the maximum brightness to the minimum brightness that the screen of the second terminal device can display.
S603, if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, the second terminal device preprocesses the screen-casting content of the first terminal device and determines the target screen-casting information from the preprocessed content, where the screen contrast corresponding to the preprocessed content is less than or equal to the maximum screen contrast of the second terminal device.
It should be understood that if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, the screen hardware of the second terminal device cannot directly display the screen-casting content with the current screen display parameter of the first terminal device. In this case, the second terminal device needs to preprocess the screen-casting content according to its own maximum screen contrast and then determine the corresponding target screen display parameter to obtain the target screen-casting information, so that, within its hardware capability, the second terminal device displays with the visual characteristic corresponding to the current screen display parameter of the first terminal device.
The preprocessing comprises tone mapping processing of screen projection content of the first terminal equipment.
For example, suppose the tablet computer and the computer both currently show a white picture with the same pixel values, the brightness of the tablet is 100 nits, and the brightness of the computer is 500 nits. When the tablet casts to the computer, the computer, after receiving the screen-casting content, may tone map the corresponding picture, for example reduce the pixel values by a factor of 5, then determine the target screen display parameter, where the brightness parameter remains 500 nits, and finally obtain the target screen-casting information from the tone-mapped picture and the target display parameter.
In one possible implementation, if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, and the screen-casting content preprocessed by the second terminal device still cannot be displayed within the maximum screen contrast of the second terminal device, then, provided the first terminal device also has processing capability, the second terminal device may send a request message to the first terminal device. The request message requests the first terminal device to preprocess the screen-casting content so that the screen contrast corresponding to the preprocessed content is less than or equal to the maximum screen contrast of the second terminal device. The preprocessing includes tone mapping the screen-casting content and/or adjusting the screen brightness of the first terminal device.
It should be understood that when both the first terminal device and the second terminal device have processing capability, the terminal device specified in a preset configuration may be preferentially selected to preprocess the screen-casting content, and when the capability of one terminal device is limited, the assistance of the other terminal device may be requested, so that different terminal devices have the same or a similar dynamic range when displaying the same picture, which improves the user experience.
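The capability negotiation described above can be sketched as a small decision function. All names, return values, and the best-effort fallback are illustrative assumptions layered on the flow of steps S602-S603 and the fallback request:

```python
def plan_preprocessing(src_contrast: float, dst_max_contrast: float,
                       receiver_can_fit: bool, sender_can_process: bool) -> str:
    """Decide which side brings the cast frame within the receiver's
    contrast range; a sketch of the negotiation, not the patent's API."""
    if src_contrast <= dst_max_contrast:
        return "display_as_is"              # receiver can show it directly
    if receiver_can_fit:
        return "receiver_preprocess"        # receiver tone-maps locally
    if sender_can_process:
        return "request_sender_preprocess"  # send a request message to sender
    return "best_effort"                    # neither side can fully adapt
```

The ordering of the branches encodes the preference described above: display unmodified when possible, otherwise let the configured device preprocess, and only fall back to asking the other side when local capability runs out.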
And S604, the second terminal equipment displays a picture corresponding to the screen throwing content according to the target screen throwing information and the visual characteristic corresponding to the first screen display parameter.
This step is similar to step S503 in the embodiment shown in fig. 5, and will not be described here again.
In another possible implementation, if the current screen contrast of the first terminal device is less than or equal to the maximum screen contrast of the second terminal device, the second terminal device can directly display the screen-casting content with the current screen display parameter of the first terminal device. In this case the target screen display parameter is determined to be the current first screen display parameter of the first terminal device, and the target screen-casting information includes the screen-casting content of the first terminal device and the first screen display parameter.
In this embodiment, the second terminal device preprocesses the screen-casting content when its own screen hardware capability is limited, so that it can, within its hardware capability, display the screen-casting content with the visual characteristic corresponding to the current screen display parameter of the first terminal device, which improves the user experience.
It should also be understood that, in the foregoing embodiment, the screen-casting content of the first terminal device may include the picture currently displayed on the screen of the first terminal device or the rendering instructions corresponding to that picture; the form of the screen-casting content is not limited by the present application.
In summary, the multi-screen collaborative display method processes the screen-casting content in combination with the screen hardware capability of the terminal devices, ensuring that in a multi-screen collaborative scene the terminal devices have the same or a similar dynamic range when displaying the same picture, thereby improving the user experience.
It should also be understood that the foregoing embodiments may be combined with one another, and the application is not limited in this regard. The sequence numbers of the above processes do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not limit the implementation of the embodiments of the present application.
The multi-screen collaborative display method according to the embodiment of the present application is described in detail above with reference to fig. 1 to 6, and the multi-screen collaborative display device according to the embodiment of the present application is described in detail below with reference to fig. 7 to 9.
Referring to fig. 7 with reference to fig. 1 to fig. 4, fig. 7 is a schematic structural diagram of a multi-screen collaborative display device 700 according to an embodiment of the application, where the device 700 includes: an acquisition module 701, a determination module 702 and a transmission module 703.
The acquiring module 701 is configured to acquire screen content of the first terminal device and a first screen display parameter of the second terminal device after the first terminal device and the second terminal device establish multi-screen cooperative connection; a determining module 702, configured to determine target screen projection information corresponding to the screen projection content, where the target screen projection information includes a target screen display parameter, and the target screen display parameter is determined according to the first screen display parameter and a current second screen display parameter of the first terminal device; and a sending module 703, configured to send the target screen projection information to the second terminal device, so that the second terminal device displays a screen corresponding to the screen projection content according to the visual characteristic corresponding to the second screen display parameter.
In some embodiments, the second screen display parameter includes the current screen contrast of the first terminal device, the first screen display parameter includes the maximum screen contrast of the second terminal device, and the determining module 702 is specifically configured to: compare the current screen contrast of the first terminal device with the maximum screen contrast of the second terminal device; if the current screen contrast of the first terminal device is less than or equal to the maximum screen contrast of the second terminal device, determine that the target screen display parameter is the current second screen display parameter of the first terminal device, where the target screen-casting information includes the screen-casting content of the first terminal device and the second screen display parameter; if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, preprocess the screen-casting content of the first terminal device and determine the target screen-casting information from the preprocessed content, where the screen contrast corresponding to the preprocessed content is less than or equal to the maximum screen contrast of the second terminal device.
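The determining module's two branches can be sketched as follows. This is a simplified model that assumes brightness-only preprocessing (dim to the receiver's peak and scale pixels to compensate); the dataclass fields and function names are illustrative, not the patent's API:

```python
from dataclasses import dataclass

@dataclass
class TargetCastInfo:
    pixel_scale: float   # factor applied to the cast frame's pixel values
    display_nits: float  # brightness the receiver is asked to use

def determine_target_cast_info(src_nits: float, src_min_nits: float,
                               dst_max_nits: float,
                               dst_min_nits: float) -> TargetCastInfo:
    """Pass the sender's parameters through when the receiver's contrast
    suffices; otherwise preprocess so the content fits the receiver."""
    src_contrast = src_nits / src_min_nits
    dst_max_contrast = dst_max_nits / dst_min_nits
    if src_contrast <= dst_max_contrast:
        # Receiver can display directly with the sender's parameters.
        return TargetCastInfo(pixel_scale=1.0, display_nits=src_nits)
    # Preprocess: dim to the receiver's peak, scale pixels to compensate.
    return TargetCastInfo(pixel_scale=dst_max_nits / src_nits,
                          display_nits=dst_max_nits)
```

For a 500-nit sender and a 100-nit receiver (both with 0.5-nit black level), this returns a 0.2 pixel scale at 100 nits, matching the worked example earlier in the text.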
In some embodiments, the preprocessing includes tone mapping the screen content of the first terminal device and/or adjusting the screen brightness of the first terminal device.
In some embodiments, the screen-casting content includes a screen displayed currently by the first terminal device or a rendering instruction for indicating the screen displayed currently by the first terminal device.
Referring to fig. 8 with reference to fig. 5 and 6, fig. 8 is a schematic structural diagram of a multi-screen collaborative display device 800 according to another embodiment of the present application, the device 800 includes: an acquisition module 801, a determination module 802, and a display module 803.
The acquiring module 801 is configured to acquire, after the second terminal device establishes a multi-screen cooperative connection with the first terminal device, screen content of the first terminal device and a current first screen display parameter of the first terminal device; a determining module 802, configured to determine target screen projection information corresponding to the screen projection content, where the target screen projection information includes a target screen display parameter, and the target screen display parameter is determined according to the first screen display parameter and a second screen display parameter of the second terminal device; and the display module 803 is configured to display a picture corresponding to the screen-casting content according to the target screen-casting information and with a visual characteristic corresponding to the first screen display parameter.
In some embodiments, the first screen display parameter includes the current screen contrast of the first terminal device, the second screen display parameter includes the maximum screen contrast of the second terminal device, and the determining module 802 is specifically configured to: compare the current screen contrast of the first terminal device with the maximum screen contrast of the second terminal device; if the current screen contrast of the first terminal device is less than or equal to the maximum screen contrast of the second terminal device, determine that the target screen display parameter is the current first screen display parameter of the first terminal device, where the target screen-casting information includes the screen-casting content of the first terminal device and the first screen display parameter; if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, preprocess the screen-casting content of the first terminal device and determine the target screen-casting information from the preprocessed content, where the screen contrast corresponding to the preprocessed content is less than or equal to the maximum screen contrast of the second terminal device.
In some embodiments, the apparatus further comprises a sending module and a receiving module. The sending module is configured to send a request message to the first terminal device if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device and the preprocessed screen-casting content still cannot be displayed within the maximum screen contrast of the second terminal device; the request message requests the first terminal device to preprocess the screen-casting content so that the screen contrast corresponding to the preprocessed content is less than or equal to the maximum screen contrast of the second terminal device. The receiving module is configured to receive the preprocessed screen-casting content.
In some embodiments, the preprocessing includes tone mapping the screen content of the first terminal device and/or adjusting the screen brightness of the first terminal device.
In some embodiments, the screen-casting content includes a screen displayed currently by the first terminal device or a rendering instruction for indicating the screen displayed currently by the first terminal device.
It should be appreciated that the apparatus 700 and apparatus 800 herein are embodied in the form of functional modules. The term module herein may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an alternative example, it will be understood by those skilled in the art that the apparatus 700 may be specifically a first terminal device in the foregoing embodiment, or the functions of the first terminal device in the foregoing embodiment may be integrated in the apparatus 700, and the apparatus 700 may be used to perform each flow and/or step corresponding to the first terminal device in the foregoing method embodiment, which is not described herein for avoiding repetition. In another alternative example, it will be understood by those skilled in the art that the apparatus 800 may be specifically the second terminal device in the foregoing embodiment, or the functions of the second terminal device in the foregoing embodiment may be integrated in the apparatus 800, and the apparatus 800 may be used to execute each flow and/or step corresponding to the second terminal device in the foregoing method embodiment, which is not described herein for avoiding repetition.
The apparatus 700 has a function of implementing the corresponding step performed by the first terminal device in the method, and the apparatus 800 has a function of implementing the corresponding step performed by the second terminal device in the method; the above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
Fig. 9 is a schematic structural view of an apparatus according to another embodiment of the present application. The apparatus shown in fig. 9 may be used to perform the method of any of the previous embodiments.
As shown in fig. 9, the apparatus 900 of the present embodiment includes: memory 901, processor 902, communication interface 903, and bus 904. The memory 901, the processor 902, and the communication interface 903 are communicatively connected to each other via a bus 904.
The memory 901 may be a read-only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM). The memory 901 may store a program; when the program stored in the memory 901 is executed by the processor 902, the processor 902 is configured to perform the steps of the method shown in the foregoing embodiments.
The processor 902 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for executing associated programs, so as to perform the methods illustrated in the embodiments of the present application.
The processor 902 may also be an integrated circuit chip with signal processing capability. In implementation, the steps of the methods of the embodiments of the present application may be completed by integrated logic circuits in hardware in the processor 902 or by instructions in the form of software.
The processor 902 may also be a general-purpose processor, a digital signal processor (DSP), an ASIC, a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The steps of the method disclosed in connection with the embodiments of the present application may be performed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 901; the processor 902 reads the information in the memory 901 and, in combination with its hardware, performs the functions required of the units included in the apparatus of the present application.
The communication interface 903 uses a transceiver apparatus, such as but not limited to a transceiver, to enable communication between the apparatus 900 and other devices or communication networks.
The bus 904 may include a path for transferring information between various components of the apparatus 900 (e.g., the memory 901, the processor 902, the communication interface 903).
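The component relationship described above can be illustrated with a minimal sketch. This is not from the patent: all class and attribute names are hypothetical, and it only models a processor executing a program held in memory and emitting results through a communication interface, with the bus implied by shared object access.

```python
# Illustrative model (hypothetical names) of the apparatus 900:
# memory 901 stores a program, processor 902 executes it, and
# results are sent out via communication interface 903.

class Apparatus900:
    def __init__(self, program):
        self.memory = {"program": program, "data": {}}  # memory 901
        self.outbox = []                                # communication interface 903

    def run(self):
        # processor 902: read the stored program and execute its steps
        for step in self.memory["program"]:
            result = step(self.memory["data"])
            if result is not None:
                self.outbox.append(result)  # transmit via interface 903
        return self.outbox
```

For example, an apparatus built with a single step that produces one frame of cast content would return that frame from `run()`.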
It should be understood that the apparatus 900 shown in the embodiment of the present application may be an electronic device, or may be a chip configured in an electronic device.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific working processes of the above-described systems, apparatuses, and units, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division of the units is merely a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections via some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely a specific implementation of the present application, and the protection scope of the present application is not limited thereto. Any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed by the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A multi-screen collaborative display method, characterized in that the method is applied to a first terminal device and comprises:
after the first terminal device establishes a multi-screen collaborative connection with a second terminal device, acquiring screen projection content of the first terminal device and a first screen display parameter of the second terminal device;
determining, by the first terminal device, target screen projection information corresponding to the screen projection content, wherein the target screen projection information comprises a target screen display parameter, and the target screen display parameter is determined according to the first screen display parameter and a current second screen display parameter of the first terminal device; and
sending the target screen projection information to the second terminal device, so that the second terminal device displays a picture corresponding to the screen projection content according to a visual characteristic corresponding to the second screen display parameter.
2. The method according to claim 1, wherein the second screen display parameter comprises a current screen contrast of the first terminal device, the first screen display parameter comprises a maximum screen contrast of the second terminal device, and the determining the target screen projection information corresponding to the screen projection content comprises:
comparing, by the first terminal device, the current screen contrast of the first terminal device with the maximum screen contrast of the second terminal device;
if the current screen contrast of the first terminal device is less than or equal to the maximum screen contrast of the second terminal device, determining that the target screen display parameter is the current second screen display parameter of the first terminal device, wherein the target screen projection information comprises the screen projection content of the first terminal device and the second screen display parameter; and
if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, preprocessing the screen projection content of the first terminal device and determining the target screen projection information according to the preprocessed screen projection content, wherein a screen contrast corresponding to the preprocessed screen projection content is less than or equal to the maximum screen contrast of the second terminal device.
3. The method according to claim 2, wherein the preprocessing comprises tone mapping the screen projection content of the first terminal device and/or adjusting the screen brightness of the first terminal device.
4. The method according to any one of claims 1 to 3, wherein the screen projection content comprises a picture currently displayed by the first terminal device or a rendering instruction for indicating the picture currently displayed by the first terminal device.
5. A multi-screen collaborative display method, characterized in that the method is applied to a second terminal device and comprises:
after the second terminal device establishes a multi-screen collaborative connection with a first terminal device, acquiring screen projection content of the first terminal device and a current first screen display parameter of the first terminal device;
determining, by the second terminal device, target screen projection information corresponding to the screen projection content, wherein the target screen projection information comprises a target screen display parameter, and the target screen display parameter is determined according to the first screen display parameter and a second screen display parameter of the second terminal device; and
displaying, by the second terminal device, a picture corresponding to the screen projection content according to the target screen projection information and a visual characteristic corresponding to the first screen display parameter.
6. The method according to claim 5, wherein the first screen display parameter comprises a current screen contrast of the first terminal device, the second screen display parameter comprises a maximum screen contrast of the second terminal device, and the determining the target screen projection information corresponding to the screen projection content comprises:
comparing, by the second terminal device, the current screen contrast of the first terminal device with the maximum screen contrast of the second terminal device;
if the current screen contrast of the first terminal device is less than or equal to the maximum screen contrast of the second terminal device, determining that the target screen display parameter is the current first screen display parameter of the first terminal device, wherein the target screen projection information comprises the screen projection content of the first terminal device and the first screen display parameter; and
if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device, preprocessing, by the second terminal device, the screen projection content of the first terminal device and determining the target screen projection information according to the preprocessed screen projection content, wherein a screen contrast corresponding to the preprocessed screen projection content is less than or equal to the maximum screen contrast of the second terminal device.
7. The method according to claim 6, wherein the method further comprises:
if the current screen contrast of the first terminal device is greater than the maximum screen contrast of the second terminal device and the preprocessed screen projection content cannot be displayed at a screen contrast less than or equal to the maximum screen contrast of the second terminal device, sending a request message to the first terminal device, wherein the request message is used for requesting the first terminal device to preprocess the screen projection content so that the preprocessed screen projection content can be displayed at a screen contrast less than or equal to the maximum screen contrast of the second terminal device; and
receiving the preprocessed screen projection content.
8. The method according to claim 6 or 7, wherein the preprocessing comprises tone mapping the screen projection content of the first terminal device and/or adjusting the screen brightness of the first terminal device.
9. The method according to any one of claims 5 to 8, wherein the screen projection content comprises a picture currently displayed by the first terminal device or a rendering instruction for indicating the picture currently displayed by the first terminal device.
10. A multi-screen collaborative display apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire screen projection content of a first terminal device and a first screen display parameter of a second terminal device after the first terminal device establishes a multi-screen collaborative connection with the second terminal device;
a determining module, configured to determine target screen projection information corresponding to the screen projection content, wherein the target screen projection information comprises a target screen display parameter, and the target screen display parameter is determined according to the first screen display parameter and a current second screen display parameter of the first terminal device; and
a sending module, configured to send the target screen projection information to the second terminal device, so that the second terminal device displays a picture corresponding to the screen projection content according to a visual characteristic corresponding to the second screen display parameter.
11. A multi-screen collaborative display apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire screen projection content of a first terminal device and a current first screen display parameter of the first terminal device after a second terminal device establishes a multi-screen collaborative connection with the first terminal device;
a determining module, configured to determine target screen projection information corresponding to the screen projection content, wherein the target screen projection information comprises a target screen display parameter, and the target screen display parameter is determined according to the first screen display parameter and a second screen display parameter of the second terminal device; and
a display module, configured to display a picture corresponding to the screen projection content according to the target screen projection information and a visual characteristic corresponding to the first screen display parameter.
12. A terminal device, comprising a processor and a memory, wherein the memory is configured to store code instructions, and the processor is configured to execute the code instructions to perform the method according to any one of claims 1 to 4 or 5 to 9.
13. A computer-readable storage medium, storing a computer program, wherein the computer program comprises instructions for implementing the method according to any one of claims 1 to 4 or 5 to 9.
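The contrast decision at the heart of claims 2 and 6 can be sketched as follows. This is a hedged illustration, not the patent's implementation: the function name, the representation of cast content as a list of luminance values, and the linear-scaling tone map are all assumptions made for clarity.

```python
# Hypothetical sketch of the claimed contrast comparison: if the
# sender's current screen contrast fits within the receiver's maximum,
# the content passes through unchanged; otherwise the content is
# preprocessed (here, a simple linear tone map that compresses the
# luminance range) so its effective contrast does not exceed the
# receiver's maximum, and the target projection info is built from it.

def build_projection_info(frame, sender_contrast, receiver_max_contrast):
    """frame: list of luminance values in [0.0, 1.0] (illustrative)."""
    if sender_contrast <= receiver_max_contrast:
        # target screen display parameter = sender's current parameter
        return {"content": frame, "contrast": sender_contrast}
    # preprocess: scale luminance so the contrast fits the receiver
    scale = receiver_max_contrast / sender_contrast
    mapped = [lum * scale for lum in frame]
    return {"content": mapped, "contrast": receiver_max_contrast}
```

Under this sketch, casting HDR-like content with contrast 2000:1 to a screen capped at 1000:1 halves each luminance value, while content already within the receiver's range is forwarded as-is.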
CN202211485787.6A 2022-11-24 2022-11-24 Multi-screen collaborative display method, device and storage medium Pending CN118075528A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211485787.6A CN118075528A (en) 2022-11-24 2022-11-24 Multi-screen collaborative display method, device and storage medium


Publications (1)

Publication Number Publication Date
CN118075528A true CN118075528A (en) 2024-05-24

Family

ID=91096085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211485787.6A Pending CN118075528A (en) 2022-11-24 2022-11-24 Multi-screen collaborative display method, device and storage medium

Country Status (1)

Country Link
CN (1) CN118075528A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination