CN112667184A - Display device - Google Patents

Display device

Info

Publication number
CN112667184A
Authority
CN
China
Prior art keywords
image
screen
display
black block
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110115837.0A
Other languages
Chinese (zh)
Inventor
张伟丽
徐建锋
臧晓华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vidaa Netherlands International Holdings BV
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Media Network Technology Co Ltd
Priority to CN202110115837.0A
Publication of CN112667184A
Priority to PCT/CN2022/074016 (published as WO2022161401A1)
Legal status: Pending

Abstract

The display device shown in the embodiments of the present application is suitable for the display end and includes a display and a controller. When the controller receives a screen image, it identifies the screen image and deletes the black block image of the screen image to obtain a media asset image; it then scales the media asset image according to the resolution of the display to obtain a screen projection image. Because the display device shown in this embodiment deletes the black block image of the screen image before the screen projection image is scaled and displayed, the black block image around the picture is removed, so that when the screen projection image is displayed at the display end, at least two of its sides have no black border and the user experience is better.

Description

Display device
Technical Field
The application relates to the technical field of file display, in particular to a display device.
Background
A display device has an independent operating system and supports function expansion. Various applications can be installed on a smart television according to the needs of the user, for example traditional video applications, social applications such as short video, and reading applications such as comics and book reading. Display devices can also interact with one another, for example through data transmission, resource sharing or screen projection. Taking screen projection as an example of the interaction process between display devices, a first display device can be connected with a second display device through a local area network, Bluetooth or another wireless communication mode, so that the first display device can share resources with the second display device and the display of the second display device can display the resources shared by the first display device.
In this embodiment, the end that shares the resource is referred to as the Source end (also referred to as the screen projection end in this embodiment), and the end that displays the resource is referred to as the Sink end (also referred to as the display end in this embodiment). During resource sharing, the Source end captures video frames of the resource, compresses and encodes the captured video, and transmits it to the Sink end. The Sink end decompresses the received compressed file and displays the decompressed picture.
However, a phenomenon is often encountered during screen projection: the image from the Source end cannot be displayed normally on the screen of the Sink end, the periphery of the picture displayed at the Sink end is black, and the user experience is poor.
Disclosure of Invention
In order to solve technical problems in the prior art, embodiments of the present application illustrate a display device.
A first aspect of an embodiment of the present application shows a display device, where the display device is suitable for a display end, and the display device includes:
a display;
a controller configured to:
receiving a screen image transmitted by a screen projection end;
deleting the black block image of the screen image to obtain a media asset image;
zooming the media asset image according to the resolution of the display to obtain a screen projection image;
and controlling the display to display the screen projection image.
The display device shown in the embodiments of the present application is suitable for the display end and includes a display and a controller. When the controller receives a screen image, it identifies the screen image and deletes the black block image of the screen image to obtain a media asset image; it then scales the media asset image according to the resolution of the display to obtain a screen projection image. Because the display device shown in this embodiment deletes the black block image of the screen image before the screen projection image is scaled and displayed, the black block image around the picture is removed, so that when the screen projection image is displayed at the display end, at least two of its sides have no black border and the user experience is better.
A second aspect of the embodiments of the present application shows a display device, where the display device is suitable for a screen projection end, and the display device includes:
a controller configured to:
capturing a screen image in response to a screen projection starting instruction input by a user;
deleting the black block image of the screen image to obtain a media asset image;
and outputting the media resource image to a display end so that the display end scales the media resource image according to the resolution of the display end.
The display device shown in the embodiments of the present application is suitable for the screen projection end and includes a display and a controller. In response to a screen projection start instruction input by a user, the controller first captures a screen image and then deletes the black block image of the screen image to obtain a media asset image. The display device shown in this embodiment deletes the black block image of the screen image before image transmission, so that a media asset image is obtained; the media asset image is then transmitted to the display end so that the display end can scale it. Because the black block image around the media asset image has been removed, the scaled image has no black block image around it, so that when the screen projection image is displayed at the display end, at least two of its sides have no black border and the user experience is better.
Drawings
In order to more clearly illustrate the embodiments of the present application or the implementations in the related art, the drawings required for describing the embodiments or the related art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
fig. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
fig. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram in the display device 200 according to some embodiments;
FIG. 5 illustrates an interface of a video-on-demand program, which may be directly accessed by a display device after being started;
FIG. 6 is a flowchart illustrating interaction between a presentation side and a screen projection side according to one possible embodiment;
FIG. 7 is a schematic diagram illustrating a change of a screen where a media asset image is located during a screen projection process according to a feasible embodiment;
FIG. 8 is a flowchart illustrating interaction between a presentation side and a screen projection side according to one possible embodiment;
FIG. 9 is a diagram illustrating a variation of a screen where a media asset image is located during a screen projection process according to a feasible embodiment;
FIG. 10 is a flow chart illustrating a screen image processing method according to one possible embodiment;
FIG. 11 is a flowchart illustrating an implementation of determining the screen image capture time according to one possible embodiment;
FIG. 12 is a flowchart of an implementation in which the screen projection end controller captures a screen image when the asset type is non-video;
FIG. 13 is a flowchart of an implementation in which the screen projection end controller captures a screen image when the asset type is video;
FIG. 14 is a flow chart illustrating a manner of generating a media asset image according to one possible embodiment;
FIG. 15 is a diagram illustrating a screen-up presentation interface and screen images according to one possible embodiment;
FIG. 16 is a diagram illustrating a variation process of a screen image captured by a screen-projecting end during a photo playing process according to an embodiment;
FIG. 17 illustrates a variation process of a screen image captured by a screen-projection end during a video playing process according to a possible embodiment;
FIG. 18 is a flow chart illustrating a manner of identifying black block images according to one possible embodiment;
FIG. 19 is a schematic diagram of a screen image shown in accordance with one possible embodiment;
FIG. 20 is a flow chart illustrating a manner of identifying black block images according to one possible embodiment;
FIG. 21 is a flow chart illustrating the manner in which a black block image is identified, according to one possible embodiment;
FIG. 22 is a schematic view of a screen image shown in accordance with a possible embodiment;
fig. 23 is a flowchart illustrating a method of identifying a black block image according to a possible embodiment.
Detailed Description
To make the purpose and embodiments of the present application clearer, the following will clearly and completely describe the exemplary embodiments of the present application with reference to the attached drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes at least one of an infrared protocol communication or a bluetooth protocol communication, and other short-distance communication methods, and controls the display device 200 in a wireless or wired manner. The user may control the display apparatus 200 by inputting a user instruction through at least one of a key on a remote controller, a voice input, a control panel input, and the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in a manner other than the control apparatus 100 and the smart device 300, for example, the voice instruction control of the user may be directly received by a module configured inside the display device 200 to obtain a voice instruction, or may be received by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display apparatus 200. The server 400 may be a cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display; it receives image signals output from the controller and displays video content, image content, menu manipulation interfaces, user manipulation UI interfaces, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals via wired or wireless reception, and demodulates audio/video signals as well as EPG data signals from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different separate devices, that is, the tuner demodulator 210 may also be located in a device external to the main device in which the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other actionable control. The operations related to the selected object are: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to an icon.
In some embodiments the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a RAM Random Access Memory (RAM), a ROM (Read-Only Memory), a first to nth interface for input/output, a communication Bus (Bus), and the like.
The CPU processor executes operating system and application instructions stored in the memory, and executes various applications, data and content according to various interactive instructions received from external input, so as to finally display and play various audio and video content. The CPU processor may include a plurality of processors, for example a main processor and one or more sub-processors.
In some embodiments, a graphics processor for generating various graphics objects, such as: at least one of an icon, an operation menu, and a user input instruction display figure. The graphic processor comprises an arithmetic unit, which performs operation by receiving various interactive instructions input by a user and displays various objects according to display attributes; the system also comprises a renderer for rendering various objects obtained based on the arithmetic unit, wherein the rendered objects are used for being displayed on a display.
In some embodiments, the video processor is configured to receive an external video signal, and perform at least one of video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image composition module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module is used for demultiplexing the input audio and video data stream. And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like. And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display. And the frame rate conversion module is used for converting the frame rate of the input video. And the display formatting module is used for converting the received video output signal after the frame rate conversion, and changing the signal to be in accordance with the signal of the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, a system of a display device may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals and inter-process communication (IPC) are operated and maintained. After the kernel is started, the shell and user applications are loaded. An application is compiled into machine code after being started, forming a thread.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (referred to as the "application layer"), an Application Framework layer (referred to as the "framework layer"), an Android runtime and system library layer (referred to as the "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application runs in the application layer, and the applications may be a Window (Window) program carried by an operating system, a system setting program, a clock program or the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications of the application layer. The application framework layer includes some predefined functions. The application framework layer acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application can access the resources in the system and obtain the services of the system during execution.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications and the general navigation fallback functions, such as controlling exit, open, fallback, etc. of the applications. The window manager is used for managing all window programs, such as acquiring the size of a display screen, judging whether a status bar exists, locking the screen, capturing the screen, controlling the change of the display window (for example, reducing the display window, shaking the display, distorting the display, and the like), and the like.
In some embodiments, the system runtime layer provides support for the upper layer, i.e., the framework layer; when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), power driver, and the like.
In some embodiments, the display device may directly enter the interface of the preset vod program after being activated, and the interface of the vod program may include at least a navigation bar 510 and a content display area located below the navigation bar 510, as shown in fig. 5, where the content displayed in the content display area may change according to the change of the selected control in the navigation bar. The program in the application layer can be integrated in the video-on-demand program and displayed through one control of the navigation bar, and can also be further displayed after the application control in the navigation bar is selected.
In some embodiments, the display device may directly enter the display interface of the signal source selected last time after being started, or a signal source selection interface, where the signal source may be a preset video-on-demand program, or may be at least one of an HDMI interface, a live TV interface, and the like; after the user selects a different signal source, the display may display the content obtained from that signal source.
The display device shown in this embodiment may have a resource sharing function, where the end that shares the resource is referred to as the Source end (also referred to as the screen projection end in this embodiment), and the end that displays the resource is referred to as the Sink end (also referred to as the display end in this embodiment). During resource sharing, the Source end captures video frames of the resource, compresses and encodes the captured video, and transmits it to the Sink end. The Sink end decompresses the received compressed file and displays the decompressed picture.
However, a phenomenon is often encountered during screen projection: the picture from the Source end cannot be displayed normally on the screen of the Sink end, the periphery of the picture displayed at the Sink end is black, and the user experience is poor.
In order to solve the above technical problem, an embodiment of the present application illustrates a display device, where the display device is suitable for a display end, and an interaction process between the display end and a screen projection end may refer to fig. 6.
Wherein the screen projection end is configured to perform step S101 to transmit a screen image;
in the scheme shown in this embodiment, the screen image is an image obtained by screen capturing of the display interface of the screen projection terminal. When the resolution of the media asset image displayed by the screen projection end is inconsistent with the resolution of the display of the screen projection end, black edges appear around the screen projection image of the screen projection end. According to the technical scheme shown in the embodiment of the application, the screen image is divided into the black block image and the asset image, wherein the asset image is a picture of the display equipment playing the asset, and the black block image is an image around the asset image.
In the technical solution shown in this embodiment, the display end may include a display (which may be referred to as a display end display in this embodiment for convenience of distinction) and a controller (which may be referred to as a display end controller in this embodiment for convenience of distinction);
in response to receiving the screen image transmitted by the screen projection terminal, the display terminal controller is configured to execute step S102 to delete the black block image of the screen image, so as to obtain a media asset image;
in the practical application process, there are various implementation manners for identifying the black block image. For example, in some possible implementations, the presentation side controller may pre-store black block image pixel values. And in response to receiving the screen image transmitted by the screen projection end, the display end controller traverses the pixel value of each pixel point of the screen image, and determines the image with the pixel value equal to the pixel value of the black block image as the black block image. The embodiment is merely an exemplary way to describe an implementation manner of identifying a black block image, and in the process of practical application, the implementation manner of identifying a black block image may be, but is not limited to, the above manner.
The display end controller is configured to execute step S103 to zoom the media asset image according to the resolution of the display end display to obtain a screen projection image;
the scaling multiple of the media asset image in the technical scheme shown in the embodiment of the application is determined by the ratio of the display resolution of the display end to the media asset image resolution. Specifically, in a feasible embodiment, the presentation end controller may calculate a ratio of a resolution of the presentation end display in the width direction to a resolution of the media asset image in the width direction, to obtain a first ratio; meanwhile, the display end controller can calculate the ratio of the resolution of the display end display in the high direction to the resolution of the media asset image in the high direction to obtain a second ratio; and the display end controller selects a smaller value from the first ratio and the second ratio as a scaling factor.
For example, in a possible embodiment, the resolution of the media asset image is 1080 × 360 and the resolution of the display end display is 1920 × 1080. In this embodiment, the display end controller calculates the first ratio 1920/1080 ≈ 1.7 and the second ratio 1080/360 = 3. In this embodiment, 1.7 is selected as the zoom factor; the display end controller controls the media asset image to be enlarged by 1.7 times to obtain a 1920 × 640 media asset image, and controls the display to present the 1920 × 640 media asset image.
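The ratio selection described above can be sketched as follows; the function name and the direct use of pixel dimensions are illustrative assumptions rather than the embodiment's wording.

```python
def scale_factor(display_w: int, display_h: int, asset_w: int, asset_h: int) -> float:
    """Return the zoom factor: the smaller of the width ratio (first ratio) and
    the height ratio (second ratio), so the scaled media asset image fits the display."""
    first_ratio = display_w / asset_w     # width direction
    second_ratio = display_h / asset_h    # height direction
    return min(first_ratio, second_ratio)

# Worked example from the embodiment above:
# scale_factor(1920, 1080, 1080, 360) -> min(1.78, 3.0), i.e. roughly the 1.7 used above
```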
S104, controlling the display to display the screen projection image.
The display device shown in this embodiment is further described below with reference to specific examples:
referring to fig. 7, fig. 7 is a schematic diagram illustrating a change of a screen where a media asset image is located in a screen projection process according to a feasible embodiment. In this embodiment, the screen projection end is a display device displayed in a vertical screen, and the resolution of the screen projection end is 1080 × 1920, which can be specifically referred to as a schematic diagram 11 in fig. 7; in response to a screen capture instruction input by a user, capturing a screen image at a resolution of 1080 × 1920 by the screen projection end, which can be specifically referred to as a schematic diagram 12 in fig. 7; the screen projection end sends the resolution of the screen image to be 1080 × 1920 to the display end, so that the display end can set a transmission protocol based on the resolution of the screen shot image (the transmission protocol at least comprises a compression resolution). In the embodiment, the compression resolution is 430 × 1080, so the resolution of the screenshot image transmitted to the display end is 430 × 1080, which can be specifically referred to as the schematic diagram 13 in fig. 7; the display end controller deletes the black block image of the screen image to obtain a media asset image, wherein the resolution of the media asset image is 430 × 144, which can be specifically referred to as a schematic diagram 14 in fig. 7; the display end controller calculates the zoom factor of the media asset image to be 4.4 according to the ratio of the resolution of the display end display to the resolution of the media asset image, and controls the media asset image to be amplified by 4.4 times to obtain a screen projection image; finally, the display at the display end is controlled to display the projection image, which can be specifically referred to as a schematic diagram 15 in fig. 7.
The display device that this application embodiment shows, display device is applicable to the show end, includes: a display and a controller. When the controller receives the screen image, the controller can identify the screen image and delete the black block image of the screen image to obtain a media asset image; and finally, zooming the media asset image according to the resolution of the display to obtain a screen projection image. The display device shown in this embodiment deletes the black block image of the screen image before screen projection to obtain a screen projection image; and finally, zooming the screen projection image, wherein the black block image around the screen projection image is removed, so that the screen projection image can be displayed at the display end, and the user experience feeling is better without black edges on at least two sides.
A second aspect of the embodiment of the present application shows a display device, where the display device is suitable for a screen projection end, and an interaction process between a display end and the screen projection end may refer to fig. 8;
in response to a screen projection starting instruction input by a user, the screen projection terminal is configured to execute step S201 to capture a screen image;
the implementation manner of capturing the screen image may adopt an image capturing manner which is customary in the art, and the applicant does not make much limitation here.
The screen projection terminal is configured to execute step S202 to delete the black block image of the screen image to obtain a media asset image;
the implementation of deleting the black block image in the screen image may refer to the above embodiments, and the applicant does not repeat here.
The screen projection end is configured to execute step S203 to output the media asset image to the display end, so that the display end scales the media asset image according to the resolution of the display end.
In this embodiment, the transmission mode of the media asset image may adopt a data transmission mode commonly used in the art, for example, bluetooth transmission, network transmission, and the like.
In this embodiment, the display end includes a display (referred to as the display end display in this embodiment for convenience of distinction) and a controller (referred to as the display end controller in this embodiment for convenience of distinction).
In response to receiving the screen image transmitted by the screen projection terminal, the display terminal controller is configured to execute step S204 to zoom the media asset image according to the resolution of the display terminal display, so as to obtain a screen projection image;
the implementation manner of the display end controller scaling the media asset image according to the resolution of the display end display may refer to the above embodiments, and details are not described in this application.
The display end controller is configured to execute step S205 to control the display to display the screen projection image.
The display device shown in this embodiment is further described below with reference to specific examples:
referring to fig. 9, fig. 9 is a schematic diagram illustrating a change of a screen where a media asset image is located in a screen projection process according to a feasible embodiment. In this embodiment, the screen projection end is a display device displayed in a vertical screen, and the resolution of the display at the screen projection end is 1080 × 1920, which can be specifically referred to as a schematic diagram 21 in fig. 9; in response to a screen capture instruction input by a user, the screen projection end controller captures a screen image, wherein the resolution of the screen image is 1080 × 1920, and particularly, the schematic diagram 22 in fig. 9 can be referred to; the screen projection controller deletes the black block image of the screen image to obtain a media asset image, where the resolution of the media asset image is 1080 × 360, which can be specifically referred to as a schematic diagram 23 in fig. 9. The screen projection end controller sends the resolution of the asset image 430 × 144 to the display end, so that the display end can set a transmission protocol (the transmission protocol at least includes a compression resolution) based on the resolution of the asset image, where the compression resolution is 1080 × 360 in this embodiment, and therefore the resolution of the asset image transmitted to the display end is 1080 × 360, which can be specifically referred to the schematic diagram 24 in fig. 9; the display end controller calculates the zoom factor of the media asset image to be 1.7 according to the ratio of the resolution of the display end display to the resolution of the media asset image, and controls the media asset image to be magnified by 1.7 times to obtain a screen projection image; finally, the display at the display end is controlled to display the projection image, which can be specifically referred to as a schematic diagram 25 in fig. 9.
The display device that this application embodiment shows, display device is applicable to and throws the screen end, includes: a display and a controller. In response to a screen projection starting instruction input by a user, a controller firstly captures a screen image; and then, deleting the black block image of the screen image to obtain a media asset image. The display device shown in this embodiment deletes the black block image of the screen image before performing image transmission with the screen projection end, so as to obtain a media asset image; and finally, transmitting the media asset image to a screen projection end so that the screen projection end can zoom the media asset image. In the display device shown in this embodiment, because the black block images around the media asset images are removed, the media asset images are zoomed to obtain the display images without the black block images around the display images, so that it can be ensured that at least two sides of the projection images have no black side and the user experience is better when the projection images are displayed at the display end.
In the practical application process, most of the display terminals are home televisions, and the display direction of the display of the home television is a horizontal display direction in general. In the above application scenario, in order to reduce the data processing amount of the display side, the embodiment of the present application illustrates a method for processing a screen image, specifically, referring to fig. 10, fig. 10 is a flowchart of a screen image processing method according to a feasible embodiment, the method is applicable to a projection side controller, wherein the projection side controller is further configured to execute steps S11 to S131/S132.
Step S11 reads the aspect ratio of the screen image;
there are various implementations of reading the aspect ratio of the screen image, for example, in some feasible embodiments, the projection terminal controller may generate the aspect ratio of the screen image according to the resolution of the screen image. For example, in a possible embodiment, the resolution of the screen image is 1080 × 1920, and the aspect ratio of the screen image is 1080/1920. For another example, in some feasible embodiments, the aspect ratio of the projection-side display may be stored in advance, and the aspect ratio of the projection-side display may be directly called as the aspect ratio of the screen image when the screen image is generated. For example, in a possible embodiment, the aspect ratio of the projection display is 1920/1080, and the aspect ratio of the screen image is 1920/1080.
Step S12 determines whether the aspect ratio is greater than 1;
the implementation of determining whether the aspect ratio is greater than 1 may be a numerical determination conventionally used in the art, and the applicant does not make much limitation herein.
If the aspect ratio is greater than 1, step S131 outputs the screen image to a display end;
in this embodiment, the transmission mode of the media asset image may adopt a data transmission mode commonly used in the art, for example, bluetooth transmission, network transmission, and the like.
If the aspect ratio is less than or equal to 1, step S132 deletes the black block image of the screen image to obtain a media asset image.
The implementation manner of deleting the black block image of the screen image may refer to the above embodiments, and is not described in detail by the applicant herein.
It can be seen that in the scheme shown in this embodiment, the screen projection end controller determines in advance, according to the aspect ratio of the screen image, whether the screen image needs to be cropped when the screen image is generated. When the aspect ratio of the screen image is greater than 1, the screen image is a landscape image, and the display direction of the display end display is also landscape; in this case, even if the screen image is not cropped, the final screen image is still displayed well on the display at the display end. Moreover, since the screen image is not cropped by the screen projection end controller in this process, the data processing amount of the screen projection end controller is reduced.
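A minimal sketch of steps S11-S132 follows, reusing the delete_black_blocks sketch above; deriving the aspect ratio directly from the capture resolution is one of the two reading manners mentioned earlier, and the function name is an assumption.

```python
def process_captured_image(screen_image, width: int, height: int):
    """Forward a landscape capture untouched; crop a portrait or square capture first."""
    aspect_ratio = width / height                  # S11: read the aspect ratio
    if aspect_ratio > 1:                           # S12: landscape screen image
        return screen_image                        # S131: output to the display end as-is
    return delete_black_blocks(screen_image)       # S132: delete the black block image
```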
In some application scenarios, if the screen projection end is in a media asset playing state, the screen projection end needs to continuously capture a plurality of screen images in response to the screen projection start instruction input by the user. In order to ensure that the display device shown in this embodiment is suitable for this application scenario, this embodiment shows an implementation of determining the screen image capture time. Specifically, referring to fig. 11, fig. 11 is a flowchart illustrating an implementation of determining the screen image capture time according to a feasible embodiment. If the display device is in the media asset playing state, the controller is configured to perform steps S21-S22.
In response to a screen projection start instruction input by the user, step S21 is executed to read the type of the currently played media asset;
there are various ways to read the type of the currently played media assets. For example, in some feasible embodiments, the asset type of the asset is determined based on a suffix of the currently playing asset.
S22 determines the time of screen image capture according to the asset type.
Specifically, if the asset type is non-video, the implementation of screen capture by the screen projection end controller may refer to fig. 12, where the controller is further configured to execute step S221 to capture one frame of screen image in response to the switching of the presented media asset.
For example, in a feasible embodiment, the screen projection end is in a photo playing stage; in response to receiving a screen projection start instruction, the screen projection end controller captures one frame of screen image each time a photo is switched, until the screen projection function is exited.
If the asset type is video, an implementation of the controller capturing a screen image may refer to fig. 13, wherein the controller is further configured to perform step S222 of capturing a frame of screen image at intervals of a preset time.
In this embodiment, the preset time may be set according to the requirement, and the applicant does not make much limitation herein. For example, in a feasible embodiment the preset time may be 5 ms.
For example, in a feasible embodiment, the screen projection end is in a video playing stage; in response to receiving a screen projection start instruction, the screen projection end controller captures one frame of screen image every 5 ms until the screen projection function is exited.
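The capture timing of steps S221/S222 could be sketched as below; the suffix list, the 5 ms interval constant, and the capture_frame / wait_for_asset_switch / is_projecting callbacks are illustrative assumptions, not details given by the embodiment.

```python
import time

VIDEO_SUFFIXES = {".mp4", ".mkv", ".ts", ".avi"}   # assumed suffixes treated as video
PRESET_INTERVAL_S = 0.005                          # the 5 ms interval used in the example above

def asset_type(asset_path: str) -> str:
    """Classify the currently played media asset by its file suffix."""
    dot = asset_path.rfind(".")
    suffix = asset_path[dot:].lower() if dot >= 0 else ""
    return "video" if suffix in VIDEO_SUFFIXES else "non-video"

def capture_loop(asset_path, capture_frame, wait_for_asset_switch, is_projecting):
    """Capture one frame per asset switch for non-video, or one frame every
    preset interval for video, until screen projection is exited."""
    if asset_type(asset_path) == "video":
        while is_projecting():
            capture_frame()                   # S222: capture one frame
            time.sleep(PRESET_INTERVAL_S)     # wait the preset time
    else:
        while is_projecting():
            wait_for_asset_switch()           # block until the presented asset changes
            capture_frame()                   # S221: capture one frame per switch
```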
If the asset type is a video, in order to further reduce the data processing amount of the controller, an embodiment of the present application shows a generation manner of asset images, specifically, referring to fig. 14, fig. 14 is a flowchart of the generation manner of asset images according to a feasible embodiment, and the method is applicable to a screen-casting side controller, wherein the screen-casting side controller is further configured to execute steps S31 to S32.
S31, dividing a display area of the screen projection end display into an effective area and an invalid area, wherein the effective area is an area corresponding to a media resource image of a first frame of the screen image, the invalid area is an area corresponding to a black block image of the first frame of the screen image, and the display area corresponds to the screen image;
under normal conditions, the display area of the projection display corresponds to the screen image; specifically, referring to fig. 15, fig. 15 is a schematic diagram illustrating a screen-projection end display interface and a screen image according to a possible embodiment. The display area of the projection end display can refer to the schematic diagram 31 in fig. 15, and the screen image can refer to the schematic diagram 32 in fig. 15. The screen projection end controller can identify a captured first frame of screen image, and identify a black block image and a media asset image; then, the display area is divided into an effective area and an ineffective area according to the corresponding relationship between the display area and the screen image, and the specific division effect can be seen from the schematic diagram 33 in fig. 15.
S32, deleting the images corresponding to the invalid areas in each frame of screen image in sequence to obtain the media asset images.
Since the asset type in the application scene shown in this embodiment is video, the resolution of each frame of the media asset image remains the same throughout video playback. In this scene, the screen projection end controller does not need to identify each captured screen image; it can delete the image corresponding to the invalid area according to the correspondence between the display area of the screen projection end display and the screen image, so as to obtain the media asset image.
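A sketch of steps S31-S32 is given below; it locates the effective area on the first frame only and then crops every subsequent frame with the same bounds, reusing the NumPy representation and the BLACK_PIXEL_VALUE constant assumed in the earlier sketch.

```python
def crop_video_frames(frames):
    """Yield the media asset image of each frame, identifying the effective
    area once from the first frame and reusing it for all later frames.
    Assumes the first frame contains some non-black content."""
    frames = iter(frames)
    first = next(frames)
    non_black = first != BLACK_PIXEL_VALUE           # S31: identify on the first frame
    rows = np.flatnonzero(non_black.any(axis=1))
    cols = np.flatnonzero(non_black.any(axis=0))
    top, bottom = rows[0], rows[-1] + 1              # effective area bounds
    left, right = cols[0], cols[-1] + 1
    yield first[top:bottom, left:right]
    for frame in frames:                             # S32: crop without re-identifying
        yield frame[top:bottom, left:right]
```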
The technical solution shown in this embodiment is further described below with reference to specific examples:
referring to fig. 16, fig. 16 is a schematic diagram illustrating a change process of a screen image captured by a screen-projecting end in a photo playing process according to a possible embodiment. Specifically, in the embodiment, the screen projection end is in the photo playing stage, and the screen projection end controller captures a frame of screen image in response to receiving the screen projection start instruction, which can specifically refer to the schematic diagram 41 in fig. 16. When the screen-end controller switches the photos, triggering the screen-end controller to capture the screen image for the second time specifically can refer to the schematic diagram 42 in fig. 16; when the screen-end controller switches the photos, triggering the screen-end controller to capture the screen image for the third time specifically can refer to the schematic diagram 43 in fig. 16; when the screen-end controller switches the photos, triggering the screen-end controller to capture the screen image for the fourth time may be performed in sequence according to the schematic diagram 44 … … in fig. 16 until the screen-end controller exits the screen-projection function.
Referring to fig. 17, fig. 17 is a diagram illustrating the change process of the screen images captured by the screen projection end during video playing according to a feasible embodiment. Specifically, in this embodiment, the screen projection end is in a video playing stage; in response to receiving a screen projection start instruction, the screen projection end controller determines that the currently played media asset type is video and captures one frame of screen image every 5 ms. For example, the screen image captured by the screen projection end controller at the 5th ms is shown in schematic diagram 51 in fig. 17; the screen image captured at the 10th ms is shown in schematic diagram 52 in fig. 17; the screen image captured at the 15th ms is shown in schematic diagram 53 in fig. 17; the screen image captured at the 20th ms is shown in schematic diagram 54 in fig. 17; and so on, until the screen projection end controller exits the screen projection function.
In some feasible embodiments, in order to further reduce the data processing amount of the controller, the embodiment of the present application illustrates a manner of identifying the black block image. Specifically, referring to fig. 18, fig. 18 is a flowchart illustrating the manner of identifying the black block image according to a feasible embodiment, and the controller is further configured to perform steps S41 to S45;
S41, recognizing the image of the screen image downward from the top of the screen image;
S42, in response to recognizing a non-black block image, recording a first position, where the first position is the position in the screen image corresponding to the top edge of the non-black block image;
S43, recognizing the image of the screen image upward from the bottom of the screen image;
S44, in response to recognizing a non-black block image, recording a second position, where the second position is the position in the screen image corresponding to the bottom edge of the non-black block image;
S45, determining that the image from the top of the screen image to the first position is a black block image, and that the image from the second position to the bottom of the screen image is a black block image.
The following describes the black block image identification manner with reference to a specific example. Fig. 19 is a schematic diagram of a screen image according to a possible embodiment. After capturing a screen image, the screen projection end controller may recognize the image of the screen image downward from the top of the screen image and, in response to recognizing a non-black block image, record a first position; as can be seen from fig. 19, the first position is the boundary between the black block image and the non-black block image. In response to the recording of the first position being completed, the screen projection end controller in the technical solution shown in this embodiment does not continue to recognize the screen image downward, but recognizes the image of the screen image upward from the bottom of the screen image and, in response to recognizing a non-black block image, records a second position, which in this embodiment is the boundary between the black block image and the non-black block image. Finally, the screen projection end controller determines that the image from the top of the screen image to the first position and the image from the second position to the bottom of the screen image are black block images. It can be seen that with the black block image identification manner shown in this embodiment the screen projection end controller does not need to recognize the whole frame of the screen image, which reduces the data processing amount of the screen projection end controller to a certain extent.
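A minimal sketch of the row scan in steps S41-S45 follows, again assuming grayscale NumPy frames and the BLACK_PIXEL_VALUE constant from the earlier sketch.

```python
def find_vertical_bounds(screen_image: np.ndarray) -> tuple[int, int]:
    """Scan rows downward from the top until a non-black row is found (first
    position), then upward from the bottom (second position); the rows outside
    the two positions belong to the black block image."""
    height = screen_image.shape[0]
    first_position = 0
    while first_position < height and np.all(screen_image[first_position] == BLACK_PIXEL_VALUE):
        first_position += 1                 # still inside the top black block
    second_position = height - 1
    while second_position > first_position and np.all(screen_image[second_position] == BLACK_PIXEL_VALUE):
        second_position -= 1                # still inside the bottom black block
    return first_position, second_position
```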
In some feasible embodiments, in order to further reduce the data processing amount of the controller, the embodiment of the present application illustrates a manner of identifying the black block image. Specifically, referring to fig. 20, fig. 20 is a flowchart illustrating the manner of identifying the black block image according to a feasible embodiment, and the controller is further configured to perform steps S51 to S55;
S51, recognizing an image of the screen image upward from the bottom end of the screen image;
S52, in response to recognizing a non-black block image, recording a second position, wherein the second position is the position corresponding to the bottom edge of the non-black block image in the screen image;
S53, recognizing an image of the screen image downward from the top end of the screen image;
S54, in response to recognizing a non-black block image, recording a first position, wherein the first position is the position corresponding to the top edge of the non-black block image in the screen image;
S55, determining that the image from the first position to the top end of the screen image is a black block image, and the image from the second position to the bottom end of the screen image is a black block image.
It can be seen that with the black block image identification manner shown in this embodiment, the screen projection end controller does not need to recognize the whole frame of screen image, which reduces the data processing amount of the screen projection end controller to a certain extent.
In some feasible embodiments, in order to further reduce the data processing amount of the controller, the embodiment of the present application provides a black block image identification manner. Specifically, referring to fig. 21, fig. 21 is a flowchart of the black block image identification manner according to a feasible embodiment, and the controller is further configured to perform steps S61 to S65:
S61, recognizing an image of the screen image rightward from the left side of the screen image;
S62, in response to recognizing a non-black block image, recording a third position, wherein the third position is the position corresponding to the left edge of the non-black block image in the screen image;
S63, recognizing an image of the screen image leftward from the right side of the screen image;
S64, in response to recognizing a non-black block image, recording a fourth position, wherein the fourth position is the position corresponding to the right edge of the non-black block image in the screen image;
S65, determining that the image from the third position to the left boundary of the screen image is a black block image, and the image from the fourth position to the right boundary of the screen image is a black block image.
The following describes the black block image identification manner with reference to a specific example. Fig. 22 is a schematic diagram of a screen image shown according to a feasible embodiment. After capturing a screen image, the screen projection end controller recognizes the image of the screen image rightward from the left side of the screen image, and records a third position in response to recognizing a non-black block image; as can be seen from fig. 22, the third position is the boundary position between the black block image and the non-black block image. In response to the recording of the third position being completed, the screen projection end controller in this embodiment does not continue to recognize the screen image rightward, but instead recognizes the image of the screen image leftward from the right side of the screen image, and records a fourth position in response to recognizing a non-black block image; in this embodiment the fourth position is likewise the boundary position between the black block image and the non-black block image. Finally, the screen projection end controller determines that the image from the third position to the left boundary of the screen image is a black block image, and that the image from the fourth position to the right boundary of the screen image is a black block image. It can be seen that with the black block image identification manner shown in this embodiment, the screen projection end controller does not need to recognize the whole frame of screen image, which reduces the data processing amount of the screen projection end controller to a certain extent.
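For the left-to-right and right-to-left scans of steps S61 to S65, a mirrored sketch is shown below under the same illustrative assumptions (NumPy image array, hypothetical brightness threshold); it is not taken from the patent text.

```python
def find_horizontal_bounds(screen_image, threshold=10):
    """Return (third_position, fourth_position): the first and last columns
    that contain non-black content. Columns left of third_position and right
    of fourth_position are treated as black block images."""
    gray = screen_image.mean(axis=2) if screen_image.ndim == 3 else screen_image
    width = gray.shape[1]

    # S61/S62: scan rightward from the left side until a non-black column appears.
    third_position = 0
    for col in range(width):
        if gray[:, col].max() > threshold:
            third_position = col
            break

    # S63/S64: scan leftward from the right side until a non-black column appears.
    fourth_position = width - 1
    for col in range(width - 1, -1, -1):
        if gray[:, col].max() > threshold:
            fourth_position = col
            break

    # S65: columns outside [third_position, fourth_position] are black blocks.
    return third_position, fourth_position
```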
In some feasible embodiments, in order to further reduce the data processing amount of the controller, the embodiment of the present application provides a black block image identification manner. Specifically, referring to fig. 23, fig. 23 is a flowchart of the black block image identification manner according to a feasible embodiment, and the controller is further configured to perform steps S71 to S75:
S71, recognizing an image of the screen image leftward from the right side of the screen image;
S72, in response to recognizing a non-black block image, recording a fourth position, wherein the fourth position is the position corresponding to the right edge of the non-black block image in the screen image;
S73, recognizing an image of the screen image rightward from the left side of the screen image;
S74, in response to recognizing a non-black block image, recording a third position, wherein the third position is the position corresponding to the left edge of the non-black block image in the screen image;
S75, determining that the image from the third position to the left boundary of the screen image is a black block image, and the image from the fourth position to the right boundary of the screen image is a black block image.
It can be seen that with the black block image identification manner shown in this embodiment, the screen projection end controller does not need to recognize the whole frame of screen image, which reduces the data processing amount of the screen projection end controller to a certain extent.
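Putting the two directions together, the sketch below condenses the scans into vectorized NumPy operations, deletes the black block images to obtain the media asset image, and scales it to a target display resolution. The function name, the brightness threshold, and the use of OpenCV's cv2.resize are assumptions made for illustration and are not the patent's specified mechanism.

```python
import numpy as np
import cv2  # assumed third-party dependency, used only for the final scaling step

def remove_black_blocks_and_scale(screen_image, display_width, display_height,
                                  threshold=10):
    """Delete the black block images around the screen image to obtain the
    media asset image, then scale it to the display resolution (a sketch,
    not the patent's exact procedure)."""
    gray = screen_image.mean(axis=2) if screen_image.ndim == 3 else screen_image

    content_rows = np.where(gray.max(axis=1) > threshold)[0]
    content_cols = np.where(gray.max(axis=0) > threshold)[0]
    if content_rows.size == 0 or content_cols.size == 0:
        # Frame is entirely black; nothing to crop.
        return cv2.resize(screen_image, (display_width, display_height))

    first, second = content_rows[0], content_rows[-1]   # first / second positions
    third, fourth = content_cols[0], content_cols[-1]   # third / fourth positions

    # Keep only the non-black region: the media asset image.
    media_asset_image = screen_image[first:second + 1, third:fourth + 1]

    # Scale the media asset image according to the resolution of the display.
    return cv2.resize(media_asset_image, (display_width, display_height))
```

Aspect-ratio preservation and color-space details are omitted here for brevity; a production implementation would handle both.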
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, and the program, when executed, may include some or all of the steps in the embodiments of the screen projection data processing method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of software products, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method in the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application. The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device adapted for a display end, comprising:
a display;
a controller configured to:
receiving a screen image transmitted by a screen projection end;
deleting the black block image of the screen image to obtain a media asset image;
zooming the media asset image according to the resolution of the display to obtain a screen projection image;
and controlling the display to display the screen projection image.
2. A display device adapted for a screen projection end, comprising:
a controller configured to:
capturing a screen image in response to a screen projection starting instruction input by a user;
deleting the black block image of the screen image to obtain a media asset image;
and outputting the media asset image to a display end, so that the display end scales the media asset image according to the resolution of the display end.
3. The display device of claim 2, wherein the controller is further configured to:
reading the aspect ratio of the screen image;
if the aspect ratio is larger than 1, outputting the screen image to a display end;
and if the aspect ratio is less than or equal to 1, deleting the black block image of the screen image to obtain a media asset image.
4. The display device of claim 2 or 3, wherein if in the media asset play state, the controller is further configured to:
responding to a screen projection starting instruction input by a user, and reading the type of the currently played media asset;
and determining the time for capturing the screen image according to the media asset type.
5. The display device of claim 4, wherein if the asset type is non-video, the controller is further configured to:
in response to switching of the displayed media asset, capturing one frame of screen image.
6. The display device of claim 4, wherein if the asset type is video, the controller is further configured to:
capturing one frame of screen image at preset intervals.
7. The display device of claim 6, wherein the controller is further configured to:
dividing a display area of the display into an effective area and an invalid area, wherein the effective area is an area corresponding to a media resource image of a first frame of the screen image, the invalid area is an area corresponding to a black block image of the first frame of the screen image, and the display area corresponds to the screen image;
and deleting images corresponding to the invalid areas in each frame of the screen image in sequence to obtain a media asset image.
8. The display device of claim 1, wherein the controller is further configured to: outputting the resolution of the media asset image to the display end, so that the display end sets a transmission protocol based on the resolution of the media asset image.
9. The display device according to claim 1 or 2, wherein the controller is further configured to:
identifying an image of the screen image from a top of the screen image downward;
recording a first position in response to the recognition of the non-black block image, wherein the first position is a position corresponding to the top edge of the non-black block image in the screen image;
recognizing an image of the screen image from a bottom end of the screen image upward;
recording a second position in response to the recognition of the non-black block image, wherein the second position is a position corresponding to the bottom edge of the non-black block image in the screen image;
determining that the image corresponding to the top end of the screen image from the first position is a black block image, and determining that the image corresponding to the bottom end of the screen image from the second position is a black block image.
10. The display device according to claim 1 or 2, wherein the controller is further configured to:
recognizing an image of the screen image from a bottom end of the screen image upward;
recording a second position in response to the recognition of the non-black block image, wherein the second position is a position corresponding to the bottom edge of the non-black block image in the screen image;
identifying an image of the screen image from a top of the screen image downward;
recording a first position in response to the recognition of the non-black block image, wherein the first position is a position corresponding to the top edge of the non-black block image in the screen image;
determining that the image corresponding to the top end of the screen image from the first position is a black block image, and determining that the image corresponding to the bottom end of the screen image from the second position is a black block image.
CN202110115837.0A 2021-01-28 2021-01-28 Display device Pending CN112667184A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110115837.0A CN112667184A (en) 2021-01-28 2021-01-28 Display device
PCT/CN2022/074016 WO2022161401A1 (en) 2021-01-28 2022-01-26 Screen-projection data processing method and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110115837.0A CN112667184A (en) 2021-01-28 2021-01-28 Display device

Publications (1)

Publication Number Publication Date
CN112667184A true CN112667184A (en) 2021-04-16

Family

ID=75414818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110115837.0A Pending CN112667184A (en) 2021-01-28 2021-01-28 Display device

Country Status (1)

Country Link
CN (1) CN112667184A (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102883135A (en) * 2012-11-01 2013-01-16 成都飞视美视频技术有限公司 Screen sharing and control method thereof
CN104615300A (en) * 2014-12-29 2015-05-13 杰发科技(合肥)有限公司 Image receiving device and method for judging screen arranging state of electronic device
CN106095084A (en) * 2016-06-06 2016-11-09 乐视控股(北京)有限公司 Throw screen method and device
CN205789050U (en) * 2016-07-05 2016-12-07 田江华 A kind of screen of being thrown by mobile phone plane plate is to the perpendicular system shielding advertisement machine
CN107105184A (en) * 2017-04-01 2017-08-29 深圳市蓝莓派科技有限公司 A kind of same screen projective techniques of mobile terminal in portrait layout advertisement machine
CN108334651A (en) * 2018-02-08 2018-07-27 北京小米移动软件有限公司 Collect method, apparatus and storage medium that user's end data realizes preset need
CN110267073A (en) * 2019-07-24 2019-09-20 深圳市颍创科技有限公司 A kind of throwing screen picture, which show and throws, shields picture spinning solution
CN110647303A (en) * 2019-08-30 2020-01-03 北京文渊佳科技有限公司 Multimedia playing method, device, storage medium and electronic equipment
CN112153459A (en) * 2020-09-01 2020-12-29 三星电子(中国)研发中心 Method and device for screen projection display

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022161401A1 (en) * 2021-01-28 2022-08-04 青岛海信传媒网络技术有限公司 Screen-projection data processing method and display device
CN113010135A (en) * 2021-04-29 2021-06-22 深圳Tcl新技术有限公司 Data processing method and device, display terminal and storage medium
CN113010135B (en) * 2021-04-29 2024-03-12 深圳Tcl新技术有限公司 Data processing method and device, display terminal and storage medium
CN113891167A (en) * 2021-08-27 2022-01-04 荣耀终端有限公司 Screen projection method and electronic equipment
CN116887005A (en) * 2021-08-27 2023-10-13 荣耀终端有限公司 Screen projection method and electronic equipment
CN116887005B (en) * 2021-08-27 2024-05-03 荣耀终端有限公司 Screen projection method, electronic device and computer readable storage medium
CN113905268A (en) * 2021-09-28 2022-01-07 四川长虹电器股份有限公司 Black edge removing method for screen projection display of mobile terminal
WO2024012345A1 (en) * 2022-07-14 2024-01-18 华为技术有限公司 Mirroring picture processing method and related apparatus
CN116737097A (en) * 2022-09-30 2023-09-12 荣耀终端有限公司 Screen projection image processing method and electronic equipment

Similar Documents

Publication Publication Date Title
CN112667184A (en) Display device
CN112672195A (en) Remote controller key setting method and display equipment
CN114302190A (en) Display device and image quality adjusting method
CN112601117B (en) Display device and content presentation method
CN113535019A (en) Display device and display method of application icons
WO2022161401A1 (en) Screen-projection data processing method and display device
CN113111214A (en) Display method and display equipment for playing records
CN112597110A (en) Display device and file display method
CN112601109A (en) Audio playing method and display device
CN112328553A (en) Thumbnail capturing method and display device
CN113453069B (en) Display device and thumbnail generation method
CN113064691B (en) Display method and display equipment for starting user interface
CN112911381B (en) Display device, mode adjustment method, device and medium
CN112926420B (en) Display device and menu character recognition method
CN116980554A (en) Display equipment and video conference interface display method
CN114302101A (en) Display apparatus and data sharing method
CN112668546A (en) Video thumbnail display method and display equipment
CN113596559A (en) Method for displaying information in information bar and display equipment
CN113064534A (en) Display method and display equipment of user interface
CN112911371A (en) Double-channel video resource playing method and display equipment
CN114302203A (en) Image display method and display device
CN113286185A (en) Display device and homepage display method
CN113132809A (en) Channel switching method, channel program playing method and display equipment
CN112601116A (en) Display device and content display method
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20221017

Address after: 83 Intekte Street, Devon, Netherlands

Applicant after: VIDAA (Netherlands) International Holdings Ltd.

Address before: 266100 Songling Road, Laoshan District, Qingdao, Shandong Province, No. 399

Applicant before: QINGDAO HISENSE MEDIA NETWORKS Ltd.

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20210416