CN112399254A - Display device and color gamut space dynamic adjustment method - Google Patents

Display device and color gamut space dynamic adjustment method

Info

Publication number: CN112399254A
Application number: CN201911018447.0A
Authority: CN (China)
Prior art keywords: color gamut space, application, package name, display
Other languages: Chinese (zh)
Other versions: CN112399254B
Inventors: 孙永瑞, 修建竹, 王之奎
Current and original assignee: Hisense Visual Technology Co Ltd
Application filed by Hisense Visual Technology Co Ltd
Priority to PCT/CN2020/085007, published as WO2021031589A1
Publication of CN112399254A; application granted; publication of CN112399254B
Legal status: Granted, Active

Classifications

    • H04N 9/73 — Colour balance circuits, e.g. white balance circuits or colour temperature control
    • G09G 3/3208 — Control arrangements for presentation of characters by a matrix of controlled light sources using electroluminescent panels, semiconductive, organic, e.g. using organic light-emitting diodes [OLED]
    • H04N 21/442 — Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N 21/44218 — Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H04N 21/485 — End-user interface for client configuration
    • H04N 21/4854 — End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast

Abstract

The present application relates to display devices equipped with a camera and provides them with powerful social functions. It discloses a display device and a method for dynamically adjusting the color gamut space: the start state of the application programs in the display device is monitored in real time to judge whether the currently started application is a camera-related application; if so, the display color gamut space of the display is converted into the second color gamut space provided by the camera-related application, or, alternatively, color gamut space conversion is performed on the environment image data collected by the camera. The method can dynamically adjust the display color gamut space according to the started application program, reduce the difference between the display color gamut space and the color gamut space corresponding to the application program, and alleviate the problem of color distortion.

Description

Display device and color gamut space dynamic adjustment method
The present application claims priority to the Chinese patent application entitled "A Dynamic Color Gamut Space Adjustment Method and Display Device", filed with the Chinese Patent Office on August 18, 2019 under application number 201910761462.8, which is incorporated herein by reference in its entirety.
Technical Field
The application relates to the technical field of smart televisions, and in particular to a display device and a color gamut space dynamic adjustment method.
Background
The color gamut space, also called color space, refers to the range of colors that a display medium, such as a screen, digital output device, or print reproduction device, can represent. Multiple color gamut standards exist for different application scenarios, such as BT.709 and BT.2020, established by the International Telecommunication Union (ITU) for high-definition television. With the development of high-definition television display technology, the color gamut space of televisions has tended to grow larger and larger.
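As a concrete illustration of the "larger" gamut mentioned above (background context, not part of the claimed method), one can compare the areas of the BT.709 and BT.2020 gamut triangles in CIE 1931 xy space, using the primary chromaticity coordinates published in the ITU-R specifications:

```python
# Compare gamut triangle areas of BT.709 vs BT.2020 in CIE 1931 xy space.
# Primary coordinates are taken from ITU-R BT.709-6 and BT.2020-2.

def triangle_area(primaries):
    """Shoelace area of the gamut triangle spanned by the R, G, B primaries."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

BT709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B primaries
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B primaries

ratio = triangle_area(BT2020) / triangle_area(BT709)
print(f"BT.2020 spans {ratio:.2f}x the xy area of BT.709")  # roughly 1.9x
```

The wider triangle is what lets a BT.2020 display reproduce more saturated colors than BT.709 content actually contains.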
Meanwhile, smart televisions, which are no longer limited to the display function, have also developed. On the basis of a television, a smart television can obtain additional functions through built-in or external sensors, control devices, and the like. For example, a camera built into the television collects image data, and, in cooperation with a software application in the television, the collected image data is displayed on the television screen in real time, realizing functions such as "look in the mirror" and "chat while watching".
However, owing to the hardware of the built-in or external device, the color gamut space corresponding to that device may differ from the color gamut space displayed by the television. For example, the color gamut space of the built-in camera follows the BT.709 standard, while the color gamut space of the television display is wider than BT.709. Such a color gamut difference causes color distortion when the television displays images provided by the built-in or external device; for example, when a BT.709-standard image is displayed on a television whose gamut is wider than BT.709, the color distortion of the red portions is severe.
Therefore, when the color gamut space of the image content corresponding to the built-in or external device differs from the color gamut space displayed by the television, color distortion may appear in the displayed picture, affecting the user's viewing experience.
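The red-distortion example above can be quantified (an illustration, not part of the patent): if a BT.709-coded full red is driven directly to a wider BT.2020 panel without conversion, the displayed chromaticity jumps from the BT.709 red primary to the BT.2020 red primary, so the red appears oversaturated. The coordinates below are the published ITU primary values.

```python
# How far pure red shifts in CIE xy when BT.709 content is shown
# unconverted on a BT.2020-gamut panel.
import math

bt709_red = (0.640, 0.330)   # intended chromaticity of (R,G,B)=(1,0,0)
bt2020_red = (0.708, 0.292)  # chromaticity the wide-gamut panel actually shows

shift = math.dist(bt709_red, bt2020_red)
print(f"chromaticity error for pure red: {shift:.3f} in xy")
```

A shift of this size (several hundredths in xy) is well above typical just-noticeable differences, which is why unconverted red content looks visibly wrong.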
Disclosure of Invention
The application provides a display device and a color gamut space dynamic adjustment method, which aim to solve the problem of color distortion caused by color gamut space difference.
In a first aspect, the present application provides a display device comprising:
a display configured to display a user interface and environment image data, the default display color gamut space of the display being a first color gamut space;
a camera configured to collect environment image data, wherein the color gamut space to which the images provided by the camera belong is a second color gamut space different from the first color gamut space;
a controller in communication with the display, the controller configured to present a user interface, and to perform:
judging whether the currently started application is a camera-related application;
and if so, converting the display color gamut space of the display into the second color gamut space.
Optionally, the controller is further configured to:
receiving a stop instruction from the camera-related application;
and upon receiving the stop instruction, restoring the display color gamut space to the first color gamut space.
Optionally, the controller is further configured to:
acquiring the package name information of the currently started application;
matching the package name information against preset camera-related application package name information;
and if the package name information is identical to the package name information of a preset camera-related application, determining that the currently started application is a camera-related application.
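The package-name matching in these optional steps can be sketched as a simple lookup; the package names and their mapping to gamuts below are purely hypothetical, since the patent does not list concrete applications.

```python
# Minimal sketch of the package-name check: compare the starting app's
# package name against a preset table of camera-related applications.
# All package names here are hypothetical placeholders.

CAMERA_APP_GAMUTS = {
    "com.example.mirror": "BT.709",     # "look in the mirror" app (hypothetical)
    "com.example.videochat": "BT.709",  # video-chat app (hypothetical)
}

def is_camera_related(package_name: str) -> bool:
    """True if the started application matches the preset package-name list."""
    return package_name in CAMERA_APP_GAMUTS

print(is_camera_related("com.example.mirror"))   # camera-related
print(is_camera_related("com.example.browser"))  # not camera-related
```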
Optionally, the display device is internally provided with a first hardware system and a second hardware system, and the controller is further configured to control the first hardware system to execute the following program steps:
acquiring a package name set file from the second hardware system, wherein the package name set file includes the package name information of all camera-related applications and the color gamut space to which the images they provide belong;
loading the package name set file, so as to monitor the start state of each application according to the package name set file;
and extracting the package name information of the currently started application from the package name set file.
Optionally, the controller is further configured to control the second hardware system to execute the following program steps:
creating a package name set file according to the currently installed applications, or acquiring the package name set file from a cloud server;
storing the package name set file;
and sending the package name set file to the first hardware system through a communication interface.
Optionally, the controller is further configured to control the second hardware system to execute the following program steps:
acquiring package name update data from a cloud server;
modifying the stored package name set file according to the update data;
and sending the modified package name set file to the first hardware system through a communication interface.
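A minimal sketch of the create/store/hand-over flow between the two hardware systems described above; the JSON format, file name, and app entries are assumptions, since the patent does not fix a concrete file format.

```python
# Sketch: the second hardware system writes the package name set file
# (package name -> gamut of the images that app provides) and the first
# hardware system loads it after receiving it over the communication
# interface. JSON is an assumed encoding.
import json
import os
import tempfile

def create_package_set_file(path: str, entries: dict) -> None:
    """Second system: build and store the package-name set file."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(entries, f)

def load_package_set_file(path: str) -> dict:
    """First system: load the received file to monitor app start states."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

entries = {"com.example.mirror": "BT.709"}  # hypothetical camera-related app
path = os.path.join(tempfile.mkdtemp(), "package_set.json")
create_package_set_file(path, entries)
print(load_package_set_file(path))
```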
In a second aspect, the present application provides a display device comprising:
a display configured to display a user interface and environment image data, the default display color gamut space of the display being a first color gamut space;
a camera configured to collect environment image data, wherein the color gamut space to which the images provided by the camera belong is a second color gamut space different from the first color gamut space;
a controller in communication with the display, the controller configured to present a user interface, and to perform:
judging whether the currently started application is a camera-related application;
and if so, performing color gamut space conversion on the environment image data collected by the camera.
Optionally, the controller is further configured to:
acquiring the package name information of the currently started application;
matching the package name information against preset camera-related application package name information;
and if the package name information is identical to the package name information of a preset camera-related application, determining that the currently started application is a camera-related application.
Optionally, the display device is internally provided with a first hardware system and a second hardware system, and the controller is further configured to control the first hardware system to execute the following program steps:
acquiring a package name set file from the second hardware system, wherein the package name set file includes the package name information of all camera-related applications and the color gamut space to which the images they provide belong;
loading the package name set file, so as to monitor the start state of each application according to the package name set file;
and extracting the package name information of the currently started application from the package name set file.
Optionally, the controller is further configured to control the second hardware system to execute the following program steps:
creating a package name set file according to the currently installed applications, or acquiring the package name set file from a cloud server;
storing the package name set file;
and sending the package name set file to the first hardware system through a communication interface.
Optionally, the controller is further configured to control the second hardware system to execute the following program steps:
acquiring package name update data from a cloud server;
modifying the stored package name set file according to the update data;
and sending the modified package name set file to the first hardware system through a communication interface.
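The update steps above (acquire update data, modify the stored file, re-send) amount to a merge of cloud updates into the stored package-name set. A sketch follows; the add/remove update-record format is an assumption not specified by the patent.

```python
# Sketch: apply package-name update data from the cloud server to the
# stored package-name set; the merged result would then be re-sent to
# the first hardware system. Entry names are hypothetical.

def apply_update(package_set: dict, update: dict) -> dict:
    """Merge cloud update data (additions/removals) into the stored set."""
    merged = dict(package_set)
    merged.update(update.get("add", {}))   # newly registered camera-related apps
    for name in update.get("remove", []):  # apps no longer camera-related
        merged.pop(name, None)
    return merged

stored = {"com.example.mirror": "BT.709"}
update = {"add": {"com.example.videochat": "BT.709"},
          "remove": ["com.example.mirror"]}
print(apply_update(stored, update))
```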
In a third aspect, the present application provides a method for dynamically adjusting a color gamut space, including:
acquiring information about the currently started application;
judging whether the currently started application is a camera-related application, wherein the color gamut space to which the images provided by camera-related applications belong is a second color gamut space different from the default display color gamut space;
and if so, converting the display color gamut space of the display into the second color gamut space.
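The three steps of this third-aspect method condense to a single decision: switch the display to the app's gamut when the app is camera-related, otherwise keep the default. The gamut and package names below are illustrative only.

```python
# One-function sketch of the third-aspect method: choose the display
# gamut based on which application has started. Names are hypothetical.

DEFAULT_GAMUT = "BT.2020"                       # first color gamut space
CAMERA_APPS = {"com.example.mirror": "BT.709"}  # app -> second color gamut space

def display_gamut_for(package_name: str) -> str:
    """Gamut the display should switch to when this application starts."""
    return CAMERA_APPS.get(package_name, DEFAULT_GAMUT)

print(display_gamut_for("com.example.mirror"))  # camera-related: second gamut
print(display_gamut_for("com.example.game"))    # unrelated: default first gamut
```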
In a fourth aspect, the present application provides a method for dynamically adjusting a color gamut space, including:
acquiring information about the currently started application and the environment image data collected by the camera;
judging whether the currently started application is a camera-related application, wherein the color gamut space to which the images provided by camera-related applications belong is a second color gamut space different from the default display color gamut space;
and if so, performing color gamut space conversion on the environment image data collected by the camera.
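This fourth-aspect alternative converts the camera's image data instead of reconfiguring the panel. A sketch under the assumption that the camera provides linear-light BT.709 RGB and the display expects BT.2020: the 3x3 matrix is the standard linear BT.709-to-BT.2020 conversion published in ITU-R BT.2087 (a real pipeline would also handle the transfer function, which this sketch omits).

```python
# Per-pixel gamut conversion of camera data: linear BT.709 RGB into
# BT.2020 RGB using the ITU-R BT.2087 conversion matrix.

M = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def bt709_to_bt2020(rgb):
    """Convert one linear-light RGB triple from BT.709 to BT.2020."""
    r, g, b = rgb
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in M)

# Pure BT.709 red maps to a less-saturated triple inside BT.2020, so it
# is reproduced at its intended chromaticity instead of oversaturated.
print(bt709_to_bt2020((1.0, 0.0, 0.0)))
```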
According to the above technical solutions, the start state of the application programs in the display device is monitored in real time to judge whether the currently started application is a camera-related application; if so, the display color gamut space of the display is converted into the second color gamut space provided by the camera-related application, or color gamut space conversion is performed on the environment image data collected by the camera. The method can dynamically adjust the display color gamut space according to the started application program, reduce the difference between the display color gamut space and the color gamut space corresponding to the application program, and alleviate the problem of color distortion.
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; obviously, for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of an operational scenario between a display device and a control apparatus according to an embodiment;
fig. 2 is a block diagram of a hardware configuration of a control apparatus according to an embodiment;
fig. 3 is a block diagram of a hardware configuration of a hardware system in the display device according to the embodiment;
FIG. 4 is a block diagram of a hardware architecture of a display device according to the present application;
FIG. 5 is a schematic diagram of a functional configuration of a display device according to the present application;
FIG. 6a is a schematic diagram of a software system in a display device according to the present application;
FIG. 6b is a schematic diagram illustrating the configuration of an application program in a display device according to the present application;
FIG. 7 is a schematic diagram of a user interface in a display device of the present application;
fig. 8 is a schematic flowchart of a dynamic color gamut space adjusting method according to the present application;
FIG. 9 is a schematic flow chart illustrating the process of obtaining package name information according to the present application;
FIG. 10 is a schematic flow chart illustrating the process of acquiring a package name collection file according to the present application;
FIG. 11 is a flowchart illustrating updating a package name collection file according to the present application;
FIG. 12 is a flow chart illustrating interface invocation according to the present application;
FIG. 13 is a flowchart illustrating a process of determining an exit status of an application according to the present application;
fig. 14 is a schematic logical structure diagram of a dynamic color gamut space adjusting display device according to the present application;
fig. 15 is a schematic flowchart of another dynamic adjustment method of color gamut space according to the present application.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
It should be noted that the embodiments of the present application are described by taking a dual-system display device as an example, but the color gamut space dynamic adjustment method of the present application can obviously also operate in a single-system environment. Therefore, technical solutions that those skilled in the art can derive, without creative effort, by applying the color gamut space dynamic adjustment method of the present application in any system environment fall within the scope of the present application.
For the convenience of users, various external device interfaces are usually provided on the display device so that different peripheral devices or cables can be connected to implement corresponding functions. When a high-definition camera is connected to an interface of the display device, if the hardware system of the display device has no hardware interface capable of receiving the source data of a high-pixel camera, the data collected by the camera cannot be presented on the display screen of the display device.
Furthermore, due to its hardware structure, the hardware system of a conventional display device supports only one channel of hard-decoding resources, and usually supports video decoding at a resolution of at most 4K. Therefore, when a user wants to video-chat while watching network television, the hard-decoding resource (usually the GPU in the hardware system) must be used to decode the network video in order not to reduce the definition of the network video picture; in this case, the video chat picture can only be processed by soft decoding on a general-purpose processor (e.g., the CPU) in the hardware system.
Using soft decoding to process the video chat picture greatly increases the data processing burden of the CPU, and when that burden becomes too heavy, the picture may stutter or play unsmoothly. Further, limited by the data processing capability of the CPU, soft decoding of the video chat picture generally cannot support multi-channel video calls, so when a user wants to video-chat with several other users in the same chat scene, access is blocked.
In view of the above aspects, and to overcome the above drawbacks, the present application discloses a dual-hardware-system architecture to implement multiple channels of video chat data (including at least one channel of local video).
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following descriptions of the concepts are only for the purpose of facilitating understanding of the contents of the present application, and do not represent limitations on the scope of the present application.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the present application) that is capable of wirelessly controlling the electronic device, typically over a short distance. The component may typically be connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the control device 100.
The control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-range communication, and is used to control the display device 200 wirelessly or by other wired means. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power key, etc. on the remote controller to control the functions of the display device 200.
The control apparatus 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a notebook computer, etc., which may communicate with the display device 200 through a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), or other networks, and implement control of the display device 200 through an application program corresponding to the display device 200.
For example, the mobile terminal 100B and the display device 200 may each have a software application installed thereon, so that connection communication between the two can be realized through a network communication protocol, and the purpose of one-to-one control operation and data communication can be further realized. Such as: a control instruction protocol can be established between the mobile terminal 100B and the display device 200, a remote control keyboard is synchronized to the mobile terminal 100B, and the function of controlling the display device 200 is realized by controlling a user interface on the mobile terminal 100B; the audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
As shown in fig. 1, the display apparatus 200 may also perform data communication with the server 300 through various communication means. In various embodiments of the present application, the display device 200 may be allowed to be communicatively coupled to the server 300 via a local area network, a wireless local area network, or other network. The server 300 may provide various contents and interactions to the display apparatus 200.
Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and interacting with an Electronic Program Guide (EPG). The server 300 may be one group or multiple groups of servers, and may be one or more types of servers. Other web service contents, such as video on demand and advertisement services, are provided through the server 300.
The display device 200 may be a liquid crystal display, an OLED (Organic Light-Emitting Diode) display, a projection display device, or a smart TV. The specific display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display device 200 may additionally provide a smart network TV function that provides computer support functions in addition to the broadcast-receiving TV function, for example, a web TV, a smart TV, an Internet Protocol TV (IPTV), and the like.
As shown in fig. 1, a camera may be connected or disposed on the display device, and is used to present a picture taken by the camera on a display interface of the display device or other display devices, so as to implement interactive chat between users. Specifically, the picture shot by the camera can be displayed on the display device in a full screen mode, a half screen mode or any optional area.
As an optional connection mode, the camera is connected to the display rear shell through a connecting plate and is fixedly mounted in the middle of the upper side of the rear shell; as an installable alternative, it may be fixedly mounted at any position on the rear shell, as long as the image acquisition area is not blocked by the shell, for example with the image acquisition area facing the same direction as the display.
As another alternative connection mode, the camera is connected to the display rear shell through a connecting plate or another conceivable connector and is capable of being raised and lowered; the connector is provided with a lifting motor, so that when the user or an application program wants to use the camera, it is raised out of the display, and when the camera is not needed, it can be retracted into the rear shell to protect it from damage.
As an embodiment, the camera adopted in the present application may have 16 million pixels, so as to achieve the purpose of ultra-high-definition display. In actual use, cameras with more or fewer than 16 million pixels may also be used.
After the camera is installed on the display device, the contents displayed by different application scenes of the display device can be fused in various different modes, so that the function which cannot be realized by the traditional display device is achieved.
Illustratively, a user may conduct a video chat with at least one other user while watching a video program. The video program may be presented as a background frame over which the video chat window is displayed. Figuratively, this function may be called "chat while watching".
Optionally, in a scene of "chat while watching", at least one video chat is performed across terminals while watching a live video or a network video.
In another example, a user can conduct a video chat with at least one other user while using an educational application for learning. For example, a student may interact remotely with a teacher while learning content in the educational application. Figuratively, this function may be called "chat while learning".
In another example, a user conducts a video chat with other players while playing a card game. For example, a player may enable remote interaction with other players when entering a gaming application to participate in a game. Figuratively, this function may be referred to as "watch while playing".
Optionally, the game scene is fused with the video picture: the portrait in the video picture is cut out and displayed in the game picture, improving the user experience.
Optionally, in motion-sensing games (such as ball games, boxing, running, and dancing), the camera captures the human posture and motion, performs limb detection and tracking, and detects human skeleton key-point data; these are then fused with the animation in the game to realize games in scenes such as sports and dancing.
In another example, a user may interact with at least one other user via video and voice in a karaoke application. Figuratively, this function may be called "sing while watching". Preferably, when at least one user enters the application in a chat scenario, multiple users can jointly complete the recording of a song.
In another example, a user may turn on the local camera to take pictures and videos. Figuratively, this function may be referred to as "looking into the mirror".
In other examples, more or less functionality may be added. The function of the display device is not particularly limited in the present application.
Fig. 2 is a block diagram schematically showing a hardware configuration of the control apparatus 100 according to the embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200: it receives input operation instructions from a user, converts the operation instructions into instructions that the display device 200 can recognize and respond to, and mediates the interaction between the user and the display device 200. For example: the user operates the channel up/down keys on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 100B or another intelligent electronic device may perform a function similar to that of the control apparatus 100 after installing an application for manipulating the display device 200. For example: by installing the application, the user may use various function keys or virtual buttons of a graphical user interface available on the mobile terminal 100B or other intelligent electronic device to implement the functions of the physical keys of the control apparatus 100.
The controller 110 includes a processor 112, a RAM 113, a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control apparatus 100, the communication and coordination among its internal components, and external and internal data processing functions.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communicator 130 may include at least one of a WIFI module 131, a bluetooth module 132, an NFC module 133, and the like.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like. For example: the user can input user instructions through actions such as voice, touch, gesture, and pressing; the input interface converts the received analog signal into a digital signal, converts it into the corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, it may be an infrared interface or a radio frequency (RF) interface. For example: when the infrared signal interface is used, the user input instruction is converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. As another example: when the RF signal interface is used, the user input instruction is converted into a digital signal, modulated according to the RF control signal modulation protocol, and then transmitted to the display device 200 through the RF transmitting terminal.
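As an illustrative sketch of the infrared path, the conversion of a user key press into an IR payload might look like the following, assuming an NEC-style protocol (an address byte and a command byte, each followed by its bitwise inverse so the receiver can validate the frame). The actual protocol and key codes are vendor-specific; the values here are hypothetical.

```python
def nec_ir_payload(address: int, command: int) -> list[int]:
    """Encode an 8-bit device address and key command into the 4-byte
    NEC infrared payload: address, ~address, command, ~command.
    The inverted copies let the receiver validate the frame."""
    if not (0 <= address <= 0xFF and 0 <= command <= 0xFF):
        raise ValueError("address and command must be 8-bit values")
    return [address, address ^ 0xFF, command, command ^ 0xFF]

# Hypothetical "channel up" key on a remote with device address 0x10.
print(nec_ir_payload(0x10, 0x46))  # [16, 239, 70, 185]
```

The payload bytes would then be modulated onto the carrier by the infrared sending module before transmission.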
In some embodiments, the control device 100 includes at least one of a communicator 130 and an output interface. With the communicator 130 configured in the control device 100, modules such as WIFI, Bluetooth, and NFC may send the user input command to the display device 200 encoded with the WIFI protocol, the Bluetooth protocol, or the NFC protocol.
And a memory 190 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 190 may store various control signal commands input by a user.
And a power supply 180 for providing operational power support to the components of the control apparatus 100 under the control of the controller 110; the power supply 180 may include a battery and associated control circuitry.
A hardware configuration block diagram of a hardware system in the display device 200 according to the embodiment is exemplarily shown in fig. 3.
When a dual hardware system architecture is adopted, the structural relationship of the hardware system can be as shown in fig. 3. For convenience of description, one hardware system in the dual hardware system architecture will be referred to as the first hardware system, A system, or A chip, and the other will be referred to as the second hardware system, N system, or N chip. The A chip comprises a controller of the A chip and various modules connected to that controller through various interfaces, and the N chip comprises a controller of the N chip and various modules connected to that controller through various interfaces. The A chip and the N chip may each have a separate operating system installed, so that there are two independent but interrelated subsystems in the display apparatus 200.
As shown in fig. 3, the a chip and the N chip may be connected, communicated and powered through a plurality of different types of interfaces. The interface type of the interface between the a chip and the N chip may include a General-purpose input/output (GPIO) interface, a USB interface, an HDMI interface, a UART interface, and the like. One or more of these interfaces may be used for communication or power transfer between the a-chip and the N-chip. For example, as shown in fig. 3, in the dual hardware system architecture, the N chip may be powered by an external power source (power), and the a chip may not be powered by the external power source but by the N chip.
In addition to the interface for connecting with the N chip, the a chip may further include an interface for connecting other devices or components, such as an MIPI interface for connecting a Camera (Camera) shown in fig. 3, a bluetooth interface, and the like.
Similarly, in addition to the interface for connecting with the A chip, the N chip may further include a V-by-One interface for connecting with the display screen TCON (Timing Controller), an I2S interface for connecting with a power Amplifier (AMP) and a Speaker, and an IR/Key interface, a USB interface, a WiFi interface, a Bluetooth interface, an HDMI interface, a Tuner interface, and the like.
The dual hardware system architecture of the present application is further described below with reference to fig. 4. It should be noted that fig. 4 is only an exemplary illustration of the dual hardware system architecture of the present application, and does not represent a limitation of the present application. In actual practice, both hardware systems may contain more or less hardware or interfaces as desired.
A block diagram of the hardware architecture of the display device 200 according to fig. 3 is exemplarily shown in fig. 4. As shown in fig. 4, the hardware system of the display device 200 may include an a chip and an N chip, and a module connected to the a chip or the N chip through various interfaces.
The N-chip may include a tuner demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 270, and a power supply. The N-chip may also include more or fewer modules in other embodiments.
The tuner demodulator 220 performs modulation and demodulation processing, such as amplification, mixing, and resonance, on broadcast television signals received in a wired or wireless manner, so as to demodulate, from among multiple wireless or wired broadcast television signals, the audio/video signal carried on the frequency of the television channel selected by the user, as well as additional information (e.g., an EPG data signal). Depending on the broadcast system of the television signal, the signal path of the tuner demodulator 220 may vary: terrestrial broadcasting, cable broadcasting, satellite broadcasting, Internet broadcasting, or the like. Depending on the modulation type, the modulation mode of the signal may be digital or analog. Depending on the type of television signal received, the tuner demodulator 220 may demodulate analog signals and/or digital signals.
The tuner demodulator 220 is further configured, under the control of the controller 210 and in accordance with the user's selection, to respond to the television channel frequency selected by the user and the television signal carried on that frequency.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the external device interface 250.
The communicator 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 230 may include a WIFI module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The display apparatus 200 may establish a connection of a control signal and a data signal with an external control apparatus or a content providing apparatus through the communicator 230. For example, the communicator may receive a control signal of the remote controller 100A according to the control of the controller.
The external device interface 250 is a component providing data transmission between the controller 210 of the N chip and the A chip, as well as other external devices. The external device interface may be connected with external apparatuses such as a set-top box, a game device, or a notebook computer in a wired/wireless manner, and may receive data such as a video signal (e.g., a moving image), an audio signal (e.g., music), and additional information (e.g., an EPG) from the external apparatus.
The external device interface 250 may include: a High Definition Multimedia Interface (HDMI) terminal 251, a Composite Video Blanking Sync (CVBS) terminal, such as any one or more of an AV interface 252, an analog or digital component terminal 353, a Universal Serial Bus (USB) terminal 254, a Red Green Blue (RGB) terminal (not shown in the figure), and the like. The number and type of external device interfaces are not limited by this application.
The controller 210 controls the operation of the display device 200 and responds to the user's operation by running various software control programs (e.g., an operating system and/or various application programs) stored on the memory 290.
As shown in fig. 4, the controller 210 includes a random access memory RAM 214, a read-only memory ROM 213, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus. The RAM 214, the ROM 213, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected via the bus.
A ROM 213 for storing instructions for various system boots. When the display device 200 is powered on upon receipt of the power-on signal, the CPU processor 212 executes the system boot instructions in the ROM and copies the operating system stored in the memory 290 into the RAM 214 to start running the operating system. After the operating system has started, the CPU processor 212 copies the various application programs in the memory 290 into the RAM 214 and then starts running them.
A graphics processor 216 for generating various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which generates the various objects based on the arithmetic unit and displays the rendered result on the display 280.
A CPU processor 212 for executing the operating system and application program instructions stored in the memory 290, and for executing various application programs, data, and content according to the various interactive instructions received from the outside, so as to finally display and play various audio and video content.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include a main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in the pre-power-up mode and/or displays the screen in the normal mode. The one or more sub-processors perform operations in the standby mode and the like.
The communication interfaces may include a first interface 218-1 through an nth interface 218-n. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to an icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
The memory 290 stores various software modules for driving and controlling the display apparatus 200, for example: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The basic module is a bottom-layer software module for signal communication between the various hardware components in the display device 200 and for sending processing and control signals to upper-layer modules. The detection module is a management module for collecting various information from sensors or the user input interface, and for performing digital-to-analog conversion and analysis management.
For example: a voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module controls the display 280 to display image content, and may be used to play multimedia image content, UI interfaces, and other information. The communication module performs control and data communication with external devices. The browser module performs data communication with browsing servers. The service module provides various services and application programs.
Meanwhile, the memory 290 is also used to store received external data and user data, images of the various items in various user interfaces, visual effect maps of a focus object, and the like.
A user input interface for transmitting an input signal of a user to the controller 210 or transmitting a signal output from the controller to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may send an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by a user to the user input interface, and then the input signal is forwarded to the controller by the user input interface; alternatively, the control device may receive an output signal such as audio, video, or data output from the user input interface via the controller, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 280.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio/video data stream; for example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal and an audio signal.
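The demultiplexing step can be sketched as follows for an MPEG transport stream, which carries fixed 188-byte packets, each tagged with a 13-bit PID identifying its elementary stream. The PID values in the example are hypothetical; a real demultiplexer learns them from the stream's PAT/PMT tables.

```python
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def demux(ts: bytes, video_pid: int, audio_pid: int):
    """Split a transport stream into lists of video and audio packets.
    The 13-bit PID occupies the low 5 bits of byte 1 plus all of byte 2."""
    video, audio = [], []
    for i in range(0, len(ts) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = ts[i:i + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            continue  # skip packets that have lost sync
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        if pid == video_pid:
            video.append(pkt)
        elif pid == audio_pid:
            audio.append(pkt)
    return video, audio
```

The separated video and audio packet streams would then be handed to the video decoding module and the audio processor respectively.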
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module superimposes and mixes the GUI signal, generated by the graphics generator in response to user input, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting a 24 Hz, 25 Hz, 30 Hz, or 60 Hz input into a 60 Hz, 120 Hz, or 240 Hz output, where the input frame rate may be related to the source video stream and the output frame rate may be related to the refresh rate of the display. The conversion is typically implemented by inserting frames.
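The simplest form of frame insertion is to repeat each source frame a varying number of times. The sketch below only illustrates the rate arithmetic of such a repetition cadence; real frame rate converters typically synthesize interpolated frames rather than repeating them.

```python
def repeat_cadence(src_hz: int, dst_hz: int) -> list[int]:
    """For upconversion by frame repetition, return how many refresh
    cycles each of the src_hz source frames occupies per second,
    distributing the dst_hz output slots as evenly as possible."""
    cadence, shown = [], 0
    for n in range(1, src_hz + 1):
        target = n * dst_hz // src_hz  # output slots consumed so far
        cadence.append(target - shown)
        shown = target
    return cadence

# 24 fps film on a 60 Hz panel gives the classic 2:3 pulldown pattern.
print(repeat_cadence(24, 60)[:4])  # [2, 3, 2, 3]
```

For a 30 Hz input on a 60 Hz panel the cadence degenerates to showing every frame exactly twice.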
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example converting the output into an RGB data signal.
And a display 280 for receiving the image signal input from the video processor 260-1 and displaying video content, images, and the menu manipulation interface. The display 280 includes a display component for presenting the picture and a driving component for driving image display. The displayed video content may come from the broadcast signal received by the tuner demodulator 220, or from video content input through the communicator or the external device interface. The display 280 also displays the user manipulation interface (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
And, a driving component for driving the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The audio processor 260-2 is configured to receive an audio signal, decompress and decode the audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification and other audio data processing to obtain an audio signal that can be played in the speaker 272.
An audio output interface 270 for receiving the audio signal output by the audio processor 260-2 under the control of the controller 210. The audio output interface may include a speaker 272, or an external sound output terminal 274 for output to a sound-generating device of an external apparatus, such as an external sound terminal or an earphone output terminal.
In other exemplary embodiments, video processor 260-1 may comprise one or more chip components. The audio processor 260-2 may also include one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated in one or more chips with the controller 210.
And a power supply for supplying, under the control of the controller 210, power from an external power source to support the display apparatus 200. The power supply may include a built-in power supply circuit installed inside the display apparatus 200, or a power supply installed outside the display apparatus 200, such as a power supply interface in the display apparatus 200 for connecting an external power supply.
Similar to the N chip, as shown in fig. 4, the A chip may include a controller 310, a communicator 330, a detector 340, and a memory 390. In some embodiments, it may also include a user input interface, a video processor 360, an audio processor, a display, and an audio output interface. In some embodiments, there may also be a power supply that powers the A chip independently.
The communicator 330 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 330 may include a WIFI module 331, a bluetooth communication protocol module 332, a wired ethernet communication protocol module 333, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The communicator 330 of the A chip and the communicator 230 of the N chip also interact with each other. For example, the WiFi module 231 of the N chip connects to the external network and establishes network communication with external servers and the like, while the WiFi module 331 of the A chip connects to the WiFi module 231 of the N chip instead of connecting directly to the external network. Therefore, to the user, a display device as in the above embodiment exposes a single WiFi account to the outside.
The detector 340 is a component of the display device's A chip for collecting signals from the external environment or for interacting with the outside. The detector 340 may include a light receiver 342, a sensor for collecting ambient light intensity, which can be used to adapt display parameters; it may further include an image collector 341, such as a camera or video camera, which may be used to collect external environment scenes, collect user attributes or interaction gestures, adaptively change display parameters, and recognize user gestures, so as to implement interaction with the user.
An external device interface 350, which provides a component for data transmission between the controller 310 and the N-chip or other external devices. The external device interface may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner.
The controller 310 controls the operation of the display device 200 and responds to the user's operation by running various software control programs stored on the memory 390 (e.g., using installed third party applications, etc.), and interacting with the N-chip.
As shown in fig. 4, the controller 310 includes a read only memory ROM313, a random access memory RAM314, a graphics processor 316, a CPU processor 312, a communication interface 318, and a communication bus. The ROM313 and the RAM314, the graphic processor 316, the CPU processor 312, and the communication interface 318 are connected via a bus.
A ROM 313 for storing instructions for various system boots. The CPU processor 312 executes the system boot instructions in the ROM and copies the operating system stored in the memory 390 into the RAM 314 to start running the operating system. After the operating system has started, the CPU processor 312 copies the various application programs in the memory 390 into the RAM 314 and then starts running them.
The CPU processor 312 is used for executing the operating system and application program instructions stored in the memory 390, communicating with the N chip, transmitting and interacting signals, data, instructions, etc., and executing various application programs, data and contents according to various interaction instructions received from the outside, so as to finally display and play various audio and video contents.
The communication interfaces may include a first interface 318-1 through an nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or may be network interfaces connected to the N-chip via a network.
The controller 310 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 310 may perform an operation related to the object selected by the user command.
A graphics processor 316 for generating various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which generates the various objects based on the arithmetic unit and displays the rendered result on the display 280.
Both the graphics processor 316 of the A chip and the graphics processor 216 of the N chip can generate various graphics objects. The difference is that, if application 1 is installed on the A chip and application 2 is installed on the N chip, then when a user issues a command within application 1 at the interface of application 1, the graphics object is generated by the graphics processor 316 of the A chip; when a user issues a command within application 2 at the interface of application 2, the graphics object is generated by the graphics processor 216 of the N chip.
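The dispatch described above can be sketched as a lookup from the application's package name to its hosting chip. The package names and the registry structure below are hypothetical, purely to illustrate the routing, not an API of the actual system.

```python
# Hypothetical registry of which chip hosts which installed application.
APP_CHIP = {
    "com.example.application1": "A",  # installed on the A chip
    "com.example.application2": "N",  # installed on the N chip
}

def graphics_processor_for(package_name: str) -> str:
    """Return which graphics processor generates the graphics objects
    for user commands issued inside the given application."""
    chip = APP_CHIP.get(package_name)
    if chip is None:
        raise KeyError(f"unknown application: {package_name}")
    return "graphics processor 316" if chip == "A" else "graphics processor 216"
```

A command issued in application 1 would thus be rendered by the A chip's graphics processor, and a command issued in application 2 by the N chip's.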
Fig. 5 is a diagram exemplarily showing a functional configuration of a display device according to the embodiment.
As shown in fig. 5, the memory 390 of the a-chip and the memory 290 of the N-chip are used to store an operating system, an application program, contents, user data, and the like, respectively, and perform system operations for driving the display device 200 and various operations in response to a user under the control of the controller 310 of the a-chip and the controller 210 of the N-chip. The A-chip memory 390 and the N-chip memory 290 may include volatile and/or non-volatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and store various applications installed in the display device 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an Operating System (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing the drivers for, and related data of, the video processor 260-1, the audio processor 260-2, the display 280, the communication interface 230, the tuner demodulator 220, the input/output interfaces, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a power control module 2910, an operating system 2911, and other application programs 2912, a browser module, and the like. The controller 210 performs functions such as: the system comprises a broadcast television signal receiving and demodulating function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction identification function, a communication control function, an optical signal receiving function, an electric power control function, a software control platform supporting various functions, a browser function and other various functions.
The memory 390 stores various software modules for driving and controlling the display apparatus 200, including: a basic module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like. Since the functions of the memory 390 and the memory 290 are similar, reference may be made to the memory 290 for relevant points, and a detailed description is omitted here.
Illustratively, the memory 390 includes an image control module 3904, an audio control module 3906, an external instruction recognition module 3907, a communication control module 3908, a light receiving module 3909, an operating system 3911, other application programs 3912, a browser module, and the like. The controller 210 executes various functions such as: an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal receiving function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
The difference is that the external instruction recognition module 2907 of the N chip and the external instruction recognition module 3907 of the A chip can recognize different instructions.
Illustratively, since an image receiving device such as a camera is connected to the A chip, the external instruction recognition module 3907 of the A chip may include an image recognition module 3907-1 in which a graphic database is stored; when the camera receives an external graphic instruction, the instruction is matched against the graphic database to perform instruction control on the display device. Since the voice receiving device and the remote controller are connected to the N chip, the external instruction recognition module 2907 of the N chip may include a voice recognition module 2907-2 in which a voice database is stored; when the voice receiving device receives an external voice instruction, the instruction is matched against the voice database to perform instruction control on the display device. Similarly, a control device 100 such as a remote controller is connected to the N chip, and the key instruction recognition module 2907-3 performs instruction interaction with the control device 100.
A block diagram of the configuration of the software system in the display device 200 according to an embodiment is exemplarily shown in fig. 6 a.
For the N chip, as shown in fig. 6a, the operating system 2911 includes operating software for handling various basic system services and performing hardware-related tasks, and serves as an intermediary between application programs and hardware components for data processing.
In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controlling process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application 2912. In some embodiments, it is implemented partly within the operating system 2911 and partly within the application 2912, for listening for various user input events; one or more sets of predefined operations are implemented in response to the recognition of various types of events or sub-events.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-2 is used to store the event definitions for the various user input interfaces, identify various events or sub-events, and transmit them to the processes that execute their corresponding one or more sets of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200 or an input from an external control device (e.g., the control apparatus 100), such as: various sub-events of voice input, gesture sub-events of gesture recognition input, remote control key command input of a control device, and the like. Illustratively, the one or more sub-events from the remote control include a variety of forms, including but not limited to one or a combination of pressing the up/down/left/right keys or the OK key, long key presses, and the like, as well as non-physical key operations such as move, hold, and release.
The interface layout management module 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, which are related to the layout of the interface.
Since the functions of the operating system 3911 of the a chip are similar to those of the operating system 2911 of the N chip, reference may be made to the operating system 2911 for relevant points, and details are not repeated here.
Fig. 6b schematically shows a configuration of an application in a display device according to an embodiment; as shown in fig. 6b, the application layer of the display device contains various applications that can be executed at the display device 200.
The N-chip application layer 2912 may include, but is not limited to, one or more applications such as: a video-on-demand application, an application center, a game application, and the like. The application layer 3912 of the A chip may include, but is not limited to, one or more applications such as: live television applications, media center applications, and the like. It should be noted that which applications are contained in the A chip and the N chip respectively is determined according to the operating system and other designs, and the present application does not specifically limit or divide the applications contained in the A chip and the N chip.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. In addition, the live television application may display the video of the live television signal on the display device 200.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from a storage source. For example, the video on demand may come from the server side of cloud storage or from local hard disk storage containing stored video programs.
The media center application program can provide various applications for playing multimedia content. For example, a media center may provide services other than live television or video on demand, so that a user may access various images or audio through the media center application.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on a display device. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
A schematic diagram of a user interface in a display device 200 according to an embodiment is illustrated in fig. 7. As shown in fig. 7, the user interface includes a plurality of view display areas, illustratively a first view display area 201 and a play screen 202, wherein the play screen includes a layout of one or more different items. A selector in the user interface indicates that an item is selected, and the position of the selector may be moved through user input to change the selection of different items.
It should be noted that the multiple view display areas may present display screens of different hierarchies. For example, a first view display area may present video chat project content and a second view display area may present application layer project content (e.g., web page video, VOD presentations, application screens, etc.).
Optionally, the different view display areas are presented with different priorities, and the display priorities differ among the view display areas with different priorities. For example, the priority of the system layer is higher than that of the application layer; when the user applies the selector and performs picture switching in the application layer, the picture display of the view display area of the system layer is not blocked; and when the size and the position of the view display area of the application layer change according to the user's selection, the size and the position of the view display area of the system layer are not affected.
For convenience of maintenance, the installed application programs may be different in the subsystems corresponding to the a chip and the N chip, for example, the subsystem corresponding to the a chip may focus on improving the extended functions of the display device, such as live tv broadcast, video website client, instant messaging, etc., through the application programs; and the subsystems corresponding to the N chips may focus on the adjustment of the display device itself, such as intelligent brightness adjustment, intelligent color adjustment, picture enhancement, and the like.
Correspondingly, the A chip can be connected to hardware devices corresponding to the extended application programs. For example, an application program named "mirror" may be installed in the subsystem corresponding to the A chip, whose purpose is to acquire image signals through a camera built into the A chip and display them on the screen in real time, so as to achieve a mirror effect. According to industry specifications for cameras, the supportable color gamut space is a fixed standard value; for example, a commonly used camera supports the BT.709 color gamut space. However, with the display requirements of high-definition video, display technology has developed continuously, and the color gamut space supported by displays is generally wider than that of cameras; for example, the color gamut space of a 4K television follows the BT.2020 standard. Thus, the color gamut space to which the image provided by the "mirror" application belongs is BT.709, while the display color gamut space of the television is BT.2020, i.e., there is a color gamut space difference, which is likely to cause color distortion when the "mirror" application is used.
In order to alleviate the problem of color distortion, the present application provides a display device and a color gamut space dynamic adjustment method. Fig. 8 is a schematic flow chart of a color gamut space dynamic adjustment method according to the present application, and fig. 14 is a schematic structural diagram of a display device according to the present application. As can be seen from fig. 8 and fig. 14, the display device provided in the present application includes: a camera, a display, and a controller, wherein the camera is configured to collect environment image data, the display is configured to display a user interface, and a communication connection is established between the controller and the display. The controller is configured to present the user interface and to judge whether the currently started application is a camera-related application. The default display color gamut space of the display is a first color gamut space, and the color gamut space to which the image provided by the camera-related application belongs is a second color gamut space different from the first color gamut space. If the currently started application is a camera-related application, the display color gamut space of the display is converted into the second color gamut space.
Correspondingly, the color gamut space dynamic adjustment method provided by the application comprises the following steps:
s11: and acquiring the current starting application information.
According to the technical scheme, the starting condition of the application program can be monitored in real time through a predefined process so as to obtain the current starting application information. In the application, the current application starting information can be represented by the package name information.
For example, when any application program is started, the currently started application program is determined by acquiring the corresponding package name information. The package name information is a program description file preset according to the model state of the display device, and may be a character string that can be acquired in the system memory when the corresponding application program is started. For example, the package name information corresponding to the "mirror" program may be "com.hisense.tv.mirror".
For a dual system display device, package name information corresponding to its application program may be stored in a package name set file created or stored by the second hardware system. The package name set file may be stored by a partition in the second hardware system. For example, in a subsystem corresponding to the N chip, a partition named tvconfig is divided in advance, and an XML file is stored in the tvconfig partition as a package name set file according to the model of the display device. The XML file stores package name information corresponding to each application program. The package name set file may be used for dynamic configuration to inform the a-chip that an application such as "mirror" needs to adjust the color gamut space after it is started.
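As a sketch of such a package name set file, the XML below shows one plausible layout and how the first hardware system might parse it; the tag and attribute names, and the gamut value paired with each package, are illustrative assumptions — the description only specifies that the file records package names and the color gamut spaces of the images the applications provide.

```python
# Hypothetical sketch of a package name set file (e.g. the XML stored
# in the tvconfig partition) and a parser for it. Tag/attribute names
# are assumptions for illustration only.
import xml.etree.ElementTree as ET

PACKAGE_SET_XML = """
<packages>
    <app package="com.hisense.tv.mirror" gamut="BT.709"/>
    <app package="com.hisense.tv.movie"  gamut="BT.2020"/>
</packages>
"""

def load_package_set(xml_text):
    """Parse the package name set file into a {package name: gamut} map."""
    root = ET.fromstring(xml_text)
    return {app.get("package"): app.get("gamut") for app in root.findall("app")}

gamut_by_package = load_package_set(PACKAGE_SET_XML)
```

With such a map in hand, the A-chip side only needs a dictionary lookup per launched package name to learn the gamut of the image that application provides.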
In some embodiments of the present application, as shown in fig. 9, the step of obtaining name information of a currently-started application package further includes the following steps:
s101: the first hardware system acquires a package name set file from the second hardware system; the package name set file comprises package name information of all application programs and a color gamut space to which the provided image belongs;
s102: the first hardware system loads the package name set file to monitor the starting state of each application program according to the package name set file;
s103: and the first hardware system extracts the currently started application program package name information from the package name set file.
In this embodiment, the first hardware system corresponding to the A chip may extract the package name set file, that is, the XML file, from the partition of the second hardware system corresponding to the N chip. In order to facilitate the subsequent determination of the color gamut space to which the image provided by the application program belongs, in this embodiment, both the package name information and the color gamut space to which the provided image belongs may be preset in the package name set file. For example, for the "mirror" application, the corresponding entry in the package name set file may record the package name "com.hisense.tv.mirror" together with the color gamut space BT.709 to which its image belongs.
Optionally, an Activity Monitor Service module may be configured in the first hardware system corresponding to the A chip. The module starts to run after the system is booted and reads the package name set file in the second hardware system through an RPC communication interface, so as to determine the application package name information for which a color gamut space needs to be set. After the package name set file is obtained, the starting state of each application program can be monitored according to the package name set file. When any application program is started, the Activity Monitor Service module can extract the package name information corresponding to that application program from the package name set file. Further, the color gamut space to which the image provided by the application program belongs can be extracted from the package name set file.
In order to realize the above functions, a Message Trans Service module may be built into the second hardware system corresponding to the N chip; after startup, the module reads the package name information of the "mirror" application from the data partition and stores it in the memory for the first hardware system corresponding to the A chip to perform subsequent judgment.
Since the N chip is a chip that focuses on the system control functions themselves, the second hardware system corresponding to the N chip may be a basic system. Further, the cloud server can communicate with the second hardware system corresponding to the N chip to realize the cloud service of the display device. Correspondingly, by acquiring the package name set file from the second hardware system, the first hardware system can realize the separation of the N-chip data, so that which application programs need their color gamut space adjusted can be dynamically configured; when the application programs are updated, the cloud service is configured through the N-chip system, so that the package name set file is updated in real time, facilitating subsequent application expansion. In addition, this management mode of the package name set file adapts to the package names acquired from the second hardware system corresponding to the N chip and sets the application programs uniformly, without needing to set each application program individually, thereby facilitating management and control.
Thus, as shown in fig. 10, in some embodiments of the present application, the method further comprises the steps of:
s104: the second hardware system creates a package name set file according to the currently installed application program, or acquires the package name set file from the cloud server;
s105: storing the package name set file;
s106: and sending the package name set file to the first hardware system through a communication interface.
In this embodiment, the package name set file may be automatically created by the second hardware system according to the currently installed application program, or may be uniformly configured by the cloud server. The cloud server can be used for uniformly managing the package name set file by a manufacturer so as to uniformly issue the updating service. After the package name set file is created or acquired, the second hardware system may store the package name set file, so that the file is sent to the first hardware system through the communication interface after each startup, and is used for the first hardware system corresponding to the a chip to perform judgment.
As can be seen, in this embodiment, the package name set file is maintained by the second hardware system corresponding to the N chip, and the update data can be obtained directly from the cloud server through the second hardware system, so as to continuously improve the package name set file. Meanwhile, the occupation of the memory space of the first hardware system can be reduced, so that the first hardware system can run application programs more smoothly.
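Steps S104-S106 can be sketched as follows; the class and method names are assumptions, and the communication interface is modeled as a plain callback rather than the actual RPC mechanism:

```python
# Illustrative sketch of S104-S106: the second hardware system builds a
# package name set from the currently installed applications, stores it,
# and sends it to the first hardware system over a communication
# interface (modeled here as a callback). Names are hypothetical.
def create_package_set(installed_apps):
    """installed_apps: iterable of (package_name, provided_gamut) pairs."""
    return {pkg: gamut for pkg, gamut in installed_apps}

class SecondHardwareSystem:
    def __init__(self, send_interface):
        self._send = send_interface   # stands in for the RPC interface
        self._stored_set = {}

    def boot(self, installed_apps):
        self._stored_set = create_package_set(installed_apps)  # S104
        # S105: "storing" is modeled as keeping the dict on this object
        self._send(self._stored_set)                           # S106

received = {}
sys2 = SecondHardwareSystem(received.update)
sys2.boot([("com.hisense.tv.mirror", "BT.709"),
           ("com.hisense.tv.movie", "BT.2020")])
```

In the real device the stored set would live in the tvconfig partition and the send step would run after each boot, as described above.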
In some embodiments of the present application, as shown in fig. 11, the second hardware system may update the package name set file to implement application expansion by the following steps:
s1041: the second hardware system acquires package name updating data from the cloud server;
s1042: modifying the stored package name set file according to the updating data;
s1043: and sending the modified package name set file to the first hardware system through a communication interface.
In this embodiment, the second hardware system corresponding to the N chips may obtain the update data of the package name set file in a manner of downloading from the cloud server, and the specific update manner may be periodically queried and downloaded by the second hardware system or may be uniformly issued in real time by the cloud server. After the update data is acquired, the second hardware system may modify the stored package name set file according to the acquired update data, and replace the package name information and the color gamut space data to which the image provided by the application program belongs in the package name set file. And after the data is replaced, after the system is started, the modified package name set file can be sent to the first hardware system through the communication interface, so that the first hardware system can perform subsequent judgment according to the modified package name set file, and the color gamut space to which the image provided by the application program belongs is determined.
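The update merge of steps S1041-S1043 can be sketched as a dictionary merge (the stored file is modeled as a dict; the package names in the update are illustrative):

```python
# Minimal sketch of merging cloud update data into the stored package
# name set file: existing entries are replaced, new entries are added.
def apply_update(stored_set, update_data):
    """Return a new set with entries replaced/extended by the update."""
    merged = dict(stored_set)
    merged.update(update_data)
    return merged

stored = {"com.hisense.tv.mirror": "BT.709"}
update = {"com.hisense.tv.mirror": "BT.2020",  # hypothetical revised entry
          "com.example.newcam": "BT.709"}      # hypothetical new app
modified = apply_update(stored, update)
```

The modified set is then what gets sent to the first hardware system through the communication interface after startup.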
S12: and judging whether the current started application is related to the camera or not, wherein the color gamut space to which the image provided by the related camera application belongs is a second color gamut space different from the default display color gamut space.
In the technical solution provided by the present application, the color gamut space to which the image provided by the currently-started application belongs is either a first color gamut space that is the same as the default display color gamut space, or a second color gamut space that is different from the default display color gamut space. The first color gamut space is the default display color gamut space, and an application belonging to the first color gamut space is an application that does not need color gamut space conversion. For example, a "movie" application or a "live television" application needs to be displayed through a wide color gamut in order to highlight the layering of picture colors; therefore, the color gamut space to which an image provided by the "movie" application belongs should be the same as the display color gamut space, i.e., the maximum color gamut space supported by the display device. Accordingly, based on the package name information "com.hisense.tv.movie" of the "movie" application, the color gamut space to which the image provided by the "movie" application belongs may be determined to be the first color gamut space.
The second color gamut space refers to a color gamut space different from the default display color gamut space, for which color gamut conversion is needed to obtain a better display effect; various applications may belong to the second color gamut space. For example, the "mirror" application is limited by the maximum color gamut space that the camera hardware can provide, and the maximum color gamut space supported by the "mirror" application may not reach the range of the display color gamut space; that is, the color gamut space to which the image provided by the "mirror" application belongs is different from the display color gamut space and is the maximum color gamut space that the built-in or external device can reach. Therefore, according to the package name information "com.hisense.tv.mirror" of the "mirror" application, the color gamut space to which the image provided by the "mirror" application belongs can be determined to be the second color gamut space.
It should be noted that, in the technical solution provided in the present application, the display color gamut space may be the color gamut space supported by the current display device, and the first color gamut space and the second color gamut space may be a unified classification performed on each application in advance. If the color gamut space to which the image provided by the application program belongs is the same as the display color gamut space, the corresponding application belongs to the first color gamut space; if it is different from the display color gamut space, the corresponding application belongs to the second color gamut space. For example, if the maximum color gamut space standard of the display device is BT.2020, then applications with a non-BT.2020 color gamut space such as "mirror" are assigned to the classification of the second color gamut space, while applications with the BT.2020 standard color gamut space such as "movie", or other applications without a limitation on color gamut space, belong to the category of the first color gamut space.
In the technical solution provided in the present application, the first color gamut space and the second color gamut space may also be two relative state classifications generated by comparing the color gamut space supported by the image provided by the application with the current display color gamut space; that is, the same image provided by an application in different use stages may belong to different color gamut spaces. For example, if the currently launched application is "mirror", the maximum color gamut space supported by the provided image is BT.709, and the current display color gamut space is also BT.709, then the color gamut space to which the image provided by the "mirror" application belongs is the first color gamut space. For another example, if the currently launched application is "mirror", the maximum color gamut space supported by the provided image is BT.709, and the current display color gamut space is BT.2020, then the color gamut space to which the image provided by the "mirror" application belongs is the second color gamut space.
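This relative classification reduces to a single comparison; a minimal sketch (function name is an assumption):

```python
# Sketch of the relative classification: the same application image
# belongs to the "first" or "second" color gamut space depending on
# the current display color gamut space.
def classify_gamut(app_gamut, display_gamut):
    """Return 'first' if no conversion is needed, else 'second'."""
    return "first" if app_gamut == display_gamut else "second"

# "mirror" provides BT.709; the classification flips with the display:
classify_gamut("BT.709", "BT.709")   # -> "first"  (no conversion needed)
classify_gamut("BT.709", "BT.2020")  # -> "second" (conversion needed)
```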
In the technical scheme provided by the application, if the display device comprises the dual system, the color gamut space to which the image provided by the application program belongs can be determined through the first system corresponding to the a chip. The specific determination method may be that the first hardware system acquires packet name information corresponding to the currently started application program after monitoring the currently started application program. And according to the package name information, matching the color gamut space supported by the current application in the package name set file, and then determining whether the color gamut space to which the image provided by the current application belongs is a first color gamut space or a second color gamut space by comparing the color gamut space supported by the application program with the current display color gamut space.
In some embodiments of the present application, as shown in fig. 12, the step of determining whether the currently-started application is a camera related application includes:
s201: acquiring package name information of a currently started application;
s202: matching preset camera related application package name information according to the package name information;
s203: and if the package name information is the same as the package name information of the related application of the preset camera, determining that the current application program is the related application of the camera.
In this embodiment, the package name information of the currently started application may be matched against the preset package name information of camera-related applications, and whether the current application is a camera-related application is determined according to the matching result. For example, if the package name information of the currently started application is "com.hisense.tv.mirror" and this matches an entry in the preset camera-related application package name information, the current application program is determined to be a camera-related application.
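Steps S201-S203 amount to a set-membership test; a minimal sketch, where the contents of the preset set beyond the "mirror" package name are assumptions:

```python
# Sketch of S201-S203: match the launched application's package name
# against the preset camera-related package names.
CAMERA_RELATED_PACKAGES = {"com.hisense.tv.mirror"}  # preset list (assumed)

def is_camera_related(package_name, preset=CAMERA_RELATED_PACKAGES):
    """S202/S203: the app is camera-related iff its package name matches."""
    return package_name in preset
```

Using a set keeps the per-launch check O(1) regardless of how many entries the package name set file accumulates.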
S13: and if so, converting the display color gamut space of the display into the second color gamut space.
In the technical scheme provided by the present application, in order to relieve the color difference, the display color gamut space of the display can be converted into the second color gamut space by software. For example, the currently launched application is "mirror", the maximum color gamut space supported by the application is BT.709, the current display color gamut space is BT.2020, and the color gamut space to which the image provided by the current application belongs is determined to be the second color gamut space. At this time, if the image provided by the "mirror" application is displayed by the wide-color-gamut display device, color distortion easily occurs. Therefore, the display color gamut space can be set to the second color gamut space by adjusting the display color gamut space of the display device, that is, the current display color gamut space of the display device is adjusted to BT.709, so as to alleviate the color gamut space difference between the application program and the display device and alleviate the color distortion problem.
In the technical scheme provided by the present application, if the display device is the above dual-system device, since the second hardware system corresponding to the N chip is mainly used for controlling and adjusting the basic system functions, the display color gamut space can be adjusted to the second color gamut space through the second hardware system corresponding to the N chip. Optionally, a process, for example an hstvos process, may be defined in the second hardware system; this process keeps running in the background after the system is booted, and when the first hardware system corresponding to the A chip determines that the currently started application program needs the color gamut space adjusted, a setting signal may be transmitted through the RPC communication interface to activate the hstvos process to perform the setting, so as to adjust the display color gamut space to the second color gamut space.
It should be noted that the present application describes the color gamut space dynamic adjustment method by taking the startup procedure of the "mirror" application as an example. In practical applications, different adjustments to the display color gamut space may be performed according to the specific application environment. For example, when the display device switches from the "mirror" application to a "movie" application, the display color gamut space has been adjusted to BT.709 to accommodate "mirror", whereas "movie" applications require a wide color gamut to improve display quality; therefore the display color gamut space also needs to be restored to the BT.2020 standard color gamut space. That is, in some embodiments of the present application, the method further comprises: the second hardware system sets the display color gamut space as the second color gamut space through a setting interface; or the second hardware system sets the display color gamut space as the first color gamut space through a restoration interface.
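The setting and restoration interfaces can be sketched as a small controller object; the class and method names are assumptions standing in for the second hardware system's actual interfaces:

```python
# Sketch of the set/restore interfaces: one call sets the display
# color gamut space, another restores the default (first) gamut.
class DisplayGamutController:
    def __init__(self, default_gamut="BT.2020"):
        self._default = default_gamut   # first color gamut space
        self.current = default_gamut

    def set_gamut(self, gamut):         # "setting interface"
        self.current = gamut

    def restore(self):                  # "restoration interface"
        self.current = self._default

ctrl = DisplayGamutController()
ctrl.set_gamut("BT.709")  # entering the "mirror" application
ctrl.restore()            # returning to a wide-gamut application
```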
In addition, for the display device provided by the present application, camera-related applications account for only a small proportion of all applications, and they are not used frequently; in most cases, the display color gamut space of the display device is not adjusted. Therefore, after an application program that needs the color gamut space adjusted has finished running, the display color gamut space can be automatically adjusted back to the first color gamut space, so that the display color gamut space is the maximum color gamut space supported by the display device.
That is, in some embodiments of the present application, as shown in fig. 13, the method further includes the steps of:
s401: receiving a stop instruction of the camera related application;
s402: and if the stop instruction is received, reducing the display color gamut space into the first color gamut space.
The stop instruction is an instruction automatically generated when the camera-related application program exits. For example, after using the "mirror" application, the user exits it. At this time, the first hardware system may detect the stop instruction generated when the "mirror" application exits by monitoring the running state of the application in the memory. The stop instruction may be generated by determining that the application package name information in the memory is no longer in an activated state, or according to the operating state of the user controller, for example generating a stop instruction when it is detected that the exit key of a television remote controller is pressed. In response to receiving the stop instruction, it may be determined whether the application program has exited; if it has, the display color gamut space may be restored to the default display color gamut space.
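Steps S401 and S402 above can be sketched as a small restore-on-exit check. The package names and the dictionary-based display state are assumptions introduced purely for illustration.

```python
def handle_app_state(active_packages, camera_app_pkg, display):
    """If the camera-related application is no longer active (the stop
    condition), reduce the display gamut back to the default. `display`
    is an illustrative mutable state record."""
    if camera_app_pkg not in active_packages:  # stop-instruction condition
        display["gamut"] = display["default"]  # restore the first gamut space
    return display

display = {"gamut": "BT.709", "default": "BT.2020"}
handle_app_state(
    active_packages={"com.example.movie"},      # hypothetical active set
    camera_app_pkg="com.example.mirror",        # hypothetical package name
    display=display,
)
```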
Adjusting the display color gamut space for an application includes both setting and restoration, which may occur when an application is opened, switched, or closed. Switching between application programs covers multiple situations, and corresponding control may be performed for each. For example, when the current display device switches from the "mirror" application to the "movie" application, since the color gamut spaces of the images provided by the two applications are different, the second hardware system is required to adjust the display color gamut space to BT.2020, corresponding to the "movie" application; if the switch is from the "movie" application to the "mirror" application, the second hardware system is required to adjust the display color gamut space from BT.2020 to BT.709; if the switch is from the "movie" application to a live television application, the color gamut spaces to which the images provided by the two applications belong are the same, so the display color gamut space does not need to be adjusted, which improves the application switching speed.
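The switching cases above reduce to a simple rule: adjust only when the source and target applications provide images in different gamut spaces. A sketch, with an illustrative application-to-gamut mapping:

```python
APP_GAMUT = {            # illustrative mapping, not from the patent
    "mirror": "BT.709",
    "movie": "BT.2020",
    "live_tv": "BT.2020",
}

def gamut_on_switch(from_app, to_app):
    """Return the target gamut if an adjustment is needed on switching,
    or None when both applications share a gamut (no-op, faster switch)."""
    src, dst = APP_GAMUT[from_app], APP_GAMUT[to_app]
    return dst if src != dst else None

print(gamut_on_switch("mirror", "movie"))   # -> BT.2020 (adjust)
print(gamut_on_switch("movie", "live_tv"))  # -> None (same gamut, skip)
```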
According to the above technical solution, the starting state of application programs in the display device is monitored in real time to determine whether the currently started application is camera related; if so, the display color gamut space of the display is converted into the second color gamut space provided by the camera-related application, or the environment image data collected by the camera is subjected to color gamut space conversion. This method can dynamically adjust the display color gamut space according to the started application program, reduce the difference between the display color gamut space and the color gamut space corresponding to the application program, and alleviate the problem of color distortion.
In another embodiment of the present application, color gamut space conversion may instead be performed on the environment image data collected by the camera, so that the environment image data conforms to the color gamut space of the display. Therefore, in the present application, the dynamic adjustment may also be performed according to the following steps:
s21: acquiring current starting application information and environment image data acquired by a camera;
s22: judging whether the currently started application is camera related or not, wherein the color gamut space to which images provided by camera-related applications belong is a second color gamut space different from the default display color gamut space;
s23: and if so, performing color gamut space conversion on the environment image data acquired by the camera.
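One standard way to realize the conversion in step s23 is to map linear (gamma-removed) BT.709 RGB into BT.2020 primaries with a 3x3 matrix; the rounded coefficients below follow the widely published ITU-R BT.2087 conversion. This is a sketch of the principle, not the patent's specific implementation, which is not disclosed at this level of detail.

```python
# Rounded linear-light BT.709 -> BT.2020 conversion matrix (per ITU-R BT.2087).
BT709_TO_BT2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def convert_pixel(rgb709):
    """Convert one linear BT.709 pixel into BT.2020 primaries."""
    return tuple(
        sum(m * c for m, c in zip(row, rgb709)) for row in BT709_TO_BT2020
    )

# Each matrix row sums to ~1.0, so reference white maps to reference white.
white = convert_pixel((1.0, 1.0, 1.0))
```

Applying this per pixel (after linearization, and re-applying the transfer function afterwards) makes camera imagery display correctly on a wide-gamut panel without changing the display's own gamut setting.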
Corresponding to the dynamic adjustment method of the color gamut space, an embodiment of the present application further provides a display device for implementing the adjustment method. The display device comprises a display, a camera, and a controller in communication connection with the display. The display is configured to display a user interface and environment image data, and its default display color gamut space is a first color gamut space; the camera is configured to collect environment image data, and the color gamut space to which images provided by the camera belong is a second color gamut space different from the first color gamut space; the controller is configured to present the user interface and determine whether the currently launched application is a camera-related application, and, if so, to perform color gamut space conversion on the environment image data collected by the camera.
The difference between this embodiment and the above embodiments is that dynamic adjustment is implemented by performing color gamut space conversion on the environment image data collected by the camera. For example, the colors in the image are transformed according to the default display color gamut space, thereby eliminating the difference between the environment image data and the default display color gamut space and mitigating color deviation.
The display device provided by the present application may be any device with a display function, such as a smart television, a smart display, or a smart projection device. The display in the display device may be a liquid crystal screen, an LED screen, a projection lens, or the like, and converts the input image signal into a specific picture for display. The controller is a control device built into the display device that can transmit, receive, and compute data. The controller contains a processor chip with computing capability, such as a CPU, a single-chip microcomputer, or a programmable logic controller, together with running memory, storage, and various functional interfaces matched with the processor chip. The hardware specifications of the controller differ according to the specific model of the display device, so as to meet actual data processing requirements.
The controller also supports converting the environment image data collected by the camera and adjusting the color gamut space of the display, so as to control the display screen to switch between the first color gamut space and the second color gamut space. Various application programs may be installed in the memory of the display device and may be called and run by the controller to realize various functions. The applications in the memory include the control programs of the system itself as well as additional programs and third-party applications installed as extensions. The display device may be connected, internally or externally, with other components to meet the use requirements of different application programs. For example, a camera may be built into the display device to capture images; the camera is connected with the controller and sends the acquired images to it, and the controller then decodes and otherwise processes the images before sending them to the display screen for display.
The display device may also have a built-in control device and communication device, such as an infrared remote control receiver and a network interface, through which the picture display function of the display device can be controlled, and through which the applications installed in the memory can be updated or uninstalled to meet different use requirements.
Optionally, the controller is further configured to:
obtaining exit information of the camera related application;
judging whether the camera related application is quitted or not according to the quitting information;
if so, the display color gamut space is reduced to the default display color gamut space.
Optionally, the controller is further configured to:
acquiring package name information of a currently started application;
matching preset camera related application package name information according to the package name information;
and if the package name information is the same as the package name information of the related application of the preset camera, determining that the current application program is the related application of the camera.
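The package-name match described in the three steps above can be sketched as a simple lookup against the preset set of camera-related package names. The package names and the mapping structure here are hypothetical examples, not values from the patent.

```python
# Illustrative preset: camera-related package names and the gamut space
# to which the images they provide belong.
CAMERA_APP_GAMUTS = {
    "com.example.mirror": "BT.709",
    "com.example.videochat": "BT.709",
}

def is_camera_related(pkg):
    """Return True when the launched application's package name matches
    a preset camera-related package name."""
    return pkg in CAMERA_APP_GAMUTS

print(is_camera_related("com.example.mirror"))  # -> True
print(is_camera_related("com.example.movie"))   # -> False
```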
Optionally, the display device is provided with a first hardware system and a second hardware system inside, and the controller is further configured to control the first hardware system to execute the following program steps:
acquiring a package name set file from a second hardware system; the package name set file comprises package name information of all camera related applications and a color gamut space to which the provided images belong;
loading the package name set file to monitor the starting state of each application according to the package name set file;
and extracting the package name information of the currently started application from the package name set file.
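The first hardware system's side of this exchange can be sketched as loading the package name set file once and then looking up each started application in it. The JSON shape and field names are assumptions for illustration; the patent does not fix a file format.

```python
import json

# Assumed shape of the "package name set file": camera-related package
# names together with the gamut space of the images they provide.
PACKAGE_SET_JSON = '''
{
  "com.example.mirror": {"gamut": "BT.709"},
  "com.example.makeup": {"gamut": "BT.709"}
}
'''

package_set = json.loads(PACKAGE_SET_JSON)  # loaded once, e.g. at boot

def lookup_started_app(pkg):
    """Extract the entry for the currently started application, or None
    when the application is not camera related."""
    return package_set.get(pkg)

entry = lookup_started_app("com.example.mirror")
```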
Optionally, the controller is further configured to control the second hardware system to perform the following program steps:
creating a package name set file according to the currently installed application, or acquiring the package name set file from a cloud server;
storing the package name set file;
and sending the package name set file to the first hardware system through a communication interface.
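The second hardware system's three steps above (create, store, send) can be sketched as follows; the installed-application record format and the JSON payload are illustrative assumptions, with serialized bytes standing in for the transfer over the communication interface.

```python
import json

def build_package_set(installed_apps):
    """Create the package name set from the currently installed
    applications. `installed_apps` is an illustrative iterable of
    (package_name, uses_camera, gamut) tuples."""
    return {
        pkg: {"gamut": gamut}
        for pkg, uses_camera, gamut in installed_apps
        if uses_camera
    }

apps = [
    ("com.example.mirror", True, "BT.709"),   # hypothetical entries
    ("com.example.movie", False, "BT.2020"),
]
package_set = build_package_set(apps)          # create
payload = json.dumps(package_set).encode()     # store/serialize for sending
```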
Optionally, the controller is further configured to control the second hardware system to perform the following program steps:
acquiring package name updating data from a cloud server;
modifying the stored package name set file according to the updating data;
and sending the modified package name set file to the first hardware system through a communication interface.
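The update flow above can be sketched as merging cloud-provided update data into the stored package name set before re-sending it. The add/remove update format is an assumption; the patent only states that the stored file is modified according to the update data.

```python
def apply_update(package_set, update):
    """Modify the stored package name set in place according to update
    data fetched from the cloud server (assumed add/remove format)."""
    for pkg, info in update.get("add", {}).items():
        package_set[pkg] = info          # newly released camera app
    for pkg in update.get("remove", []):
        package_set.pop(pkg, None)       # delisted camera app
    return package_set

stored = {"com.example.mirror": {"gamut": "BT.709"}}
update = {
    "add": {"com.example.fitness": {"gamut": "BT.709"}},
    "remove": ["com.example.mirror"],
}
stored = apply_update(stored, update)
```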
In a specific implementation, the present application further provides a computer storage medium that may store a program; when the program is executed, it may perform some or all of the steps of each embodiment of the color gamut space dynamic adjustment method provided by the present application. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Those skilled in the art will clearly understand that the techniques in the embodiments of the present application may be implemented by way of software plus a required general hardware platform. Based on such understanding, the technical solutions in the embodiments of the present application may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the embodiments or some parts of the embodiments of the present application.
The same and similar parts among the various embodiments in this specification may be referred to one another. In particular, since the display device embodiments are basically similar to the method embodiments, their description is relatively brief, and the relevant points can be found in the description of the method embodiments.
All other embodiments that can be derived by a person skilled in the art from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of one or more exemplary examples, it is to be understood that each aspect of the disclosure can be utilized independently and separately from the other aspects to provide a complete disclosure.
It should be understood that the terms "first," "second," "third," and the like in the description, in the claims of the present application, and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can, for example, be implemented in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. A display device, comprising:
a display configured to display a user interface, the display defaulting to a display gamut space as a first gamut space;
the camera is configured to collect environment image data, and a color gamut space to which an image provided by the camera belongs is a second color gamut space different from the first color gamut space;
a controller in communication with the display, the controller configured to perform presenting a user interface, and,
judging whether the currently started application is related to the camera or not; and if so, converting the display color gamut space of the display into the second color gamut space.
2. The display device of claim 1, wherein the controller is further configured to:
receiving a stop instruction of the camera related application;
and if the stop instruction is received, reducing the display color gamut space into the first color gamut space.
3. The display device of claim 1, wherein the controller is further configured to:
acquiring package name information of a currently started application;
matching preset camera related application package name information according to the package name information;
and if the package name information is the same as the package name information of the related application of the preset camera, determining that the current application program is the related application of the camera.
4. The display device of claim 3, wherein the display device has a first hardware system and a second hardware system built therein, and wherein the controller is further configured to control the first hardware system to perform the following program steps:
acquiring a package name set file from a second hardware system; the package name set file comprises package name information of all camera related applications and a color gamut space to which the provided images belong;
loading the package name set file to monitor the starting state of each application according to the package name set file;
and extracting the package name information of the currently started application from the package name set file.
5. The display device of claim 4, wherein the controller is further configured to control the second hardware system to perform the following program steps:
creating a package name set file according to the currently installed application, or acquiring the package name set file from a cloud server;
storing the package name set file;
and sending the package name set file to the first hardware system through a communication interface.
6. The display device of claim 5, wherein the controller is further configured to control the second hardware system to perform the following program steps:
acquiring package name updating data from a cloud server;
modifying the stored package name set file according to the updating data;
and sending the modified package name set file to the first hardware system through a communication interface.
7. A display device, comprising:
a display configured to display a user interface and ambient image data, the display defaulting to a display gamut space that is a first gamut space;
the camera is configured to collect environment image data, and a color gamut space to which an image provided by the camera belongs is a second color gamut space different from the first color gamut space;
a controller in communication with the display, the controller configured to perform presenting a user interface, and,
judging whether the currently started application is related to the camera or not; and if so, performing color gamut space conversion on the environment image data acquired by the camera.
8. The display device of claim 7, wherein the controller is further configured to:
acquiring package name information of a currently started application;
matching preset camera related application package name information according to the package name information;
and if the package name information is the same as the package name information of the related application of the preset camera, determining that the current application program is the related application of the camera.
9. The display device of claim 8, wherein the display device has a first hardware system and a second hardware system built therein, and wherein the controller is further configured to control the first hardware system to perform the following program steps:
acquiring a package name set file from a second hardware system; the package name set file comprises package name information of all camera related applications and a color gamut space to which the provided images belong;
loading the package name set file to monitor the starting state of each application according to the package name set file;
and extracting the package name information of the currently started application from the package name set file.
10. The display device of claim 9, wherein the controller is further configured to control the second hardware system to perform the following program steps:
creating a package name set file according to the currently installed application, or acquiring the package name set file from a cloud server;
storing the package name set file;
and sending the package name set file to the first hardware system through a communication interface.
11. The display device of claim 10, wherein the controller is further configured to control the second hardware system to perform the following program steps:
acquiring package name updating data from a cloud server;
modifying the stored package name set file according to the updating data;
and sending the modified package name set file to the first hardware system through a communication interface.
12. A method for dynamic adjustment of color gamut space, comprising:
acquiring current starting application information;
judging whether the current started application is related to the camera or not, wherein a color gamut space to which an image provided by the related camera application belongs is a second color gamut space different from a default display color gamut space; and if so, converting the display color gamut space of the display into the second color gamut space.
13. A method for dynamic adjustment of color gamut space, comprising:
acquiring current starting application information and environment image data acquired by a camera;
judging whether the current started application is related to the camera or not, wherein a color gamut space to which an image provided by the related camera application belongs is a second color gamut space different from a default display color gamut space; and if so, performing color gamut space conversion on the environment image data acquired by the camera.
CN201911018447.0A 2019-08-18 2019-10-24 Display device and color gamut space dynamic adjustment method Active CN112399254B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/085007 WO2021031589A1 (en) 2019-08-18 2020-04-16 Display device and dynamic color gamut space adjustment method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910761462 2019-08-18
CN2019107614628 2019-08-18

Publications (2)

Publication Number Publication Date
CN112399254A true CN112399254A (en) 2021-02-23
CN112399254B CN112399254B (en) 2022-06-14

Family

ID=74603699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911018447.0A Active CN112399254B (en) 2019-08-18 2019-10-24 Display device and color gamut space dynamic adjustment method

Country Status (2)

Country Link
CN (1) CN112399254B (en)
WO (1) WO2021031589A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023232084A1 (en) * 2022-06-01 2023-12-07 青岛海信激光显示股份有限公司 Projection device, display method therefor, and storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115119035B (en) * 2021-03-23 2023-08-01 青岛海信商用显示股份有限公司 Display device, image processing method and device
TWI796905B (en) * 2021-12-28 2023-03-21 宏碁股份有限公司 Method for adjusting display parameters and electronic device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110157212A1 (en) * 2009-12-29 2011-06-30 Yanli Zhang Techniques for adapting a color gamut
TW201431383A (en) * 2013-12-19 2014-08-01 Focaltech Systems Ltd Method and apparatus for adjusting the color gamut of a color image
US20140333797A1 (en) * 2013-05-10 2014-11-13 Samsung Display Co., Ltd. Device and method for processing image
CN106782428A (en) * 2016-12-27 2017-05-31 上海天马有机发光显示技术有限公司 A kind of colour gamut method of adjustment and colour gamut adjustment system of display device
CN107786864A (en) * 2017-11-07 2018-03-09 深圳Tcl数字技术有限公司 Television image display methods, equipment and readable storage medium storing program for executing
CN107845363A (en) * 2017-11-23 2018-03-27 维沃移动通信有限公司 A kind of display control method and mobile terminal
CN107925711A (en) * 2015-06-25 2018-04-17 汤姆逊许可公司 Colour posture change Color Gamut Mapping

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101206848B (en) * 2006-12-22 2010-07-14 财团法人工业技术研究院 Multicolor field control display
JP5446474B2 (en) * 2009-05-29 2014-03-19 ソニー株式会社 Information processing apparatus and method, and program
CN107068114B (en) * 2017-04-24 2019-04-30 北京小米移动软件有限公司 Screen color method of adjustment, device, equipment and storage medium
CN109286802A (en) * 2018-10-22 2019-01-29 深圳Tcl新技术有限公司 Color gamut matching method, device, display terminal and readable storage medium storing program for executing

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110157212A1 (en) * 2009-12-29 2011-06-30 Yanli Zhang Techniques for adapting a color gamut
US20140333797A1 (en) * 2013-05-10 2014-11-13 Samsung Display Co., Ltd. Device and method for processing image
TW201431383A (en) * 2013-12-19 2014-08-01 Focaltech Systems Ltd Method and apparatus for adjusting the color gamut of a color image
CN107925711A (en) * 2015-06-25 2018-04-17 汤姆逊许可公司 Colour posture change Color Gamut Mapping
CN106782428A (en) * 2016-12-27 2017-05-31 上海天马有机发光显示技术有限公司 A kind of colour gamut method of adjustment and colour gamut adjustment system of display device
CN107786864A (en) * 2017-11-07 2018-03-09 深圳Tcl数字技术有限公司 Television image display methods, equipment and readable storage medium storing program for executing
CN107845363A (en) * 2017-11-23 2018-03-27 维沃移动通信有限公司 A kind of display control method and mobile terminal

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023232084A1 (en) * 2022-06-01 2023-12-07 青岛海信激光显示股份有限公司 Projection device, display method therefor, and storage medium

Also Published As

Publication number Publication date
CN112399254B (en) 2022-06-14
WO2021031589A1 (en) 2021-02-25

Similar Documents

Publication Publication Date Title
CN112073797B (en) Volume adjusting method and display device
CN112399213B (en) Display device and remote controller key multiplexing method
CN111405338B (en) Intelligent image quality switching method and display device
CN110708581B (en) Display device and method for presenting multimedia screen saver information
CN112399232A (en) Display equipment, camera priority use control method and device
CN112399254B (en) Display device and color gamut space dynamic adjustment method
CN112073774A (en) Image quality processing method and display device
CN112463267A (en) Method for presenting screen saver information on screen of display device and display device
CN112068987A (en) Method and device for rapidly restoring factory settings
CN111385631B (en) Display device, communication method and storage medium
CN112995733B (en) Display device, device discovery method and storage medium
CN112073789A (en) Sound processing method and display device
CN112423042A (en) Upgrading method and system for dual-system Bluetooth remote controller
CN112073812B (en) Application management method on smart television and display device
CN112073776B (en) Voice control method and display device
CN113448529B (en) Display apparatus and volume adjustment method
CN112399071B (en) Control method and device for camera motor and display equipment
CN112073808A (en) Color space switching method and display device
CN112073803A (en) Sound reproduction method and display equipment
CN112073773A (en) Screen interaction method and device and display equipment
CN112073759A (en) Method and device for selecting and scheduling communication modes between two systems and display equipment
CN112399223B (en) Method for improving moire fringe phenomenon and display device
CN112995113B (en) Display device, port control method and storage medium
CN112911353B (en) Display device, port scheduling method and storage medium
CN112073772B (en) Key seamless transmission method based on dual systems and display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant