WO2022057286A1 - Display method and display device - Google Patents

Display method and display device

Info

Publication number
WO2022057286A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
value
camera
color extraction
pixel
Prior art date
Application number
PCT/CN2021/093438
Other languages
English (en)
French (fr)
Inventor
邵肖明
刘东东
刘晋
于江
李保成
司洪龙
Original Assignee
海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202010982475.0A external-priority patent/CN112118468A/zh
Priority claimed from CN202011049293.4A external-priority patent/CN112188098A/zh
Application filed by 海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Publication of WO2022057286A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals

Definitions

  • the present application relates to the technical field of display devices, and in particular, to a display method and a display device.
  • display devices include smart TVs, smart set-top boxes, smart boxes, and products with smart display screens. Taking smart TVs as an example, smart TVs are used in more and more scenarios: not only as a device for watching TV programs in the home, but also for games, electronic photo albums, information display, and the like. At the same time, the interaction capability between smart TVs and peripheral devices has also developed rapidly, mainly reflected in somatosensory games involving human-computer interaction.
  • Embodiments of the present application provide a display method and a display device.
  • the present application provides a display device, including:
  • a display configured to display a user screen
  • a peripheral device configured to present different colors;
  • a controller connected to the display and the peripheral device, the controller configured to:
  • convert the corresponding color feature value into the RGB mean value of the RGB color space;
  • send the RGB mean value of each color extraction area to the corresponding peripheral device, so that the peripheral device presents the color corresponding to the RGB mean value.
  • FIG. 1 exemplarily shows a schematic diagram of an operation scene between a display device and a control apparatus according to some embodiments
  • FIG. 2 exemplarily shows a hardware configuration block diagram of a display device 200 according to some embodiments
  • FIG. 3 exemplarily shows a hardware configuration block diagram of the control device 100 according to some embodiments
  • FIG. 4 exemplarily shows a structural block diagram of a display device according to some embodiments
  • Figure 5 exemplarily shows a schematic diagram of a device list according to some embodiments
  • FIG. 6 exemplarily shows a schematic diagram of setting a plurality of peripheral devices in a display device according to some embodiments
  • FIG. 7 exemplarily shows a flowchart of a method for a peripheral device color to follow a screen color change according to some embodiments
  • Figure 8 exemplarily shows a schematic diagram of a color extraction region according to some embodiments.
  • FIG. 9 exemplarily shows a flowchart of a method for extracting color feature values according to some embodiments.
  • FIG. 10 exemplarily shows a flowchart of another method for extracting color feature values according to some embodiments.
  • FIG. 11 is a schematic diagram of a display device 200 according to an exemplary embodiment of the present application.
  • FIG. 12 is a schematic diagram of a display device 200 according to another exemplary embodiment of the present application.
  • FIG. 13 is a schematic diagram of an implementation scenario shown in the present application according to an exemplary embodiment
  • FIG. 14 is a schematic diagram of an implementation scenario shown in the present application according to an exemplary embodiment
  • FIG. 15 is a schematic diagram of an implementation scenario shown in the present application according to an exemplary embodiment
  • FIG. 16 is a schematic diagram of an implementation scenario shown in the present application according to an exemplary embodiment
  • FIG. 17 is a flowchart of a method for controlling a breathing light of a camera according to an exemplary embodiment of the present application.
  • a module refers to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of performing the functions associated with that element.
  • FIG. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in FIG. 1 , a user can operate the display device 200 through the smart device 300 or the control device 100 .
  • the control apparatus 100 may be a remote controller; communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-distance communication methods, and the display device 200 is controlled wirelessly or by wire.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
  • a smart device 300 (e.g., a mobile terminal, a tablet computer, a computer, a notebook computer, etc.) may also be used.
  • the display device 200 is controlled using an application running on the smart device.
  • the display device 200 may also be controlled in a manner other than the control apparatus 100 and the smart device 300.
  • the module for acquiring voice commands configured inside the display device 200 may directly receive the user's voice command for control.
  • the user's voice command control can also be received through a voice control device provided outside the display device 200.
  • the display device 200 is also in data communication with the server 400 .
  • the display device 200 may communicate via a local area network (LAN), a wireless local area network (WLAN), or other networks.
  • the server 400 may provide various contents and interactions to the display device 200 .
  • FIG. 3 exemplarily shows a configuration block diagram of the control apparatus 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110 , a communication interface 130 , a user input/output interface 140 , a memory, and a power supply.
  • the control device 100 can receive the user's input operation instruction, and convert the operation instruction into an instruction that the display device 200 can recognize and respond to, and play an intermediary role between the user and the display device 200 .
  • FIG. 2 is a block diagram showing the hardware configuration of the display apparatus 200 according to an exemplary embodiment.
  • the display apparatus 200 includes at least one of a tuner-demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
  • the display 260 includes a display screen component for presenting pictures and a driving component for driving image display, and is configured to receive image signals output from the controller and to display video content, image content, menu manipulation interfaces, and user manipulation UI interfaces.
  • the display 260 may be a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
  • the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types.
  • the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
  • the display device 200 may establish transmission and reception of control signals and data signals with the external control device 100 or the server 400 through the communicator 220 .
  • the user interface can be used to receive control signals from the control device 100 (eg, an infrared remote control, etc.).
  • the detector 230 is used to collect external environment or external interaction signals.
  • the detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which can be used to collect external environmental scenes, user attributes or user interaction gestures, or , the detector 230 includes a sound collector, such as a microphone, for receiving external sound.
  • the external device interface 240 may include, but is not limited to, one or more of the following: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface (component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, etc. It may also be a composite input/output interface formed by a plurality of the above-mentioned interfaces.
  • the controller 250 and the tuner-demodulator 210 may be located in separate devices; that is, the tuner-demodulator 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box.
  • the controller 250 controls the operation of the display device and responds to the user's operation through various software control programs stored in the memory.
  • the controller 250 controls the overall operation of the display apparatus 200 . For example, in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
  • Objects can be any of the optional objects, such as hyperlinks, icons, or other actionable controls.
  • the operations related to the selected object include: displaying operations connected to hyperlinked pages, documents, images, etc., or executing operations of programs corresponding to the icons.
  • the user may input user commands on a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the user input commands through the GUI.
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
  • a control in a graphical user interface (GUI) can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • when a display device is used as a display terminal to interact with a peripheral device, the peripheral device only acts as an input device.
  • the interactive content between the display device and the peripheral device is presented only on the display device, so the content of the display device is relatively closed: the display device does not share information with the peripheral device, and the peripheral device does not produce corresponding changes during the interaction. This leads to a relatively simple content presentation form in the interactive scene and a poor user experience.
  • an embodiment of the present application provides a display device that can be used in entertainment interaction scenarios such as playing TV pictures, games, and singing, in which the display color of the peripheral device is controlled to change with the color change of the user screen.
  • the peripheral device that interacts with the display device can select a device capable of emitting light and changing color, for example, a light bulb with a variable color, an LED light bar, and the like.
  • the display color of the peripheral device changes with the color of the user screen of the display device, which can enhance the atmosphere of viewing the display content of the display device, increase the presentation form of the display content, and improve the user experience of using the display device.
  • FIG. 4 exemplarily shows a structural block diagram of a display device according to some embodiments.
  • an embodiment of the present invention provides a display device 200 , referring to FIG. 4 , which includes a display 275 , a peripheral device 201 and a controller 250 .
  • the display 275 is configured to display a user screen;
  • the peripheral device 201 is a device that can emit light and change color, and is configured to present different colors;
  • the controller 250 is connected with the display 275 and the peripheral device 201, and is configured to synchronize the color of the user screen displayed on the display 275 to the peripheral device 201 in real time, so that the display color of the peripheral device 201 is updated synchronously with the color of the user screen.
  • In order to present the color of the user screen more accurately through the peripheral devices and improve the atmosphere-setting effect, in some embodiments there may be multiple peripheral devices interacting with the display device at the same time. For example, four peripheral devices can be set up, each connected to the display device by a wired or wireless connection; it is sufficient to ensure that they are in the same local area network.
  • When using the wireless connection method, the display device is connected to the router, and the router assigns a LAN IP address to the display device; each peripheral device is then connected to the router, and the router assigns a LAN IP address to the peripheral device.
  • the connection between the display device and each peripheral device in the same network segment can be realized through the router, and the display device can obtain all the peripheral devices connected to the local area network through scanning.
  • FIG. 5 exemplarily shows a schematic diagram of a device list according to some embodiments.
  • when the display device scans, it sets all IP segments to be scanned in its own local area network, scans through network commands, obtains the IP address of each peripheral device, and displays each scanned peripheral device in the device list. If four peripheral devices are scanned, the names of the four peripheral devices are displayed in the device list.
  • the name of each peripheral device can also be modified, and other devices that are not related to color rendering can be deleted from the device list.
  • FIG. 6 exemplarily shows a schematic diagram of setting a plurality of peripheral devices in a display device according to some embodiments.
  • a plurality of peripheral devices can be arranged around the display of the display device.
  • the first peripheral device 201a is arranged on the upper side of the display
  • the second peripheral device 201b is arranged on the lower side of the display.
  • the third peripheral device 201c is arranged on the left side of the display
  • the fourth peripheral device 201d is arranged on the right side of the display.
  • the four peripheral devices arranged around the display 275 can respectively present the colors presented at the corresponding positions of the user screen; that is, the first peripheral device 201a presents the color at the upper end of the user screen, the second peripheral device 201b presents the color at the lower end of the user screen, the third peripheral device 201c presents the color at the left end of the user screen, and the fourth peripheral device 201d presents the color at the right end of the user screen.
  • Each peripheral device corresponds to a corresponding position of the user screen in the display, so that the display color of each peripheral device 201 can be updated synchronously with the color of the corresponding position of the user screen, thereby improving the effect of the display color of the peripheral device.
  • FIG. 7 exemplarily shows a flowchart of a method for a peripheral device color to follow a screen color change according to some embodiments.
  • the controller 250 is configured to perform the following steps when executing the method for the peripheral device color to follow the screen color change:
  • the display color of each peripheral device only follows the color change of the corresponding position of the user screen. Therefore, in order to accurately determine the color of the position of the user screen that each peripheral device should present, the user screen can be divided into multiple color extraction areas, where each color extraction area provides the color presented at the corresponding position of the user screen.
  • the controller is further configured to perform the following steps when performing dividing the user screen into a plurality of color extraction areas:
  • Step 11: Acquire the user picture presented by the display.
  • Step 12: Divide the user screen into regions according to a preset division rule to obtain a plurality of color extraction regions, where the color extraction regions do not overlap.
  • the controller acquires the user picture presented by the display, and divides the user picture into multiple color extraction areas according to preset division rules.
  • the preset division rule may be to divide the user screen into multiple non-overlapping color extraction areas, where the total area of the color extraction areas is less than or equal to the complete area of the user picture.
  • FIG. 8 exemplarily shows a schematic diagram of a color extraction region according to some embodiments.
  • the positions of the four color extraction areas can be located at the upper, lower, left, and right positions of the user screen respectively: the first color extraction area A1 is located on the upper side of the user screen, the second color extraction area A2 on the lower side, the third color extraction area A3 on the left side, and the fourth color extraction area A4 on the right side.
  • the number of color extraction areas is the same as the number of peripheral devices, so that the color extraction areas and the peripheral devices are in a one-to-one binding relationship. One peripheral device thus receives the color of one color extraction area and can display the color of the corresponding position of the user screen without confusion, so that different peripheral devices display different colors.
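The division and one-to-one binding described above can be sketched as follows. This is a minimal Python sketch; the band thickness `margin`, the region names, and the exact layout are illustrative assumptions, not values given in the application:

```python
def divide_color_extraction_areas(width, height, margin=0.25):
    """Divide a user screen of width x height pixels into four
    non-overlapping edge regions (top, bottom, left, right).
    Each region is (x1, y1, x2, y2): upper-left and lower-right
    vertices, origin at the upper-left corner of the screen,
    X to the right, Y downward. `margin` (band thickness as a
    fraction of the screen size) is an illustrative assumption."""
    h_band = int(height * margin)  # thickness of top/bottom bands
    w_band = int(width * margin)   # thickness of left/right bands
    return {
        "A1_top":    (w_band, 0, width - w_band, h_band),
        "A2_bottom": (w_band, height - h_band, width - w_band, height),
        "A3_left":   (0, 0, w_band, height),
        "A4_right":  (width - w_band, 0, width, height),
    }

# One-to-one binding between color extraction areas and peripherals:
bindings = {"A1_top": "201a", "A2_bottom": "201b",
            "A3_left": "201c", "A4_right": "201d"}
```

For a 1920x1080 screen this yields, for example, a top region of (480, 0, 1440, 270); the regions do not overlap and their combined area is less than the full screen, matching the preset division rule.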
  • When establishing the one-to-one binding relationship between a peripheral device and a color extraction area, the corresponding peripheral device is made to flash by triggering the target peripheral device name in the device list, so as to accurately determine which device is to be bound. Then, based on the user's personalized settings, a binding relationship can be established between the flashing peripheral device and one of the color extraction areas. To keep the color presented by the peripheral device consistent with the user screen, the binding relationship may be established according to the rule of matching positions.
  • the setting positions of the peripheral devices correspond to the positions of the color extraction areas one-to-one.
  • the first peripheral device 201a located on the upper side of the display corresponds to the first color extraction area A1 located on the upper side of the user screen;
  • the second peripheral device 201b disposed on the lower side of the display corresponds to the second color extraction area A2 located on the lower side of the user screen;
  • the third peripheral device 201c disposed on the left side of the display corresponds to the third color extraction area A3 located on the left side of the user screen;
  • the fourth peripheral device 201d disposed on the right side of the display corresponds to the fourth color extraction area A4 located on the right side of the user screen.
  • the divided color extraction areas are used to provide displayed colors for corresponding peripheral devices. Therefore, color extraction can be performed on the picture content presented in each color extraction area to determine the color feature value corresponding to each color extraction area.
  • the color feature value can represent the color standard value of the corresponding color extraction area.
  • When determining the color feature value, it can be extracted directly from each color extraction area; alternatively, the user screen currently displayed can first be captured (that is, a screenshot is taken), and the color feature value of each color extraction area is then extracted from the screenshot.
  • FIG. 9 exemplarily shows a flowchart of a method for extracting color feature values according to some embodiments.
  • the controller is further configured to perform the following steps when performing acquiring the color feature value of the picture content presented by each color extraction area:
  • a color histogram is a color feature that describes the proportion of different colors in the entire image.
  • the combined value of red pixels can be 64RColor, which refers to the combined value of the red pixel values presented by all colors in the specified picture content corresponding to the specified color extraction area;
  • the combined value of green pixels can be 64GColor, which refers to the combined value of the green pixel values presented by all colors in the specified picture content corresponding to the specified color extraction area;
  • the combined value of blue pixels can be 64BColor, which refers to the combined value of the blue pixel values presented by all colors in the specified picture content corresponding to the specified color extraction area.
  • Each color extraction area corresponds to a color feature value.
  • the first color extraction area A1 corresponds to color feature value C1 (64R1Color, 64G1Color, 64B1Color);
  • the second color extraction area A2 corresponds to color feature value C2 (64R2Color, 64G2Color, 64B2Color);
  • the third color extraction area A3 corresponds to color feature value C3 (64R3Color, 64G3Color, 64B3Color);
  • the fourth color extraction area A4 corresponds to color feature value C4 (64R4Color, 64G4Color, 64B4Color).
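As an illustration of the combined values above, the sketch below sums the raw R, G, and B channel values over one region of a pixel array. This is one plausible reading of the "combined value"; the application derives it from the region's color histogram, and the 64RColor naming is not reproduced here:

```python
def color_feature_value(pixels, region):
    """Color feature value of one color extraction region, as the
    per-channel combined values (r_sum, g_sum, b_sum).
    `pixels` is a row-major list of rows of (R, G, B) tuples;
    `region` is (x1, y1, x2, y2) with exclusive lower-right bounds."""
    x1, y1, x2, y2 = region
    r_sum = g_sum = b_sum = 0
    for y in range(y1, y2):
        for x in range(x1, x2):
            r, g, b = pixels[y][x]
            r_sum += r
            g_sum += g
            b_sum += b
    return (r_sum, g_sum, b_sum)
```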
  • FIG. 10 exemplarily shows a flowchart of another method for extracting color feature values according to some embodiments.
  • the controller is further configured to perform the following steps when executing the acquisition of the color feature value of the picture content presented by each color extraction area:
  • a screenshot of the picture presented by the user screen at a certain moment may be taken first to obtain a complete picture. Then, the color histogram of each part of the complete picture is extracted separately.
  • each part of the complete picture can be determined based on the division rules used for the color extraction areas; that is, the complete picture is divided into multiple partial pictures, each color extraction area corresponds to one partial picture, the partial pictures do not overlap each other, and the sum of the areas of the partial pictures is less than or equal to the total area of the complete picture.
  • the partial pictures obtained by the division do not overlap each other, which prevents the colors of the partial pictures from affecting one another and allows the representative color of each partial picture to be extracted accurately and presented on the peripheral device. For example, if partial picture B1 and partial picture B2 overlap, and B1 appears red as a whole while B2 appears yellow as a whole, then after overlapping, the overall color of B1 would appear reddish-yellow and the overall color of B2 yellowish-red, so the finally displayed color would not be the original color of the corresponding partial picture.
  • the number of partial pictures into which the complete picture is divided is the same as the number of color extraction areas, and their positions are the same.
  • the complete picture is accordingly divided into four partial pictures, located at the top, bottom, left, and right of the complete picture.
  • each color extraction area corresponds to a partial picture content, so that the color presented by the specified partial picture content is used as the color of the specified color extraction area.
  • the color histogram of the part of the screen content can be obtained.
  • the color feature values are extracted from the color histogram corresponding to part of the picture content, including the combined value of red pixels, the combined value of green pixels and the combined value of blue pixels.
  • the combined value of red pixels can be 64RColor, which refers to the combined value of the red pixel values presented by all colors in the specified picture content corresponding to the specified color extraction area;
  • the combined value of green pixels can be 64GColor, which refers to the combined value of the green pixel values presented by all colors in the specified picture content corresponding to the specified color extraction area;
  • the combined value of blue pixels can be 64BColor, which refers to the combined value of the blue pixel values presented by all colors in the specified picture content corresponding to the specified color extraction area.
  • the color feature value C1 (64R1Color, 64G1Color, 64B1Color) of partial picture content B1 is the color feature value corresponding to the first color extraction area A1;
  • the color feature value C2 (64R2Color, 64G2Color, 64B2Color) of partial picture content B2 is the color feature value corresponding to the second color extraction area A2;
  • the color feature value C3 (64R3Color, 64G3Color, 64B3Color) of partial picture content B3 is the color feature value corresponding to the third color extraction area A3;
  • the color feature value C4 (64R4Color, 64G4Color, 64B4Color) of partial picture content B4 is the color feature value corresponding to the fourth color extraction area A4.
  • After determining the color feature value (red pixel combined value, green pixel combined value, and blue pixel combined value) of each color extraction area, the color feature value can be converted into the RGB mean value of the RGB color space.
  • the controller is further configured to perform the following steps when performing the conversion of the corresponding color feature value into the RGB mean value of the RGB color space based on the pixel area of each color extraction region:
  • Step 31: Calculate the pixel area of each color extraction area.
  • Step 32: Divide the red pixel combined value by the pixel area to obtain the red pixel mean, divide the green pixel combined value by the pixel area to obtain the green pixel mean, and divide the blue pixel combined value by the pixel area to obtain the blue pixel mean.
  • the red pixel mean, the green pixel mean, and the blue pixel mean are taken as the RGB mean value of the RGB color space.
  • the color extraction area can be a rectangle, a circle, or another shape.
  • Taking a rectangle as an example, the pixel coordinate values of two diagonal vertices of the specified color extraction area can be obtained to determine its pixel area.
  • a coordinate system is established based on the user screen. The origin of the coordinates is the upper left corner of the user screen. The rightward direction of the user screen is the positive X-axis, and the downward direction of the user screen is the positive Y-axis.
  • the two vertices are selected at the upper left point and the lower right point of the rectangle.
  • obtain the pixel coordinate values (X1, Y1) of the upper-left point P1 and (X2, Y2) of the lower-right point P2 of the rectangle corresponding to the specified color extraction area, and then determine the pixel area of the specified color extraction area as S = (X2 - X1) * (Y2 - Y1).
  • the pixel area of each color extraction area is calculated in this way, and the pixel areas of the four color extraction areas are determined as S1, S2, S3, and S4 respectively.
  • the quotient of each color extraction area's color feature values (red pixel combined value, green pixel combined value, and blue pixel combined value) and the corresponding pixel area gives the RGB mean value of that color extraction area.
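Steps 31 and 32 can be sketched as follows, with the rectangle given by its upper-left vertex (X1, Y1) and lower-right vertex (X2, Y2); using integer division for the channel means is an illustrative choice:

```python
def rgb_mean(feature_value, region):
    """Convert a region's color feature value into the RGB mean of
    the RGB color space: pixel area S = (X2 - X1) * (Y2 - Y1), then
    each channel mean is the quotient of that channel's combined
    value and S."""
    x1, y1, x2, y2 = region
    s = (x2 - x1) * (y2 - y1)  # pixel area of the region
    r_sum, g_sum, b_sum = feature_value
    return (r_sum // s, g_sum // s, b_sum // s)
```

For a 2x2 region whose combined values are (510, 510, 510), the RGB mean is (127, 127, 127).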
  • the RGB mean value of each color extraction area is sent to the corresponding peripheral device, and the peripheral device presents the color corresponding to the RGB mean value.
  • After the RGB mean value of each color extraction area is determined, the RGB mean value can be sent to the corresponding peripheral device for display. There is a one-to-one binding relationship between each color extraction area and its peripheral device, so the controller can send the RGB mean value of each color extraction area to the corresponding peripheral device based on that binding relationship.
  • the RGB mean value Color.RGB(red1, green1, blue1) corresponding to the first color extraction area A1 is sent to the first peripheral device 201a, and the first peripheral device 201a presents the color corresponding to that RGB mean value.
  • When the controller sends the RGB mean value to the corresponding peripheral device, it can first package the RGB mean value into a network packet and send it.
  • the peripheral device receives the network packet, parses it to obtain the corresponding color value, and displays that color, thereby changing the display color of the peripheral device.
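The packaging and parsing steps can be sketched with a hypothetical 4-byte packet (one device-index byte followed by the R, G, and B bytes); the application does not specify the wire format, so this layout is purely an assumption:

```python
import struct

def pack_rgb_packet(device_index, rgb):
    """Controller side: package an RGB mean value into a network
    packet. The layout (index, R, G, B as unsigned bytes) is an
    assumed example format."""
    r, g, b = rgb
    return struct.pack("BBBB", device_index, r, g, b)

def parse_rgb_packet(packet):
    """Peripheral side: parse the packet and recover the color
    value to display."""
    device_index, r, g, b = struct.unpack("BBBB", packet)
    return device_index, (r, g, b)
```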
  • the controller obtains the color feature value of each specified color extraction area every 300 milliseconds and converts it into an RGB mean value, and the peripheral device then displays the color corresponding to that RGB mean value. In this way, the colors displayed by the user screen and the peripheral device present a gradient effect and remain synchronized, so that the display color of the peripheral device changes along with the user screen.
• To sum up, the controller divides the user screen into multiple color extraction areas, establishes a binding relationship between each peripheral device and each color extraction area, and obtains the color feature value of the picture content presented in each color extraction area; based on the pixel area of each color extraction area, the corresponding color feature value is converted into the RGB mean value in the RGB color space; the RGB mean value of each color extraction area is then sent to the corresponding peripheral device, which presents the color corresponding to that RGB mean value.
• In this way, the display device acquires the colors of the user screen and sends them to the peripheral devices for display, so that the display color of each peripheral device changes with the color of the user screen; this enriches the presentation form of the displayed content and improves the user's experience of using the display device.
  • FIG. 7 exemplarily shows a flowchart of a method for a peripheral device color to follow a screen color change according to some embodiments.
• A method for making a peripheral device's color follow the screen color change, provided by an embodiment of the present invention, is executed by the controller configured in the display device provided by the foregoing embodiments, and the method includes:
• In some embodiments, dividing the user picture into multiple color extraction areas includes: acquiring the user picture presented by the display; and dividing the user picture into multiple color extraction areas, where the color extraction areas do not overlap one another.
• In some embodiments, acquiring the color feature value of the picture content presented by each color extraction area includes: acquiring a color histogram of the picture content presented by each color extraction area; and extracting, from each color histogram, the corresponding red pixel composite value, green pixel composite value and blue pixel composite value as the color feature value of that color extraction area.
• In some embodiments, acquiring the color feature value of the picture content presented in each color extraction area includes: capturing the complete picture presented in the user picture; dividing the complete picture and determining the partial picture content corresponding to each color extraction area; acquiring the color histogram of the partial picture content of each color extraction area; and extracting, from each color histogram, the corresponding red pixel composite value, green pixel composite value and blue pixel composite value as the color feature value of that color extraction area.
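The histogram-based extraction described above can be sketched as follows; the pixel representation (a list of (r, g, b) tuples per area) is an assumption for illustration:

```python
def channel_composites(pixels):
    """Compute the per-channel composite (summed) values for one color
    extraction area, by building a 256-bin histogram per channel and
    weighting each bin by its channel value."""
    hist_r = [0] * 256
    hist_g = [0] * 256
    hist_b = [0] * 256
    for r, g, b in pixels:
        hist_r[r] += 1
        hist_g[g] += 1
        hist_b[b] += 1
    red_sum = sum(value * count for value, count in enumerate(hist_r))
    green_sum = sum(value * count for value, count in enumerate(hist_g))
    blue_sum = sum(value * count for value, count in enumerate(hist_b))
    return red_sum, green_sum, blue_sum

print(channel_composites([(10, 20, 30), (30, 20, 10)]))  # (40, 40, 40)
```

Dividing these composite values by the pixel count of the area then yields the RGB mean described earlier.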
• In some embodiments, the color feature values include a red pixel composite value, a green pixel composite value, and a blue pixel composite value; converting the corresponding color feature value into the RGB mean value of the RGB color space based on the pixel area of each color extraction area includes: calculating the pixel area of each color extraction area; calculating the red pixel mean value from the red pixel composite value and the pixel area, the green pixel mean value from the green pixel composite value and the pixel area, and the blue pixel mean value from the blue pixel composite value and the pixel area; and using the red pixel mean value, green pixel mean value and blue pixel mean value as the RGB mean value of the RGB color space.
  • Embodiments of the present application also provide a display device, including a camera, a camera breathing light, and a controller;
• the controller, connected to the camera breathing light, is configured to obtain breathing light control parameters, where the breathing light control parameters correspond to the lighting effect required by an application when using the camera, and to control the camera breathing light according to the breathing light control parameters;
  • the camera breathing light is used to present the lighting effect required by the application.
• An embodiment of the present application further provides a display device, further comprising a breathing light control module, where the controller is connected to the breathing light control module; controlling the camera breathing light according to the breathing light control parameters includes:
  • the controller generates a breathing lamp control instruction according to the breathing lamp control parameter; sends the breathing lamp control instruction to the breathing lamp control module;
  • the breathing light control module is connected to the camera breathing light, and is used for controlling the camera breathing light according to the received breathing light control instruction.
• An embodiment of the present application further provides a display device, wherein the controller includes a camera control service, and the camera control service provides an interface for applications to pass in breathing light control parameters; acquiring the breathing light control parameters includes: the camera control service receiving, via the interface, the breathing light control parameters passed in by an application; when the camera control service receives the breathing light control parameters, it generates a breathing light control instruction according to the breathing light control parameters and sends the breathing light control instruction to the breathing light control module.
• An embodiment of the present application further provides a display device, wherein the breathing light control parameter is a parameter used to instruct the camera breathing light to stay always on, a parameter used to instruct the camera breathing light to flash at a given frequency, or a parameter used to instruct the camera breathing light to turn off.
  • Embodiments of the present application also provide a display device, including a camera, a camera breathing light, and a controller;
• the controller, which is respectively connected to the camera breathing light and the camera, is configured to control the camera breathing light to turn on while starting the camera in response to a camera start instruction, and to control the camera breathing light to turn off when the on-time duration of the camera breathing light reaches a preset duration.
• An embodiment of the present application further provides a display device, further comprising a breathing light control module, where the controller is connected to the breathing light control module; the controller includes a camera control service, and the camera control service provides an interface for applications to pass in breathing light control parameters;
  • the controller controls the camera breathing light to turn on while starting the camera in response to the camera startup instruction, including:
• the camera control service receives, via the interface, a first control parameter passed in by the application that instructs to start the camera; generates a breathing light turn-on instruction according to the first control parameter; and sends the breathing light turn-on instruction to the breathing light control module;
• the breathing light control module is configured to control the camera breathing light to turn on in response to receiving the breathing light turn-on instruction;
  • the controller controls the camera breathing light to turn off when the on-time duration of the camera breathing light reaches a preset duration, including:
• the camera control service receives, via the interface, a second control parameter passed in by the application that instructs to start the camera when the on-time duration of the camera breathing light reaches the preset duration; generates a breathing light turn-off instruction according to the second control parameter; and sends the breathing light turn-off instruction to the breathing light control module;
  • the breathing light control module is further configured to control the breathing light of the camera to turn off in response to receiving the command to turn off the breathing light.
  • Embodiments of the present application also provide a display device, including a camera, a camera breathing light, and a controller;
  • the camera is used to capture images or record videos
  • the controller connected with the camera breathing light, is used for:
• in response to a user input triggering an image capture instruction, control the camera to capture an image while controlling the camera breathing light to flash at a first specific frequency; in response to a user input triggering a video recording start instruction, control the camera to record video while controlling the camera breathing light to flash at a second specific frequency; and in response to a user input triggering a video recording end instruction, control the camera breathing light to turn off while controlling the camera to end recording the video.
• An embodiment of the present application further provides a display device, further comprising a breathing light control module, where the controller is connected to the breathing light control module; the controller includes a camera control service, and the camera control service provides an interface for applications to pass in breathing light control parameters;
  • the controller controls the camera to capture images while controlling the camera breathing light to flash at a first specific frequency, including:
• the camera control service receives a first frequency parameter, where the first frequency parameter is passed in via the interface when the application receives a user input that triggers an image capture instruction; generates a breathing light flashing instruction according to the first frequency parameter; and sends the breathing light flashing instruction to the breathing light control module;
  • the breathing light control module is configured to control the breathing light of the camera to flash at a first specific frequency according to the breathing light flashing instruction.
  • An embodiment of the present application further provides a display device, in which the controller, in response to a user input triggering a video recording start instruction, controls the camera to record video while controlling the camera breathing light to flash at a second specific frequency, including:
  • the camera control service receives a second frequency parameter, and the second frequency parameter is passed in according to the interface when the application receives a user input that triggers a video recording instruction; generates a breathing light flashing instruction according to the second frequency parameter, and sending the breathing light flashing instruction to the breathing light control module;
  • the breathing light control module is configured to control the breathing light of the camera to flash at a second specific frequency according to the breathing light flashing instruction.
  • Embodiments of the present application also provide a display device, including a camera, a camera breathing light, and a controller;
  • the controller connected with the camera breathing light, is configured to control the camera breathing light to flash at a specific frequency in response to receiving a call request.
  • Embodiments of the present application further provide a method for controlling a display device, which is applied to the display device, where the display device includes a camera and a camera breathing light, and the method includes:
• obtaining breathing light control parameters, where the breathing light control parameters correspond to the lighting effect required by the application when using the camera;
  • the camera breathing light is controlled according to the breathing light control parameter, so that the camera breathing light exhibits the lighting effect required by the application.
  • Embodiments of the present application further provide a method for controlling a display device, which is applied to the display device, where the display device includes a camera and a camera breathing light, and the method includes:
• in response to a camera start instruction, controlling the camera breathing light to turn on while starting the camera; and when the on-time duration of the camera breathing light reaches a preset duration, controlling the camera breathing light to turn off.
  • Embodiments of the present application further provide a method for controlling a display device, which is applied to the display device, where the display device includes a camera and a camera breathing light, and the method includes:
• in response to a user input triggering an image capture instruction, controlling the camera breathing light to flash at a first specific frequency while controlling the camera to capture an image; in response to a user input triggering a video recording start instruction, controlling the camera breathing light to flash at a second specific frequency while controlling the camera to record video; and in response to a user input triggering a video recording end instruction, controlling the camera breathing light to turn off while controlling the camera to end the video recording.
  • Embodiments of the present application further provide a method for controlling a display device, which is applied to the display device, where the display device includes a camera and a camera breathing light, and the method includes:
• in response to receiving a call request, controlling the camera breathing light to flash at a specific frequency; and in response to a user input accepting or rejecting the call request, controlling the camera breathing light to stop flashing.
  • FIG. 11 is a schematic diagram of the appearance of a display device according to an exemplary embodiment of the present application.
• As shown in FIG. 11, the display device includes a camera 231 and a camera breathing light 2311, and the camera breathing light can present different lighting effects under the control of the display device controller to indicate the working status of the camera. For example, when the camera breathing light is continuously driven at a high level, it presents an always-on lighting effect; when it is driven at a low level, it presents an always-off lighting effect; and when high and low levels are input alternately, it presents a flashing lighting effect.
• FIG. 12 is a schematic diagram of the appearance of a display device according to another exemplary embodiment of the present application. Different from the display device shown in FIG. 11, this display device includes a camera 231 and a plurality of camera breathing lights 2311. It should be understood that the multiple camera breathing lights can be controlled by the display device controller either synchronously or individually.
  • the camera may be a built-in camera in the display device, or may be an external camera connected to the display device controller through an external device interface.
  • the camera breathing light may be a built-in breathing light with the display device, or an external breathing light connected with the display device through an external device interface.
• In some embodiments, in order to enable a camera application to present the lighting effects it requires when the camera is started, used and closed, the controller 250 is configured to: obtain breathing light control parameters, where the breathing light control parameters correspond to the lighting effect required by the camera application when using the camera; and control the camera breathing light according to the breathing light control parameters, so that the camera breathing light presents the lighting effect required by the camera application.
  • the camera application refers to an application that can use a camera, such as a "magic mirror” application, a “hime” application, a “K song” application, and so on.
• For example, the breathing light control parameter may be a parameter used to instruct the camera breathing light to stay always on, for example, a pre-defined value of "255"; a parameter used to instruct the camera breathing light to flash at a given frequency, for example, pre-defined values of "1" to "5" instructing the camera breathing light to flash at 1 to 5 times per second; or a parameter used to instruct the camera breathing light to turn off, for example, a pre-defined value of "0".
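The pre-defined parameter values above can be sketched as a simple decoding function (an illustrative assumption, not the embodiments' implementation):

```python
def decode_breathing_light_param(param):
    """Map a breathing light control parameter to a lighting effect,
    following the pre-defined values: 0 = off, 1..5 = flash at that
    many times per second, 255 = always on."""
    if param == 0:
        return ("off", None)
    if param == 255:
        return ("on", None)
    if 1 <= param <= 5:
        return ("flash", param)  # flashes per second
    raise ValueError(f"unknown breathing light parameter: {param}")

print(decode_breathing_light_param(255))  # ('on', None)
print(decode_breathing_light_param(3))    # ('flash', 3)
```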
  • the controller 250 is directly electrically connected to the breathing light of the camera, so that the controller 250 can control the high/low level input of the breathing light of the camera according to the acquired breathing light control parameters, so that the breathing light of the camera presents The lighting effects corresponding to the breathing light control parameters are displayed.
  • the camera has a separate single-chip microcomputer, the controller 250 is connected to the single-chip microcomputer, and the single-chip microcomputer is connected to the breathing light of the camera, so that the control of the breathing light of the camera is completed through the single-chip microcomputer.
  • the microcontroller for directly controlling the breathing light of the camera is also called a breathing light control module.
• In some embodiments, after acquiring the breathing light control parameters, the controller 250 generates a breathing light control instruction according to the breathing light control parameters and sends the breathing light control instruction to the breathing light control module; the breathing light control module then controls the camera breathing light according to the received breathing light control instruction by driving the high/low level input of the camera breathing light.
• In some embodiments, the camera control service in the framework layer provides an interface for applications to pass in breathing light control parameters; when an application needs to control the camera breathing light, it can pass in the breathing light control parameters corresponding to the desired lighting effect by calling this interface. When the interface is called, the camera control service generates a breathing light control instruction according to the incoming breathing light control parameters and sends the instruction to the breathing light control module. In this way, any camera application can control the camera breathing light to present the lighting effect it desires.
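The cooperation between the camera control service and the breathing light control module might be sketched as follows; class and method names are illustrative assumptions:

```python
class BreathingLightControlModule:
    """Stand-in for the microcontroller that drives the breathing
    light's high/low level input; the level is simulated as a string."""
    def __init__(self):
        self.level = "low"  # low level = light off

    def execute(self, instruction):
        self.level = {"on": "high", "off": "low", "flash": "toggling"}[instruction]

class CameraControlService:
    """Framework-layer service exposing the parameter-passing
    interface that any camera application can call."""
    def __init__(self, module):
        self.module = module

    def set_breathing_light(self, param):
        # Generate a control instruction from the incoming parameter
        if param == 0:
            instruction = "off"
        elif param == 255:
            instruction = "on"
        else:
            instruction = "flash"
        self.module.execute(instruction)

service = CameraControlService(BreathingLightControlModule())
service.set_breathing_light(255)  # an application requests always-on
print(service.module.level)       # high
```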
• In some embodiments, the controller 250 is configured to: in response to a camera start instruction, control the camera breathing light to turn on when the camera is started, and when the camera breathing light has been on for a preset duration (for example, 2 seconds), control the camera breathing light to turn off.
• Specifically, upon receiving a user input that triggers the camera start instruction, the camera application can start the camera by calling the CameraService service in the framework layer, and at the same time pass in the first control parameter via the above-mentioned interface provided by the camera control service; after receiving the first control parameter passed in by the camera application, the camera control service generates a breathing light turn-on instruction according to the first control parameter and sends it to the breathing light control module; the breathing light control module controls the camera breathing light to turn on in response to receiving the breathing light turn-on instruction.
• When the on-time duration of the camera breathing light reaches the preset duration, the camera application passes in the second control parameter via the above-mentioned interface provided by the camera control service; after receiving the second control parameter passed in by the camera application, the camera control service generates a breathing light turn-off instruction according to the second control parameter and sends it to the breathing light control module; the breathing light control module controls the camera breathing light to turn off in response to receiving the breathing light turn-off instruction.
  • the first control parameter may be "255" mentioned in the foregoing embodiment
  • the second control parameter may be "0" mentioned in the foregoing embodiment.
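The start-up sequence above can be sketched as follows; time is advanced manually so the logic is deterministic, which is an illustrative simplification rather than the embodiments' timer mechanism:

```python
class CameraStartLightSequence:
    """Simulates the start-up sequence: the breathing light turns on
    together with the camera and turns off once its on-time reaches a
    preset duration (2 seconds here)."""
    PRESET_DURATION = 2.0  # seconds

    def __init__(self):
        self.light_on = False
        self.on_time = 0.0

    def start_camera(self):
        # corresponds to the first control parameter: turn the light on
        self.light_on = True
        self.on_time = 0.0

    def tick(self, dt):
        # advance simulated time; corresponds to the second control
        # parameter being passed in once the preset duration elapses
        if self.light_on:
            self.on_time += dt
            if self.on_time >= self.PRESET_DURATION:
                self.light_on = False

seq = CameraStartLightSequence()
seq.start_camera()
seq.tick(1.0); print(seq.light_on)  # True  (still within 2 s)
seq.tick(1.0); print(seq.light_on)  # False (preset duration reached)
```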
  • FIG. 13 is a schematic diagram of a camera breathing light control scenario shown in the present application according to an exemplary embodiment.
• As shown in FIG. 13, the user can input an instruction to start the "Magic Mirror" application in the application center in any form such as voice, key press, or gesture, and the display device controller starts the "Magic Mirror" application in response to the instruction input by the user. After the "Magic Mirror" application is started successfully, it starts the camera by calling the CameraService service in the framework layer, and at the same time passes in the breathing light control parameters via the above-mentioned interface provided by the camera control service, so that the camera control service cooperates with the breathing light control module to turn on the camera breathing light according to the incoming breathing light control parameters, and to turn the camera breathing light off after it has been on for the preset duration. The user thus sees that when the "Magic Mirror" application starts the camera, the camera breathing light lights up continuously for a preset period of time.
• In some embodiments, the controller 250 is configured to: in response to a user input triggering an image capture instruction, control the camera breathing light to flash at a first specific frequency (for example, flash once) while controlling the camera to capture an image. In this way, in a photographing scene, every time the user presses the shutter button, the breathing light flashes at a certain frequency, which lets the user visually perceive the moment at which the camera takes the picture, eliminates the user's confusion, and improves the user's photographing experience.
• In some embodiments, the controller 250 is configured to: in response to a user input triggering a video recording start instruction, control the camera breathing light to flash at a second specific frequency (for example, 5 times per second) while controlling the camera to record video; and in response to a user input triggering a video recording end instruction, control the camera breathing light to turn off while controlling the camera to end the video recording. In this way, in a recording scene, from the start of recording to its end, the breathing light keeps flashing at a certain frequency, which lets the user visually perceive that the camera is recording, eliminates the user's confusion, and improves the user's recording experience.
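The photographing and recording light behavior above can be sketched as a small state holder (names and values are illustrative assumptions):

```python
class RecordingIndicator:
    """Tracks the breathing light's behavior across the photo and
    video scenarios: one flash per capture, continuous flashing
    (e.g. 5 times/second) from recording start to recording end."""
    def __init__(self):
        self.flash_hz = 0  # 0 = not flashing

    def on_capture(self):
        # first specific frequency: a single flash per shutter press
        return "flash once"

    def on_record_start(self):
        self.flash_hz = 5  # second specific frequency

    def on_record_end(self):
        self.flash_hz = 0  # light off when recording ends

ind = RecordingIndicator()
ind.on_record_start(); print(ind.flash_hz)  # 5
ind.on_record_end();   print(ind.flash_hz)  # 0
```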
  • the above-mentioned user input for triggering the image capturing instruction includes, but is not limited to, a key input, voice input, gesture input, etc. that instruct to start capturing a picture.
  • the above-mentioned user input that triggers the video recording start instruction and the video recording end instruction includes but is not limited to key input, voice input, gesture input, etc. for instructing to start recording and end recording.
• Specifically, when the camera application receives a user input that triggers an image capture instruction, it passes in the first frequency parameter via the above-mentioned interface provided by the camera control service; the camera control service generates a breathing light flashing instruction according to the incoming first frequency parameter and sends it to the breathing light control module; the breathing light control module controls the camera breathing light to flash at the first specific frequency according to the breathing light flashing instruction.
• When the camera application receives a user input that triggers a video recording start instruction, it passes in the second frequency parameter via the above-mentioned interface provided by the camera control service; the camera control service generates a breathing light flashing instruction according to the second frequency parameter passed in by the camera application and sends it to the breathing light control module; the breathing light control module controls the camera breathing light to flash at the second specific frequency according to the breathing light flashing instruction.
• When the camera application receives a user input that triggers a video recording end instruction, it passes in the second control parameter via the above-mentioned interface provided by the camera control service; the camera control service generates a breathing light turn-off instruction according to the second control parameter passed in by the camera application and sends it to the breathing light control module; the breathing light control module controls the camera breathing light to turn off in response to receiving the breathing light turn-off instruction.
• The first frequency parameter and the second frequency parameter may each be any one of the values "1" to "5" mentioned in the above-mentioned embodiments.
• For example, the first frequency parameter is "1". In this way, when the camera application receives a user instruction to take a picture, the camera breathing light is controlled to flash once.
  • FIG. 14 is a schematic diagram of a camera breathing light control scene shown in the present application according to an exemplary embodiment.
  • the foreground application interface displayed by the display device is the camera preview interface provided by the “Magic Mirror” application.
• When the user inputs an instruction that triggers image capture, the display device controller responds to the instruction: the display device takes a picture through the camera, and at the same time the camera breathing light flashes once; if the user inputs the image capture instruction again, the display device takes another picture through the camera, and the camera breathing light flashes again. Each shot is thus accompanied by a flash of the breathing light at a certain frequency, which lets the user visually perceive the moment at which the camera takes the picture, eliminates the user's confusion, and improves the user's photographing experience.
  • Fig. 15 is a schematic diagram of a camera breathing light control scene shown in the present application according to an exemplary embodiment.
  • the foreground application interface displayed by the display device is the camera preview interface provided by the "Magic Mirror" application.
• When the user inputs an instruction that triggers video recording, the display device controller responds to the instruction: the display device records video through the camera, and at the same time the camera breathing light keeps flashing at a frequency of 5 times per second until the video recording ends, that is, until the user inputs an instruction to end the video recording.
  • the controller 250 is configured to control the camera breathing light to flash at a specific frequency in response to receiving a call request.
  • the call request may be a voice call request or a video call request.
• When the call application receives a call request sent by a friend's device, it controls the camera breathing light to flash at a specific frequency, so as to remind the user.
• Specifically, when the call application receives a call request, it passes in a specific frequency parameter via the above-mentioned interface provided by the camera control service; the camera control service generates a breathing light flashing instruction according to the frequency parameter passed in by the call application and sends the instruction to the breathing light control module; the breathing light control module controls the camera breathing light to flash at the specific frequency according to the breathing light flashing instruction.
  • FIG. 16 is a schematic diagram of a camera breathing light control scene according to an exemplary embodiment of the present application.
• As shown in FIG. 16, the display device displays a user interface when a call request is received. From the moment the display device receives the call request until it rejects or accepts the request, the camera breathing light keeps flashing at a specific frequency to remind the user.
  • the present application provides a display device, the display device includes a camera, a camera breathing light, and a controller; the controller is connected to the camera breathing light, and is used to obtain breathing light control parameters, the The breathing light control parameter corresponds to the lighting effect required by the application when using the camera; the camera breathing light is controlled according to the breathing light control parameter; the camera breathing light is used to present the lighting effect required by the application.
  • the camera is equipped with a breathing light function, and the breathing light of the camera can present the lighting effect required by the application when the camera is working, thereby improving the user experience.
  • any camera application can control the camera breathing light to present the required lighting effect according to the requirements, so as to achieve the purpose of reminding the user and improving the user experience.
• The control scenarios in which a camera application controls the camera breathing light are not limited to the scenarios listed in the embodiments of this application; other control scenarios implemented based on the display device provided in this application also fall within the protection scope of this application.
  • the fitness application can control the camera breathing light to present different lighting effects according to the difficulty, amplitude, intensity and other factors of the fitness action.
  • Embodiments of the present application also provide a method for controlling a breathing light of a camera.
  • the method includes but is not limited to being applied to the display device provided by the embodiments of the present application. As shown in FIG. 17 , the method may include:
  • Step 121 Obtain breathing light control parameters, where the breathing light control parameters correspond to lighting effects required by the application when using the camera;
  • Step 122 Control the breathing light of the camera according to the breathing light control parameter, so that the breathing light of the camera presents the lighting effect required by the application.
  • Embodiments of the present application further provide a method for controlling a breathing light of a camera.
  • the method includes but is not limited to being applied to the display device provided by the embodiments of the present application.
• the method may include: in response to a camera start instruction, controlling the camera breathing light to turn on while starting the camera; and when the on-time duration of the camera breathing light reaches a preset duration, controlling the camera breathing light to turn off.
• An embodiment of the present application further provides a method for controlling a camera breathing light, the method including but not limited to being applied to the display device provided by the embodiments of the present application; the method may include: in response to a user input triggering an image capture instruction, controlling the camera breathing light to flash at a first specific frequency while controlling the camera to capture an image; in response to a user input triggering a video recording start instruction, controlling the camera breathing light to flash at a second specific frequency while controlling the camera to record video; and in response to a user input triggering a video recording end instruction, controlling the camera breathing light to turn off while controlling the camera to end the video recording.
  • Embodiments of the present application further provide a method for controlling a breathing light of a camera.
  • the method includes but is not limited to being applied to the display device provided by the embodiments of the present application.
  • the method may include: in response to receiving a call request, controlling the breathing of the camera The light flashes at a specific frequency; in response to receiving a user input triggering acceptance or rejection of the call request, the camera breathing light is controlled to stop flashing.
  • the method for controlling the breathing light of the camera may include all the steps of configuring the controller of the display device of the present application.
  • the description in the embodiment of the display device please refer to the description in the embodiment of the display device, which will not be repeated here.
  • the present application also provides a computer storage medium, wherein the computer storage medium can store a program, and when the program is executed, the program can include the various embodiments of the method for the color of the peripheral device to follow the color change of the screen provided by the present invention. some or all of the steps.
  • the storage medium can be a magnetic disk, an optical disc, a read-only memory (English: read-only memory, abbreviated as: ROM) or a random access memory (English: random access memory, abbreviated as: RAM), etc.
  • the technology in the embodiments of the present invention can be implemented by means of software plus a necessary general hardware platform.
  • the technical solutions in the embodiments of the present invention may be embodied in the form of software products in essence or the parts that make contributions to the prior art, and the computer software products may be stored in a storage medium, such as ROM/RAM , magnetic disk, optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in various embodiments or some parts of the embodiments of the present invention.
  • a computer device which may be a personal computer, a server, or a network device, etc.

Abstract

The present application discloses a display method and a display device. A controller divides a user picture into multiple color extraction regions and establishes a binding relationship between each peripheral device and each color extraction region; obtains a color feature value of the picture content presented in each color extraction region; converts the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region; and sends the RGB mean value of each color extraction region to the corresponding peripheral device, which then presents the color corresponding to the RGB mean value.

Description

A display method and display device
This application claims priority to Chinese patent application No. 202011049293.4, filed with the China National Intellectual Property Administration on September 29, 2020 and entitled "Display device and camera breathing light control method", and to Chinese patent application No. 202010982475.0, filed with the China National Intellectual Property Administration on September 17, 2020 and entitled "Method for a peripheral device's color to follow the color change of a picture, and display device"; the entire contents of both are incorporated herein by reference.
Technical Field
The present application relates to the technical field of display devices, and in particular to a display method and a display device.
Background
With the rapid development of display devices, their functions are becoming richer and their performance more powerful. At present, display devices include smart TVs, smart set-top boxes, smart boxes, and products with smart display screens. Taking smart TVs as an example, they are used in more and more scenarios: not only as a device for watching TV programs at home, but also for games, electronic photo albums, information display, and so on. At the same time, the interaction capability between smart TVs and peripheral devices has also developed rapidly, mainly reflected in somatosensory games with human-computer interaction.
Summary
Embodiments of the present application provide a display method and a display device.
In a first aspect, the present application provides a display device, including:
a display configured to display a user picture;
peripheral devices configured to present different colors; and
a controller connected to the display and the peripheral devices, the controller being configured to:
divide the user picture into multiple color extraction regions and establish a binding relationship between each peripheral device and each color extraction region, the number of peripheral devices being the same as the number of color extraction regions;
obtain a color feature value of the picture content presented in each color extraction region;
convert the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region; and
based on the binding relationship, send the RGB mean value of each color extraction region to the corresponding peripheral device, which presents the color corresponding to the RGB mean value.
Brief Description of the Drawings
To describe the technical solutions of the present application more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, a person of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control apparatus according to some embodiments;
FIG. 2 exemplarily shows a hardware configuration block diagram of the display device 200 according to some embodiments;
FIG. 3 exemplarily shows a hardware configuration block diagram of the control device 100 according to some embodiments;
FIG. 4 exemplarily shows a structural block diagram of a display device according to some embodiments;
FIG. 5 exemplarily shows a schematic diagram of a device list according to some embodiments;
FIG. 6 exemplarily shows a schematic diagram of a display device provided with multiple peripheral devices according to some embodiments;
FIG. 7 exemplarily shows a flowchart of a method for a peripheral device's color to follow the color change of a picture according to some embodiments;
FIG. 8 exemplarily shows a schematic diagram of color extraction regions according to some embodiments;
FIG. 9 exemplarily shows a flowchart of one method of extracting color feature values according to some embodiments;
FIG. 10 exemplarily shows a flowchart of another method of extracting color feature values according to some embodiments;
FIG. 11 is a schematic diagram of the display device 200 according to an exemplary embodiment of the present application;
FIG. 12 is a schematic diagram of the display device 200 according to another exemplary embodiment of the present application;
FIG. 13 is a schematic diagram of an implementation scenario according to an exemplary embodiment of the present application;
FIG. 14 is a schematic diagram of an implementation scenario according to an exemplary embodiment of the present application;
FIG. 15 is a schematic diagram of an implementation scenario according to an exemplary embodiment of the present application;
FIG. 16 is a schematic diagram of an implementation scenario according to an exemplary embodiment of the present application;
FIG. 17 is a flowchart of a camera breathing light control method according to an exemplary embodiment of the present application.
Detailed Description
To make the purposes and embodiments of the present application clearer, exemplary embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief explanations of terms in this application are only intended to facilitate understanding of the embodiments described below, not to limit the embodiments of the present application. Unless otherwise stated, these terms should be understood according to their ordinary and usual meanings.
The terms "first", "second", "third", etc. in the specification, claims, and drawings of the present application are used to distinguish similar or like objects or entities and do not necessarily imply a specific order or sequence unless otherwise noted. It should be understood that terms so used are interchangeable where appropriate.
The terms "include" and "have" and any variations thereof are intended to cover a non-exclusive inclusion; for example, a product or device comprising a series of components is not necessarily limited to all of the components explicitly listed, but may include other components not explicitly listed or inherent to the product or device.
The term "module" refers to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of performing the function associated with the element.
FIG. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in FIG. 1, a user can operate the display device 200 through a smart device 300 or the control apparatus 100.
The control apparatus 100 may be a remote control; communication between the remote control and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication, controlling the display device 200 wirelessly or by wire. The user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
In some embodiments, a smart device 300 (such as a mobile terminal, tablet, computer, or laptop) may also be used to control the display device 200, for example, through an application running on the smart device.
In some embodiments, the display device 200 may also be controlled in ways other than through the control apparatus 100 and the smart device 300; for example, a user's voice instructions may be received directly by a module for acquiring voice commands configured inside the display device 200, or received through a voice control device provided outside the display device 200.
In some embodiments, the display device 200 also performs data communication with a server 400. The display device 200 may be allowed to communicate over a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 400 can provide various content and interaction to the display device 200.
FIG. 2 exemplarily shows a configuration block diagram of the control apparatus 100 according to an exemplary embodiment. As shown in FIG. 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 can receive a user's input operation instructions and convert them into instructions that the display device 200 can recognize and respond to, acting as an interaction intermediary between the user and the display device 200.
FIG. 3 shows a hardware configuration block diagram of the display device 200 according to an exemplary embodiment.
The display device 200 includes at least one of a tuner-demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
The display 260 includes a display screen assembly for presenting pictures and a driving assembly for driving image display, and is used to receive image signals output from the controller and display video content, image content, menu control interfaces, and user-operated UI interfaces.
The display 260 may be a liquid crystal display, an OLED display, or a projection display, and may also be a projection apparatus and a projection screen.
The communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example, the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module or another network communication protocol chip or near-field communication protocol chip, and an infrared receiver. The display device 200 can establish the sending and receiving of control signals and data signals with the external control device 100 or the server 400 through the communicator 220.
The user interface can be used to receive control signals from the control apparatus 100 (e.g., an infrared remote control).
The detector 230 is used to collect signals from the external environment or signals of interaction with the outside. For example, the detector 230 includes a light receiver, a sensor for collecting ambient light intensity; or the detector 230 includes an image collector, such as a camera, which can be used to collect external environment scenes, user attributes, or user interaction gestures; or the detector 230 includes a sound collector, such as a microphone, for receiving external sounds.
The external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, and the like. It may also be a composite input/output interface formed by several of the above interfaces.
The controller 250 and the tuner-demodulator 210 may be located in different separate devices; that is, the tuner-demodulator 210 may also be in an external device of the main device in which the controller 250 is located, such as an external set-top box.
The controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in the memory. The controller 250 controls the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object displayed on the display 260, the controller 250 can perform an operation related to the object selected by the user command.
The object may be any one of the selectable objects, such as a hyperlink, an icon, or another operable control. Operations related to the selected object include displaying a hyperlinked page, document, image, or the like, or executing the program corresponding to the icon.
In some embodiments, a user may input a user command in a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the GUI. Alternatively, the user may input a user command through a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
A "user interface" is a medium interface for interaction and information exchange between an application or operating system and a user; it realizes the conversion between an internal form of information and a form acceptable to the user. A common form of user interface is a graphical user interface (GUI), which refers to a user interface related to computer operations displayed graphically. It may be an interface element such as an icon, window, or control displayed on the screen of an electronic device, where controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
First aspect:
In some embodiments, when the display device interacts with a peripheral device as the display end, the peripheral device serves only as an input device. The interaction content between the display device and the peripheral device is presented only on the display device, making the display device's content relatively closed; the display device does not share information with the peripheral device, so the peripheral device does not change correspondingly during the interaction. This makes the content presentation in interactive scenarios rather monotonous and gives a poor user experience.
To help set the atmosphere and improve the user experience when the user uses the display device in entertainment interaction scenarios such as gaming and singing, embodiments of the present application provide a display device that, in scenarios such as playing TV pictures, gaming, or singing, can control the display color of peripheral devices to change with the color of the user picture.
To realize the color change of the peripheral device, a device capable of emitting light and changing color may be selected as the peripheral device interacting with the display device, for example, a color-changeable bulb or an LED light strip. Having the peripheral device's display color change with the color of the display device's user picture can enhance the atmosphere of watching the displayed content, enrich the presentation forms of the content, and improve the user's experience of using the display device.
FIG. 4 exemplarily shows a structural block diagram of a display device according to some embodiments. To this end, an embodiment of the present application provides a display device 200; referring to FIG. 4, it includes a display 275, peripheral devices 201, and a controller 250. The display 275 is configured to display a user picture; a peripheral device 201 is a device capable of emitting light and changing color, configured to present different colors; the controller 250 is connected to the display 275 and the peripheral devices 201 and synchronizes the color of the user picture displayed on the display 275 to the peripheral devices 201 in real time, so that the display color of the peripheral devices 201 is updated in synchronization with the color of the user picture.
To present the colors of the user picture more accurately with the peripheral devices and improve the atmosphere-setting effect, in some embodiments multiple peripheral devices may interact with the display device at the same time. For example, four peripheral devices may be provided, each connected to the display device; the connection may be wired or wireless, as long as all devices are in the same local area network.
When a wireless connection is used, the display device is connected to a router, which assigns it a LAN IP; the peripheral devices are then connected to the router, which assigns them LAN IPs. Through the router, the display device and every peripheral device are connected on the same network segment, and the display device can obtain all peripheral devices connected to the LAN by scanning.
FIG. 5 exemplarily shows a schematic diagram of a device list according to some embodiments. Referring to FIG. 5, when scanning, the display device sets all the IP segments to be scanned within its own LAN, scans through network commands, obtains the IP address of each peripheral device, and displays each scanned peripheral device in the device list. If four peripheral devices are scanned, the names of the four peripheral devices are displayed in the device list.
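The LAN scan described above (enumerating the IP segment and probing each address) can be sketched in Python as follows. This is a minimal sketch under assumptions: the patent does not name a scan command or a device port, so the TCP control port 5577 and the `probe` helper are purely illustrative.

```python
import ipaddress
import socket

def candidate_ips(subnet: str):
    """Enumerate every host address in the IP segment to be scanned."""
    return [str(ip) for ip in ipaddress.ip_network(subnet).hosts()]

def probe(ip: str, port: int = 5577, timeout: float = 0.2) -> bool:
    """Return True if a device answers on the (hypothetical) control port."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

# A /24 segment yields 254 candidate host addresses to scan; devices that
# answer the probe would then be added to the device list.
hosts = candidate_ips("192.168.1.0/24")
```

In practice the scan result feeds the device list shown in FIG. 5, where names can then be edited or unrelated devices removed.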
Based on the device list, the name of each peripheral device can also be modified, and other devices unrelated to color presentation can be deleted from the device list.
FIG. 6 exemplarily shows a schematic diagram of a display device provided with multiple peripheral devices according to some embodiments. Referring to FIG. 6, multiple peripheral devices may be arranged around the display of the display device. Taking four peripheral devices as an example, the first peripheral device 201a is arranged above the display, the second peripheral device 201b below the display, the third peripheral device 201c on the left of the display, and the fourth peripheral device 201d on the right of the display.
The four peripheral devices arranged around the display 275 can respectively present the colors presented at the corresponding positions of the user picture: the first peripheral device 201a presents the color at the top of the user picture, the second peripheral device 201b the color at the bottom, the third peripheral device 201c the color on the left, and the fourth peripheral device 201d the color on the right.
Each peripheral device corresponds to a position of the user picture on the display, so that the display color of each peripheral device 201 can be updated in synchronization with the color of the corresponding position of the user picture, improving the effect of the peripheral devices' display colors.
FIG. 7 exemplarily shows a flowchart of a method for a peripheral device's color to follow the color change of a picture according to some embodiments. In the display device provided by an embodiment of the present application, referring to FIG. 7, the configured controller 250, when executing the method for a peripheral device's color to follow the color change of a picture, is configured to perform the following steps:
S1: Divide the user picture into multiple color extraction regions and establish a binding relationship between each peripheral device and each color extraction region, the number of peripheral devices being the same as the number of color extraction regions.
Since multiple peripheral devices may interact with the display device at the same time, and the display color of each peripheral device follows only the color change at the corresponding position of the user picture, the user picture can be divided into multiple color extraction regions so that the color each peripheral device should present can be determined accurately; a color extraction region provides the color presented at the corresponding position of the user picture.
When dividing the regions, in some embodiments, the controller, in dividing the user picture into multiple color extraction regions, is further configured to perform the following steps:
Step 11: Obtain the user picture presented by the display.
Step 12: Divide the user picture into regions according to a preset division rule to obtain multiple color extraction regions, the color extraction regions not overlapping each other.
The controller obtains the user picture presented by the display and divides it into multiple color extraction regions according to the preset division rule. The preset division rule may be to divide the user picture into multiple mutually non-overlapping color extraction regions, the sum of whose areas may be less than or equal to the complete area presented by the user picture.
FIG. 8 exemplarily shows a schematic diagram of color extraction regions according to some embodiments. Referring to FIG. 8, when the user picture is divided into four color extraction regions, their positions may be at the top, bottom, left, and right of the user picture: the first color extraction region A1 at the top of the user picture, the second region A2 at the bottom, the third region A3 on the left, and the fourth region A4 on the right.
The number of color extraction regions is the same as the number of peripheral devices, so that the regions and the devices have a one-to-one binding relationship: one peripheral device receives the color of one color extraction region. In this way each peripheral device can display the color of the user picture at the corresponding region position, avoiding confusion and achieving the effect of different peripheral devices displaying different colors.
When establishing the one-to-one binding between a peripheral device and a color extraction region, triggering the name of the target peripheral device in the device list makes the corresponding device flash, so that the device about to be bound can be identified accurately. Then, based on the user's personalized setting, the flashing peripheral device can be bound to one of the color extraction regions. For consistency of the colors presented by the peripheral devices, the binding between peripheral devices and color extraction regions can be established according to the rule that their set positions are the same.
The set positions of the peripheral devices correspond one-to-one to the positions of the color extraction regions; for example, the first peripheral device 201a above the display corresponds to the first color extraction region A1 at the top of the user picture, the second peripheral device 201b below the display to the second region A2 at the bottom, the third peripheral device 201c on the left to the third region A3 on the left, and the fourth peripheral device 201d on the right to the fourth region A4 on the right.
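The position-matched, one-to-one binding can be sketched as a simple mapping. The device and region labels below are illustrative placeholders, not identifiers from the patent.

```python
# Peripheral devices and color extraction regions, listed in the same
# positional order (top, bottom, left, right).
devices = ["201a_top", "201b_bottom", "201c_left", "201d_right"]
regions = ["A1_top", "A2_bottom", "A3_left", "A4_right"]

# One-to-one binding relationship: each peripheral device receives the
# color of exactly one color extraction region.
binding = dict(zip(devices, regions))
```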
S2: Obtain the color feature value of the picture content presented in each color extraction region.
The divided color extraction regions provide the colors to be displayed by the corresponding peripheral devices; therefore, color extraction can be performed on the picture content presented in each color extraction region to determine the color feature value corresponding to each region. The color feature value characterizes the color standard value of the corresponding color extraction region.
When determining the color feature values, they can be extracted directly from each color extraction region, or the user picture currently displayed can first be obtained (i.e., a screenshot taken) and the color feature value of each region then extracted from the screenshot.
FIG. 9 exemplarily shows a flowchart of one method of extracting color feature values according to some embodiments. In some embodiments, when direct extraction is used, referring to FIG. 9, the controller, in obtaining the color feature value of the picture content presented in each color extraction region, is further configured to perform the following steps:
S211: Obtain the color histogram of the picture content presented in each color extraction region.
S212: Extract the corresponding red pixel sum value, green pixel sum value, and blue pixel sum value from each color histogram as the color feature value corresponding to each color extraction region.
Since each peripheral device presents only the color of the picture content of the color extraction region bound to it, the controller can directly extract the color histogram of the picture content presented in each region. A color histogram is a color feature that describes the proportion of different colors in the whole image.
The color feature value extracted from the color histogram corresponding to a given color extraction region includes a red pixel sum value, a green pixel sum value, and a blue pixel sum value. The red pixel sum value, 64RColor, is the sum of the red pixel values of all colors presented in the picture content corresponding to the region; the green pixel sum value, 64GColor, is the sum of the green pixel values; and the blue pixel sum value, 64BColor, is the sum of the blue pixel values.
Each color extraction region corresponds to one color feature value; for example, the first region A1 corresponds to the color feature value C1(64R1Color, 64G1Color, 64B1Color), the second region A2 to C2(64R2Color, 64G2Color, 64B2Color), the third region A3 to C3(64R3Color, 64G3Color, 64B3Color), and the fourth region A4 to C4(64R4Color, 64G4Color, 64B4Color).
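The color feature value of one region (64RColor, 64GColor, 64BColor) is, in effect, a channel-wise sum over the region's pixels. A minimal Python sketch; the tiny sample region is illustrative, not data from the patent.

```python
def color_feature_value(pixels):
    """Sum each channel over all (r, g, b) pixels of one extraction region.
    Returns (red_sum, green_sum, blue_sum), i.e. (64RColor, 64GColor, 64BColor)."""
    red_sum = sum(p[0] for p in pixels)
    green_sum = sum(p[1] for p in pixels)
    blue_sum = sum(p[2] for p in pixels)
    return red_sum, green_sum, blue_sum

# A tiny illustrative 2x2 region of four pixels.
region = [(255, 0, 0), (255, 0, 0), (0, 128, 0), (0, 0, 64)]
feature = color_feature_value(region)  # (510, 128, 64)
```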
FIG. 10 exemplarily shows a flowchart of another method of extracting color feature values according to some embodiments. In some embodiments, when extraction via screenshot is used, referring to FIG. 10, the controller, in obtaining the color feature value of the picture content presented in each color extraction region, is further configured to perform the following steps:
S221: Capture the complete picture presented in the user picture.
S222: Divide the complete picture according to the division rule of the color extraction regions, and determine the partial picture content corresponding to each color extraction region.
S223: Obtain the color histogram of the partial picture content presented in each color extraction region.
S224: Extract the corresponding red pixel sum value, green pixel sum value, and blue pixel sum value from each color histogram as the color feature value corresponding to each color extraction region.
When the screenshot approach is used, the picture presented by the user picture at a certain moment is first captured to obtain the complete picture. The color histogram of each part of the complete picture is then extracted separately. The parts of the complete picture are determined according to the same division rule as the color extraction regions: the complete picture is divided into multiple partial pictures, one partial picture per part, the partial pictures not overlapping each other, and the sum of their areas being less than or equal to the total area of the complete picture.
The divided partial pictures do not overlap, which prevents the color of each partial picture from being affected, so that the representative color of each part can be extracted accurately and presented on the peripheral device. For example, if partial picture B1 overlapped partial picture B2, with B1 overall red and B2 overall yellow, then after overlapping the overall color of B1 would appear reddish-yellow and that of B2 yellowish-red, so the finally presented colors would not be the colors the corresponding partial pictures should originally present.
The number of partial pictures into which the complete picture is divided equals the number of color extraction regions, with the same positions: if there are four color extraction regions at the top, bottom, left, and right of the user picture, the complete picture is also divided into four partial pictures at the top, bottom, left, and right of the complete picture.
After the complete picture is divided, each color extraction region corresponds to one partial picture content, and the color presented by that partial picture content is taken as the color of the region. To determine the color presented by a given partial picture content, its color histogram can be obtained.
The color feature value extracted from the color histogram corresponding to the partial picture content likewise includes the red pixel sum value 64RColor, the green pixel sum value 64GColor, and the blue pixel sum value 64BColor, defined as in the direct-extraction case above.
The color feature value of each partial picture content is taken as the color feature value of the corresponding color extraction region, one per region; for example, the feature value C1(64R1Color, 64G1Color, 64B1Color) of partial picture content B1 is the color feature value of the first region A1, C2(64R2Color, 64G2Color, 64B2Color) of B2 that of the second region A2, C3(64R3Color, 64G3Color, 64B3Color) of B3 that of the third region A3, and C4(64R4Color, 64G4Color, 64B4Color) of B4 that of the fourth region A4.
S3: Convert the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region.
After the color feature value (red, green, and blue pixel sum values) of each color extraction region is determined, it can be converted into an RGB mean value in the RGB color space. The conversion uses the pixel area of each region and determines the RGB mean value by averaging the three primary colors (red, green, blue) of the RGB color space.
In some embodiments, the controller, in converting the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region, is further configured to perform the following steps:
Step 31: Calculate the pixel area of each color extraction region.
Step 32: Calculate the red pixel mean value from the red pixel sum value and the pixel area, the green pixel mean value from the green pixel sum value and the pixel area, and the blue pixel mean value from the blue pixel sum value and the pixel area, and take the red, green, and blue pixel mean values as the RGB mean value in the RGB color space.
When dividing, each color extraction region may be divided as a rectangle, a circle, or another shape. For example, for a rectangle, the pixel coordinates of two vertices of the region can be obtained to determine its pixel area. A coordinate system is established based on the user picture, with the origin at the upper-left corner of the user picture, the positive X-axis pointing right along the picture and the positive Y-axis pointing down.
The two vertices are the upper-left and lower-right points of the rectangle. In the coordinate system, the pixel coordinates (X1, Y1) of the upper-left point P1 and (X2, Y2) of the lower-right point P2 of the rectangle corresponding to the region are obtained, and the pixel area of the region is S = (X2 - X1) * (Y2 - Y1). The pixel area of every color extraction region is calculated in this way, giving the areas S1, S2, S3, and S4 of the four regions.
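The rectangle-area computation S = (X2 - X1) * (Y2 - Y1) from the two corner points is direct to express; the example coordinates below are illustrative.

```python
def pixel_area(p1, p2):
    """Pixel area of a rectangular color extraction region, given its
    upper-left corner p1 = (x1, y1) and lower-right corner p2 = (x2, y2)
    in a coordinate system whose origin is the picture's upper-left corner."""
    x1, y1 = p1
    x2, y2 = p2
    return (x2 - x1) * (y2 - y1)

# e.g. a top strip of a 1920x1080 picture, 960 px wide and 270 px tall.
area = pixel_area((0, 0), (960, 270))
```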
When calculating the mean values, the color feature value of each color extraction region (red, green, and blue pixel sum values) is divided by the corresponding pixel area, converting the color feature value into the RGB mean value in the RGB color space.
When converting the color feature value of a given region into an RGB mean value: the red pixel mean value red is the quotient of the red pixel sum value and the pixel area, red = 64RColor/S; the green pixel mean value green = 64GColor/S; and the blue pixel mean value blue = 64BColor/S. Finally, Color.RGB(red, green, blue) is taken as the RGB mean value in the RGB color space, completing the conversion of the color feature value into the RGB mean value.
For example, when converting the color feature value C1(64R1Color, 64G1Color, 64B1Color) of the first region A1 into an RGB mean value, the pixel area S1 is obtained and red1 = 64R1Color/S1, green1 = 64G1Color/S1, and blue1 = 64B1Color/S1 are calculated; Color.RGB(red1, green1, blue1) is then the RGB mean value of A1, converting C1 into an RGB mean value. The second, third, and fourth regions are converted in the same way with their pixel areas S2, S3, and S4, yielding Color.RGB(red2, green2, blue2), Color.RGB(red3, green3, blue3), and Color.RGB(red4, green4, blue4) as the RGB mean values of A2, A3, and A4, converting C2, C3, and C4 respectively.
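The conversion red = 64RColor/S, green = 64GColor/S, blue = 64BColor/S can be sketched as below. Rounding the quotient to an integer channel value is an assumption for illustration; the patent does not state how fractional quotients are handled.

```python
def to_rgb_mean(feature, area):
    """Convert (red_sum, green_sum, blue_sum) into the per-channel mean
    over the region's pixel area, clamped to the 0-255 channel range."""
    return tuple(min(255, round(channel_sum / area)) for channel_sum in feature)

# Region of 4 pixels with channel sums (510, 128, 64), as in the earlier sums:
rgb_mean = to_rgb_mean((510, 128, 64), 4)  # (128, 32, 16)
```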
S4: Based on the binding relationship, send the RGB mean value of each color extraction region to the corresponding peripheral device, which presents the color corresponding to the RGB mean value.
After the RGB mean value of each color extraction region is determined, it can be sent to the corresponding peripheral device for display. Each color extraction region has a one-to-one binding relationship with a peripheral device, so the controller can send each region's RGB mean value to the corresponding peripheral device based on the binding relationship.
For example, the RGB mean value Color.RGB(red1, green1, blue1) of the first region A1 is sent to the first peripheral device 201a, which presents the corresponding color; likewise, Color.RGB(red2, green2, blue2) is sent to the second peripheral device 201b, Color.RGB(red3, green3, blue3) to the third peripheral device 201c, and Color.RGB(red4, green4, blue4) to the fourth peripheral device 201d, each device presenting the color corresponding to the RGB mean value it receives.
When sending an RGB mean value to the corresponding peripheral device, the controller may first pack the RGB mean value into a network packet. The peripheral device receives the network packet, parses it to obtain the corresponding color value, and displays that color, changing its display color.
In some embodiments, the controller obtains the color feature value of each region every 300 milliseconds, converts it into an RGB mean value, and the peripheral device displays the corresponding color, so that the colors displayed jointly by the user picture and the peripheral devices present a gradual color-change effect. This ensures the display colors of the user picture and the peripheral devices are displayed synchronously, the peripheral devices' display colors changing as the user picture changes.
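Packing an RGB mean value into a small network packet for the bound device, with a 300 ms update loop, can be sketched as follows. The 3-byte payload layout and UDP transport are assumptions for illustration; the patent does not specify the packet format.

```python
import struct

def pack_rgb(red: int, green: int, blue: int) -> bytes:
    """Pack one RGB mean value into a 3-byte payload (one byte per channel)."""
    return struct.pack("BBB", red, green, blue)

def unpack_rgb(packet: bytes):
    """Peripheral-side parsing: recover the color value from the packet."""
    return struct.unpack("BBB", packet)

packet = pack_rgb(128, 32, 16)

# Controller-side update loop (sketch only): every 300 ms, recompute each
# region's RGB mean and send it to the bound peripheral over the LAN, e.g.:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   while True:
#       for region, (ip, port) in bindings.items():
#           sock.sendto(pack_rgb(*rgb_mean_of(region)), (ip, port))
#       time.sleep(0.3)
```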
It can be seen from the above technical solutions that, in the display device provided by the embodiments of the present application, the controller divides the user picture into multiple color extraction regions, establishes a binding relationship between each peripheral device and each color extraction region, obtains the color feature value of the picture content presented in each region, converts the corresponding color feature value into an RGB mean value in the RGB color space based on each region's pixel area, and sends each region's RGB mean value to the corresponding peripheral device, which presents the corresponding color. Thus, by obtaining the colors of the user picture and sending them to the peripheral devices for display, the display colors of the peripheral devices can change with the colors of the user picture, enriching the presentation forms of the displayed content and improving the user's experience of using the display device.
FIG. 7 exemplarily shows a flowchart of a method for a peripheral device's color to follow the color change of a picture according to some embodiments. An embodiment of the present application provides such a method, executed by the controller configured in the display device provided by the foregoing embodiments, the method including:
S1: Dividing the user picture presented on the display into multiple color extraction regions and establishing a binding relationship between each peripheral device and each color extraction region, the number of peripheral devices being the same as the number of color extraction regions;
S2: Obtaining the color feature value of the picture content presented in each color extraction region;
S3: Converting the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region;
S4: Based on the binding relationship, sending the RGB mean value of each color extraction region to the corresponding peripheral device, which presents the color corresponding to the RGB mean value.
In some embodiments of the present application, dividing the user picture into multiple color extraction regions includes: obtaining the user picture presented by the display; and dividing the user picture into regions according to a preset division rule to obtain multiple color extraction regions, the color extraction regions not overlapping each other.
In some embodiments of the present application, obtaining the color feature value of the picture content presented in each color extraction region includes: obtaining the color histogram of the picture content presented in each region; and extracting the corresponding red pixel sum value, green pixel sum value, and blue pixel sum value from each color histogram as the color feature value corresponding to each region.
In some embodiments of the present application, obtaining the color feature value of the picture content presented in each color extraction region includes: capturing the complete picture presented in the user picture; dividing the complete picture according to the division rule of the color extraction regions to determine the partial picture content corresponding to each region; obtaining the color histogram of the partial picture content presented in each region; and extracting the corresponding red, green, and blue pixel sum values from each color histogram as the color feature value corresponding to each region.
In some embodiments of the present application, the color feature value includes a red pixel sum value, a green pixel sum value, and a blue pixel sum value; and converting the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region includes: calculating the pixel area of each region; calculating the red pixel mean value from the red pixel sum value and the pixel area, the green pixel mean value from the green pixel sum value and the pixel area, and the blue pixel mean value from the blue pixel sum value and the pixel area; and taking the red, green, and blue pixel mean values as the RGB mean value in the RGB color space.
Second aspect:
An embodiment of the present application further provides a display device, including a camera, a camera breathing light, and a controller;
the controller, connected to the camera breathing light, is used to obtain breathing light control parameters, the breathing light control parameters corresponding to the lighting effect required by an application when using the camera, and to control the camera breathing light according to the breathing light control parameters;
the camera breathing light is used to present the lighting effect required by the application.
An embodiment of the present application further provides a display device that also includes a breathing light control module connected to the controller; the controller controlling the camera breathing light according to the breathing light control parameters includes:
the controller generating a breathing light control instruction according to the breathing light control parameters and sending the breathing light control instruction to the breathing light control module;
the breathing light control module, connected to the camera breathing light, controlling the camera breathing light according to the received breathing light control instruction.
An embodiment of the present application further provides a display device in which the controller includes a camera control service that provides an interface for applications to pass in breathing light control parameters; the controller obtaining the breathing light control parameters includes: the camera control service receiving the breathing light control parameters passed in by an application through the interface; and, when the camera control service receives the breathing light control parameters, generating a breathing light control instruction according to them and sending the instruction to the breathing light control module.
An embodiment of the present application further provides a display device in which the breathing light control parameter is a parameter indicating that the camera breathing light should stay on, a parameter indicating that the camera breathing light should flash and at what frequency, or a parameter indicating that the camera breathing light should be turned off.
An embodiment of the present application further provides a display device, including a camera, a camera breathing light, and a controller;
the controller, connected to the camera breathing light and the camera respectively, is used to, in response to a camera start instruction, control the camera breathing light to turn on while starting the camera, and to control the camera breathing light to turn off when the on-time of the camera breathing light reaches a preset duration.
An embodiment of the present application further provides a display device that also includes a breathing light control module connected to the controller; the controller includes a camera control service providing an interface for applications to pass in breathing light control parameters;
the controller, in response to a camera start instruction, controlling the camera breathing light to turn on while starting the camera includes:
the camera control service receiving a first control parameter passed in through the interface by the application that instructed the camera to start, generating a breathing light on instruction according to the first control parameter, and sending the breathing light on instruction to the breathing light control module;
the breathing light control module, in response to receiving the breathing light on instruction, controlling the camera breathing light to turn on;
the controller controlling the camera breathing light to turn off when its on-time reaches the preset duration includes:
the camera control service receiving a second control parameter, passed in through the interface by the application that instructed the camera to start when the on-time of the camera breathing light reaches the preset duration, generating a breathing light off instruction according to the second control parameter, and sending the breathing light off instruction to the breathing light control module;
the breathing light control module, further in response to receiving the breathing light off instruction, controlling the camera breathing light to turn off.
An embodiment of the present application further provides a display device, including a camera, a camera breathing light, and a controller;
the camera is used to capture images or record videos;
the controller, connected to the camera breathing light, is used to:
in response to a user input triggering an image capture instruction, control the camera breathing light to flash at a first specific frequency while controlling the camera to capture an image;
or, in response to a user input triggering a video recording start instruction, control the camera breathing light to flash at a second specific frequency while controlling the camera to record video; and, in response to a user input triggering a video recording end instruction, control the camera breathing light to turn off while controlling the camera to end video recording.
An embodiment of the present application further provides a display device that also includes a breathing light control module connected to the controller; the controller includes a camera control service providing an interface for applications to pass in breathing light control parameters;
the controller, in response to a user input triggering an image capture instruction, controlling the camera breathing light to flash at a first specific frequency while controlling the camera to capture an image includes:
the camera control service receiving a first frequency parameter, passed in through the interface by the application when it receives the user input triggering the image capture instruction, generating a breathing light flash instruction according to the first frequency parameter, and sending the flash instruction to the breathing light control module;
the breathing light control module controlling the camera breathing light to flash at the first specific frequency according to the flash instruction.
An embodiment of the present application further provides a display device in which the controller, in response to a user input triggering a video recording start instruction, controlling the camera breathing light to flash at a second specific frequency while controlling the camera to record video, includes:
the camera control service receiving a second frequency parameter, passed in through the interface by the application when it receives the user input triggering the video recording instruction, generating a breathing light flash instruction according to the second frequency parameter, and sending the flash instruction to the breathing light control module;
the breathing light control module controlling the camera breathing light to flash at the second specific frequency according to the flash instruction.
An embodiment of the present application further provides a display device, including a camera, a camera breathing light, and a controller;
the controller, connected to the camera breathing light, is used to, in response to receiving a call request, control the camera breathing light to flash at a specific frequency.
An embodiment of the present application further provides a display device control method, applied to a display device including a camera and a camera breathing light, the method including:
obtaining breathing light control parameters, the breathing light control parameters corresponding to the lighting effect required by an application when using the camera;
controlling the camera breathing light according to the breathing light control parameters, so that the camera breathing light presents the lighting effect required by the application.
An embodiment of the present application further provides a display device control method, applied to a display device including a camera and a camera breathing light, the method including:
in response to a camera start instruction, controlling the camera breathing light to turn on while starting the camera;
controlling the camera breathing light to turn off when the on-time of the camera breathing light reaches a preset duration.
An embodiment of the present application further provides a display device control method, applied to a display device including a camera and a camera breathing light, the method including:
in response to a user input triggering an image capture instruction, controlling the camera breathing light to flash at a first specific frequency while controlling the camera to capture an image;
in response to a user input triggering a video recording start instruction, controlling the camera breathing light to flash at a second specific frequency while controlling the camera to record video;
in response to a user input triggering a video recording end instruction, controlling the camera breathing light to turn off while controlling the camera to end video recording.
An embodiment of the present application further provides a display device control method, applied to a display device including a camera and a camera breathing light, the method including:
in response to receiving a call request, controlling the camera breathing light to flash at a specific frequency;
in response to receiving a user input triggering acceptance or rejection of the call request, controlling the camera breathing light to stop flashing.
FIG. 11 is a schematic diagram of the appearance of a display device according to an exemplary embodiment of the present application. As shown in FIG. 11, the display device includes a camera 231 and one camera breathing light 2311. The camera breathing light can present different lighting effects under the control of the display device controller to indicate the working state of the camera. For example, with a continuous high-level input the breathing light presents a constantly-on lighting effect, with a continuous low-level input a constantly-off effect, and with alternating high- and low-level inputs a flashing effect.
FIG. 12 is a schematic diagram of the appearance of a display device according to another exemplary embodiment of the present application. Unlike the display device shown in FIG. 11, this display device includes a camera 231 and multiple camera breathing lights 2311. It should be understood that the multiple camera breathing lights may be controlled synchronously or individually by the display device controller.
It should be noted that the camera may be built into the display device, or be an external camera connected to the display device controller through an external device interface. Likewise, the camera breathing light may be built into the display device, or be an external breathing light connected to the display device through an external device interface.
In some embodiments, so that the camera breathing light can present the lighting effect required by a camera application when the application starts, uses, and closes the camera, the controller 250 is configured to: obtain breathing light control parameters corresponding to the lighting effect required by the camera application when using the camera; and control the camera breathing light according to the breathing light control parameters, so that the light presents the lighting effect required by the camera application.
In the above embodiments, a camera application is an application that can use the camera, such as a "魔镜" (Magic Mirror) application, a "嗨见" video chat application, a "K歌" (karaoke) application, and so on.
In the above embodiments, the breathing light control parameter may be a parameter indicating that the camera breathing light should stay on (for example, "255" is predefined as the stay-on parameter), a parameter indicating that the light should flash and at what frequency (for example, "1~5" is predefined to indicate flashing at 1-5 times per second), or a parameter indicating that the light should be turned off (for example, "0" is predefined as the off parameter).
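The parameter convention above ("0" for off, "255" for always-on, "1"~"5" for a flash frequency in times per second) can be sketched as a small decoder on the breathing light control module side; the function name and return shape are illustrative assumptions.

```python
def decode_breathing_light_param(param: str):
    """Map a breathing light control parameter to a lighting effect.
    "0" -> off, "255" -> always on, "1".."5" -> flash at n times/second."""
    value = int(param)
    if value == 0:
        return ("off", None)
    if value == 255:
        return ("on", None)
    if 1 <= value <= 5:
        return ("flash", value)
    raise ValueError(f"unknown breathing light control parameter: {param}")
```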
In some embodiments, the controller 250 is directly electrically connected to the camera breathing light, so the controller 250 can control the high/low-level input of the camera breathing light according to the obtained breathing light control parameters, making the light present the lighting effect corresponding to the parameters.
In other embodiments, the camera has a separate microcontroller; the controller 250 is connected to the microcontroller and the microcontroller to the camera breathing light, so the control of the light is completed through the microcontroller. In some embodiments, this microcontroller that directly controls the breathing light is also called the breathing light control module. In these embodiments, after obtaining the breathing light control parameters, the controller 250 generates a breathing light control instruction according to them and sends the instruction to the breathing light control module; the module controls the camera breathing light according to the received instruction by controlling the light's high/low-level input.
So that a camera application can control the camera breathing light to present the lighting effect it requires, the camera control service (the CameraControl service) in the framework layer provides an interface for applications to pass in breathing light control parameters. An application that needs to control the breathing light can call this interface to pass in the parameters corresponding to the lighting effect it needs; when the interface is called, the camera control service generates a breathing light control instruction according to the passed-in parameters and sends the instruction to the breathing light control module. In this way, any camera application can control the camera breathing light to present the lighting effect it requires.
In some embodiments, the controller 250 is configured to: in response to a camera start instruction, control the camera breathing light to turn on while starting the camera, and control the light to turn off when its on-time reaches a preset duration. In the camera start scenario, a lighting effect lasting a preset duration (e.g., 2 seconds) lets the user visually perceive the camera's startup process, eliminating confusion and improving the user experience. Combined with the foregoing embodiments, in some more specific embodiments, when the camera application receives a user input triggering the camera start instruction, it can start the camera by calling the CameraService in the framework layer and, at the same time, pass in a first control parameter through the interface provided by the camera control service; after receiving the first control parameter, the camera control service generates a breathing light on instruction according to it and sends the instruction to the breathing light control module, which in response controls the camera breathing light to turn on. After the preset duration, the camera application passes in a second control parameter through the same interface; the camera control service generates a breathing light off instruction according to it and sends the instruction to the breathing light control module, which in response controls the camera breathing light to turn off.
Illustratively, the first control parameter may be "255" as mentioned in the foregoing embodiments, and the second control parameter may be "0".
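The start-up flow reduces to the application sending the first control parameter, then the second after the preset duration. A deterministic sketch of that command sequence, using the example parameter values "255" and "0"; the function name is an illustrative assumption.

```python
def camera_start_commands(preset_duration_s: float):
    """Commands the application passes to the camera control service when the
    camera starts, as (delay in seconds before sending, control parameter)."""
    first_control_param = "255"   # breathing light on, sent with camera start
    second_control_param = "0"    # breathing light off, after preset duration
    return [(0.0, first_control_param), (preset_duration_s, second_control_param)]

commands = camera_start_commands(2.0)  # e.g. a 2-second preset duration
```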
FIG. 13 is a schematic diagram of a camera breathing light control scenario according to an exemplary embodiment of the present application. As shown in FIG. 13, when the display device displays the application center interface, the user can input, by voice, key press, gesture, or any other form, an instruction to start the "魔镜" (Magic Mirror) application in the application center. In response to this instruction, the display device controller starts the "魔镜" application; after the application starts successfully, it starts the camera by calling the CameraService in the framework layer and, at the same time, passes in breathing light control parameters through the interface provided by the camera control service, so that the camera control service, together with the breathing light control module, turns on the camera breathing light according to the passed-in parameters and turns it off after the preset duration. Thus, while the "魔镜" application starts the camera, the user is shown a breathing light lighting effect lasting the preset duration. In some embodiments, the controller 250 is configured to: in response to a user input triggering an image capture instruction, control the camera breathing light to flash at a first specific frequency (for example, flash once) while controlling the camera to capture an image. In the photo-taking scenario, each press of the shutter key is then accompanied by a breathing light flash at a certain frequency, letting the user visually perceive the moment the camera captures, eliminating confusion and improving the photo-taking experience.
In other embodiments, the controller 250 is configured to: in response to a user input triggering a video recording start instruction, control the camera breathing light to flash at a second specific frequency (for example, 5 times per second) while controlling the camera to record video; and, in response to a user input triggering a video recording end instruction, control the camera breathing light to turn off while controlling the camera to end recording. In the recording scenario, from the start of recording until it ends, the breathing light flashes at a certain frequency throughout, letting the user visually perceive the recording process, eliminating confusion and improving the recording experience.
It should be noted that the user input triggering the image capture instruction includes, but is not limited to, key input, voice input, gesture input, and the like indicating that a picture should be captured. The user inputs triggering the video recording start and end instructions include, but are not limited to, key input, voice input, gesture input, and the like indicating that recording should start or end.
Combined with the foregoing embodiments, in some more specific embodiments: when the camera application receives a user input triggering the image capture instruction, it passes in a first frequency parameter through the interface provided by the camera control service; the camera control service generates a breathing light flash instruction according to the first frequency parameter and sends it to the breathing light control module, which controls the camera breathing light to flash at the first specific frequency. When the camera application receives a user input triggering the video recording start instruction, it passes in a second frequency parameter through the interface; the camera control service generates a flash instruction according to the second frequency parameter and sends it to the module, which controls the light to flash at the second specific frequency. When the camera application receives a user input triggering the video recording end instruction, it passes in the second control parameter through the interface; the camera control service generates a breathing light off instruction according to it and sends the instruction to the module, which in response controls the camera breathing light to turn off.
Illustratively, the first and second frequency parameters may each be any value in "1~5" mentioned in the above embodiments; for example, with a first frequency parameter of "1", when the camera application receives a user instruction to capture a picture, it will control the camera breathing light to flash once.
FIG. 14 is a schematic diagram of a camera breathing light control scenario according to an exemplary embodiment of the present application. As shown in FIG. 14, the foreground application interface displayed by the display device is the camera preview interface provided by the "魔镜" application. While this interface is displayed, the user can input, by voice, key, gesture, or any other form, an instruction triggering image capture; in response, the display device captures a picture through the camera and, at the same time, the camera breathing light flashes once. If the user inputs the image capture instruction again, another picture is captured and the light flashes once more. Thus, in the photo-taking scenario, each shutter press is accompanied by a breathing light flash at a certain frequency, letting the user visually perceive the moment the camera captures and improving the photo-taking experience.
FIG. 15 is a schematic diagram of a camera breathing light control scenario according to an exemplary embodiment of the present application. As shown in FIG. 15, the foreground application interface is the camera preview interface provided by the "魔镜" application. While this interface is displayed, the user can input an instruction triggering the start of video recording; in response, the display device records video through the camera, and during recording the camera breathing light flashes continuously at 5 times per second until recording ends, i.e., until the user inputs the instruction to end recording. Thus, in the recording scenario, from the start of recording until it ends, the light flashes at a certain frequency throughout, letting the user visually perceive the recording process. In some embodiments, the controller 250 is configured to: in response to receiving a call request, control the camera breathing light to flash at a specific frequency. The call request may be a voice call request or a video call request. In this way, when the call application receives a call request sent by a friend's device, it reminds the user by controlling the camera breathing light to flash at the specific frequency.
In a specific implementation, when the call application receives a call request, it passes in a specific frequency parameter through the interface provided by the camera control service; the camera control service generates a breathing light flash instruction according to the passed-in frequency parameter and sends it to the breathing light control module, which controls the camera breathing light to flash at the specific frequency.
FIG. 16 is a schematic diagram of a camera breathing light control scenario according to an exemplary embodiment of the present application. As shown in FIG. 16, the display device displays the user interface shown when a call request is received. From the moment the display device receives the call request until it rejects or accepts the request, the camera breathing light flashes continuously at the specific frequency, thereby reminding the user.
It can be seen from the above embodiments that the present application provides a display device including a camera, a camera breathing light, and a controller; the controller, connected to the camera breathing light, obtains breathing light control parameters corresponding to the lighting effect required by an application when using the camera and controls the camera breathing light according to them; the camera breathing light presents the lighting effect required by the application. In the display device of the present application, the camera is equipped with a breathing light function, and the camera breathing light can present the lighting effect required by an application while the camera works, improving the user experience. On the display device provided by the present application, any camera application can control the camera breathing light to present the lighting effect it requires as needed, thereby reminding the user and improving the user experience.
It should be understood that the scenarios in which camera applications control the camera breathing light are not limited to those listed in the embodiments of the present application; other control scenarios realized on the display device provided by the present application all fall within the protection scope of the present application. For example, when the user uses a fitness application, the application can control the camera breathing light to present different lighting effects according to factors such as the difficulty, amplitude, and intensity of the fitness movements.
An embodiment of the present application further provides a method for controlling a camera breathing light; the method includes but is not limited to being applied to the display device provided by the embodiments of the present application. As shown in FIG. 17, the method may include:
Step 121: obtaining breathing light control parameters, the breathing light control parameters corresponding to the lighting effect required by an application when using the camera;
Step 122: controlling the camera breathing light according to the breathing light control parameters, so that the camera breathing light presents the lighting effect required by the application.
An embodiment of the present application further provides a method for controlling a camera breathing light; the method includes but is not limited to being applied to the display device provided by the embodiments of the present application, and may include: in response to a camera start instruction, controlling the camera breathing light to turn on while starting the camera; and controlling the camera breathing light to turn off when its on-time reaches a preset duration.
An embodiment of the present application further provides a method for controlling a camera breathing light; the method includes but is not limited to being applied to the display device provided by the embodiments of the present application, and may include: in response to a user input triggering an image capture instruction, controlling the camera breathing light to flash at a first specific frequency while controlling the camera to capture an image; in response to a user input triggering a video recording start instruction, controlling the camera breathing light to flash at a second specific frequency while controlling the camera to record video; and, in response to a user input triggering a video recording end instruction, controlling the camera breathing light to turn off while controlling the camera to end video recording.
An embodiment of the present application further provides a method for controlling a camera breathing light; the method includes but is not limited to being applied to the display device provided by the embodiments of the present application, and may include: in response to receiving a call request, controlling the camera breathing light to flash at a specific frequency; and, in response to receiving a user input triggering acceptance or rejection of the call request, controlling the camera breathing light to stop flashing.
It should be understood that the methods for controlling a camera breathing light provided by the embodiments of the present application may include all of the steps that the display device controller of the present application is configured to perform; for related details, refer to the description in the display device embodiments, which is not repeated here.
In a specific implementation, the present application further provides a computer storage medium that can store a program; when executed, the program can perform some or all of the steps in the embodiments of the method for a peripheral device's color to follow the color change of a picture provided by the present application. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
Those skilled in the art can clearly understand that the technology in the embodiments of the present application can be implemented by means of software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions in the embodiments of the present application, in essence or in the parts contributing over the prior art, can be embodied in the form of a software product stored in a storage medium such as ROM/RAM, a magnetic disk, or an optical disc, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods described in the various embodiments or in some parts of the embodiments.
For the same or similar parts among the various embodiments in this specification, refer to each other. In particular, since the embodiments of the method for a peripheral device's color to follow the color change of a picture are basically similar to the display device embodiments, their description is relatively brief; for related details, refer to the description in the display device embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments or equivalently replace some or all of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
For convenience of explanation, the above description has been made in conjunction with specific embodiments. However, the above exemplary discussion is not intended to be exhaustive or to limit the embodiments to the specific forms disclosed. Many modifications and variations are possible in light of the above teachings. The embodiments were chosen and described in order to better explain the principles and practical applications, so that those skilled in the art can better use the described embodiments as well as various modified embodiments suited to the particular use contemplated.

Claims (10)

  1. A display device, characterized by comprising:
    a display configured to display a user picture;
    peripheral devices configured to present different colors; and
    a controller connected to the display and the peripheral devices, the controller being configured to:
    divide the user picture into multiple color extraction regions and establish a binding relationship between each peripheral device and each color extraction region, the number of peripheral devices being the same as the number of color extraction regions;
    obtain a color feature value of the picture content presented in each color extraction region;
    convert the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region; and
    based on the binding relationship, send the RGB mean value of each color extraction region to the corresponding peripheral device, the peripheral device presenting the color corresponding to the RGB mean value.
  2. The display device according to claim 1, wherein, in dividing the user picture into multiple color extraction regions, the controller is further configured to:
    obtain the user picture presented by the display; and
    divide the user picture into regions according to a preset division rule to obtain multiple color extraction regions, the color extraction regions not overlapping each other.
  3. The display device according to claim 1, wherein, in obtaining the color feature value of the picture content presented in each color extraction region, the controller is further configured to:
    obtain the color histogram of the picture content presented in each color extraction region; and
    extract the corresponding red pixel sum value, green pixel sum value, and blue pixel sum value from each color histogram as the color feature value corresponding to each color extraction region.
  4. The display device according to claim 1, wherein, in obtaining the color feature value of the picture content presented in each color extraction region, the controller is further configured to:
    capture the complete picture presented in the user picture;
    divide the complete picture according to the division rule of the color extraction regions and determine the partial picture content corresponding to each color extraction region;
    obtain the color histogram of the partial picture content presented in each color extraction region; and
    extract the corresponding red pixel sum value, green pixel sum value, and blue pixel sum value from each color histogram as the color feature value corresponding to each color extraction region.
  5. The display device according to any one of claims 1, 3, and 4, wherein the color feature value includes a red pixel sum value, a green pixel sum value, and a blue pixel sum value; and,
    in converting the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region, the controller is further configured to:
    calculate the pixel area of each color extraction region; and
    calculate the red pixel mean value from the red pixel sum value and the pixel area, calculate the green pixel mean value from the green pixel sum value and the pixel area, and calculate the blue pixel mean value from the blue pixel sum value and the pixel area, taking the red, green, and blue pixel mean values as the RGB mean value in the RGB color space.
  6. A method for a peripheral device's color to follow the color change of a picture, characterized in that the method comprises:
    dividing a user picture presented on a display into multiple color extraction regions and establishing a binding relationship between each peripheral device and each color extraction region, the number of peripheral devices being the same as the number of color extraction regions;
    obtaining a color feature value of the picture content presented in each color extraction region;
    converting the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region; and
    based on the binding relationship, sending the RGB mean value of each color extraction region to the corresponding peripheral device, the peripheral device presenting the color corresponding to the RGB mean value.
  7. The method according to claim 6, wherein dividing the user picture into multiple color extraction regions comprises:
    obtaining the user picture presented by the display; and
    dividing the user picture into regions according to a preset division rule to obtain multiple color extraction regions, the color extraction regions not overlapping each other.
  8. The method according to claim 6, wherein obtaining the color feature value of the picture content presented in each color extraction region comprises:
    obtaining the color histogram of the picture content presented in each color extraction region; and
    extracting the corresponding red pixel sum value, green pixel sum value, and blue pixel sum value from each color histogram as the color feature value corresponding to each color extraction region.
  9. The method according to claim 6, wherein obtaining the color feature value of the picture content presented in each color extraction region comprises:
    capturing the complete picture presented in the user picture;
    dividing the complete picture according to the division rule of the color extraction regions and determining the partial picture content corresponding to each color extraction region;
    obtaining the color histogram of the partial picture content presented in each color extraction region; and
    extracting the corresponding red pixel sum value, green pixel sum value, and blue pixel sum value from each color histogram as the color feature value corresponding to each color extraction region.
  10. The method according to any one of claims 6, 8, and 9, wherein the color feature value includes a red pixel sum value, a green pixel sum value, and a blue pixel sum value; and,
    converting the corresponding color feature value into an RGB mean value in the RGB color space based on the pixel area of each color extraction region comprises:
    calculating the pixel area of each color extraction region; and
    calculating the red pixel mean value from the red pixel sum value and the pixel area, calculating the green pixel mean value from the green pixel sum value and the pixel area, and calculating the blue pixel mean value from the blue pixel sum value and the pixel area, taking the red, green, and blue pixel mean values as the RGB mean value in the RGB color space.
PCT/CN2021/093438 2020-09-17 2021-05-12 A display method and display device WO2022057286A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202010982475.0A CN112118468A (zh) 2020-09-17 2020-09-17 Method for a peripheral device's color to follow picture color change, and display device
CN202010982475.0 2020-09-17
CN202011049293.4A CN112188098A (zh) 2020-09-29 2020-09-29 Display device and camera breathing light control method
CN202011049293.4 2020-09-29

Publications (1)

Publication Number Publication Date
WO2022057286A1 true WO2022057286A1 (zh) 2022-03-24

Family

ID=80777614

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/093438 WO2022057286A1 (zh) 2020-09-17 2021-05-12 一种显示方法及显示设备

Country Status (1)

Country Link
WO (1) WO2022057286A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115243086A (zh) * 2022-06-24 2022-10-25 深圳市新龙鹏科技有限公司 Audio and video ambient light synchronous control method, apparatus, device, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110076297A (ko) * 2009-12-29 2011-07-06 (주)오픈테크놀러지 Multimedia-content-based lighting control system
CN103324293A (zh) * 2013-07-16 2013-09-25 锤子科技(北京)有限公司 Display control method and apparatus for the display interface of a mobile terminal
CN103699373A (zh) * 2013-11-29 2014-04-02 小米科技有限责任公司 Interface color display method, apparatus, and system
CN106296673A (zh) * 2016-08-03 2017-01-04 深圳微服机器人科技有限公司 Method and system for presenting a user interface through LED lights
CN107025087A (zh) * 2017-03-16 2017-08-08 青岛海信电器股份有限公司 Image display method and device
US20170311408A1 (en) * 2014-11-11 2017-10-26 Novomatic Ag Display device and method for operating such a display device
CN112118468A (zh) * 2020-09-17 2020-12-22 海信视像科技股份有限公司 Method for a peripheral device's color to follow picture color change, and display device

Similar Documents

Publication Publication Date Title
US10989993B2 (en) Control device for correcting projection image, projection system, method of controlling same, and storage medium
CN111050199B (zh) 显示设备及显示设备蓝牙通信资源的调度方法
WO2021082569A1 (zh) 拍摄画面的补光方法、智能电视及计算机可读存储介质
CN113630655B (zh) 一种外设设备颜色随画面颜色变化的方法及显示设备
CN111163274A (zh) 一种视频录制方法及显示设备
CN112118468A (zh) 一种外设设备颜色跟随画面颜色变化的方法及显示设备
WO2022252660A1 (zh) 一种视频拍摄方法及电子设备
US20160349948A1 (en) Display system, information processing apparatus, computer readable recording medium, and power source control method
CN112073798B (zh) 一种数据传输方法及设备
WO2023155529A1 (zh) 显示设备、智能家居系统及用于显示设备的多屏控制方法
US20160353096A1 (en) Display device and image quality setting method
CN113094142A (zh) 页面显示方法及显示设备
WO2022057286A1 (zh) 一种显示方法及显示设备
CN111279680A (zh) 一种方形裁剪拍照方法、拍照系统及拍照装置
CN112929592A (zh) 一种视频通话方法、显示设备及服务器
EP4228241A1 (en) Capturing method and terminal device
CN113453069B (zh) 一种显示设备及缩略图生成方法
CN114915833A (zh) 一种显示器控制方法及显示设备、终端设备
WO2021088889A1 (zh) 显示系统、显示方法及计算设备
WO2020248682A1 (zh) 一种显示设备及虚拟场景生成方法
CN112399071B (zh) 一种摄像头马达的控制方法、装置及显示设备
CN112905008A (zh) 一种手势调节显示图像方法及显示设备
US20240146996A1 (en) Display device and control method therefor
JP2020079895A (ja) プロジェクタ、投影システム、プロジェクタの制御方法、投影方法、及び、プログラム
WO2022105410A1 (zh) 一种显示设备及其设备参数的记忆方法、恢复方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21868124

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21868124

Country of ref document: EP

Kind code of ref document: A1