CN112269553A - Display system, display method and computing device - Google Patents


Info

Publication number
CN112269553A
CN112269553A
Authority
CN
China
Prior art keywords
instruction
display
target object
information
associated information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011217873.XA
Other languages
Chinese (zh)
Other versions
CN112269553B (en)
Inventor
唐甜甜
肖纪臣
刘显荣
陈许
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Laser Display Co Ltd
Original Assignee
Qingdao Hisense Laser Display Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Laser Display Co Ltd filed Critical Qingdao Hisense Laser Display Co Ltd
Publication of CN112269553A publication Critical patent/CN112269553A/en
Application granted granted Critical
Publication of CN112269553B publication Critical patent/CN112269553B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42203Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone

Abstract

The embodiments of the application provide a display system, a display method and a computing device, wherein the display system comprises: a multimedia controller, a laser projection device, and a micro-projection device. The multimedia controller is configured to: receive a picture search instruction; in response to the picture search instruction, determine the image currently displayed by the laser projection device; acquire a recognition result of the image; and send a first instruction to the micro-projection device, the first instruction instructing the micro-projection device to display the recognition result. The micro-projection device is configured to: receive the first instruction from the multimedia controller, and display the recognition result in response to the first instruction. With this display system, the user can view the picture displayed by the laser projection device without occlusion while also viewing the picture search result, which greatly improves the user experience.

Description

Display system, display method and computing device
This application claims priority to Chinese patent application No. 201911067968.5, filed on November 4, 2019 and entitled "Display device, display method and computing device", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to an intelligent display device technology, and in particular, to a display system, a display method, and a computing device.
Background
With the continuous development of television technology and internet technology, laser projection devices based on the internet have appeared. The laser projection equipment is based on the internet technology, is provided with an open operating system and a chip, is provided with an open application platform, and can support various functions such as audio and video, entertainment, data and the like, thereby meeting the diversified demands of users and bringing brand-new use experience to the users. In the aspect of user interaction, the laser projection equipment can support a voice interaction mode, and a user can control the laser projection equipment through voice.
In the prior art, a laser projection device includes a display. When a user presses a picture search key on the remote controller of the laser projection device, the device captures the image currently shown on the display and sends it to a cloud server, which recognizes the people, items and other objects in the image. The cloud server returns the recognition results to the laser projection device, which displays them on the display. After the user selects a recognition result, the laser projection device obtains the associated information of that result from the cloud server and displays it on the display. When the laser projection device displays the recognition results and the associated information, they occlude the main picture currently shown on the display. This can prevent the user from watching the main picture normally, resulting in a poor user experience.
Disclosure of Invention
The embodiments of the application provide a display system, a display method and a computing device, which improve the viewing experience when watching television. The technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a display system, including:
the display device comprises a first display, a second display, a first controller and a second controller.
The first controller configured to: receiving a picture search instruction input by a user, responding to the picture search instruction, intercepting the image currently displayed by the first display, acquiring the identification result of the image, and sending a first instruction to the second controller, wherein the first instruction is used for indicating the second controller to display the identification result.
The second controller configured to: displaying the recognition result on the second display in response to the first instruction.
Further, the second controller is configured to: receiving a viewing instruction input by a user, wherein the viewing instruction is used for indicating to view the associated information of the target object in the recognition result; and responding to the viewing instruction, acquiring the associated information of the target object, and displaying the associated information of the target object on the second display.
Further, the second controller is configured to: when the target object is a person, first associated information of the person is obtained, and the first associated information is displayed on the second display, wherein the first associated information comprises introduction information of the person.
Further, the second controller is configured to: and when the target object is a channel, acquiring second associated information of the channel, and displaying the second associated information on the second display, wherein the second associated information comprises program information and a program forecast which are currently played by the channel.
Further, the second controller is configured to: and when the target object is an article, acquiring third related information of the article, and displaying the third related information on the second display, wherein the third related information comprises the same-style commodity information and purchase link information of the article.
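The dispatch the second controller performs for the three target-object types can be sketched as follows. This is an illustrative assumption: the function name, the type strings, and the returned fields are not specified by the patent, which only defines which associated information belongs to each target type.

```python
def associated_info(target_type: str, name: str) -> dict:
    """Return the associated information to show on the second display."""
    if target_type == "person":
        # First associated information: introduction information of the person.
        return {"kind": "first", "introduction": f"Introduction of {name}"}
    if target_type == "channel":
        # Second associated information: current program and program forecast.
        return {"kind": "second",
                "now_playing": f"Program currently playing on {name}",
                "program_forecast": f"Upcoming programs on {name}"}
    if target_type == "item":
        # Third associated information: same-style goods and purchase links
        # (the URL below is a placeholder, not a real endpoint).
        return {"kind": "third",
                "same_style_goods": [f"{name} (same style)"],
                "purchase_links": [f"https://shop.example/{name}"]}
    raise ValueError(f"unknown target type: {target_type}")
```

For example, selecting a person named "Alice" in the recognition result would yield the first associated information with her introduction, while selecting an item yields same-style goods and purchase links.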
In a second aspect, an embodiment of the present application provides a display method, including:
receiving a first instruction, wherein the first instruction is sent by a first controller after the first controller receives a picture search instruction input by a user, intercepts the image currently displayed by a first display, and acquires the recognition result of the image; the first instruction is used for instructing display of the recognition result.
Displaying the recognition result on a second display in response to the first instruction.
Further, the method further comprises: receiving a viewing instruction input by a user, wherein the viewing instruction is used for indicating to view the associated information of the target object in the recognition result; and responding to the viewing instruction, acquiring the associated information of the target object, and displaying the associated information of the target object on the second display.
Further, the acquiring the associated information of the target object and displaying the associated information of the target object on the second display includes: when the target object is a person, first associated information of the person is obtained, and the first associated information is displayed on the second display, wherein the first associated information comprises introduction information of the person.
Further, the acquiring the associated information of the target object and displaying the associated information of the target object on the second display includes: and when the target object is a channel, acquiring second associated information of the channel, and displaying the second associated information on the second display, wherein the second associated information comprises program information and a program forecast which are currently played by the channel.
Further, the acquiring the associated information of the target object and displaying the associated information of the target object on the second display includes: and when the target object is an article, acquiring third related information of the article, and displaying the third related information on the second display, wherein the third related information comprises the same-style commodity information and purchase link information of the article.
In a third aspect, an embodiment of the present application provides a display system, including:
a multimedia controller, a laser projection device, and a micro-projection device.
Wherein the multimedia controller is configured to: receiving a picture search instruction, responding to the picture search instruction, determining an image currently displayed by the laser projection equipment, acquiring an identification result of the image, and sending a first instruction to the micro-projection equipment, wherein the first instruction is used for instructing the micro-projection equipment to display the identification result.
The micro-projection device configured to: receiving a first instruction from the multimedia controller, and displaying the identification result in response to the first instruction.
Further, the micro-projection device is configured to: determining a first operation, wherein the first operation is used for selecting a target object in the recognition result; in response to the first operation, sending a second instruction to the multimedia controller; the second instruction is used for indicating the target object; receiving a third instruction from the multimedia controller, wherein the third instruction is used for indicating the associated information of the target object; and responding to the third instruction, and displaying the associated information of the target object.
Further, the multimedia controller is configured to: receiving a second instruction from the micro-projection device; responding to the second instruction, and sending a fourth instruction to a cloud server, wherein the fourth instruction is used for instructing the cloud server to search for the associated information of the target object; receiving a fifth instruction from the cloud server, wherein the fifth instruction is used for indicating the associated information of the target object; and determining the third instruction according to the fifth instruction, and sending the third instruction to the micro-projection device.
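The relay of the second, third, fourth and fifth instructions described above can be sketched as message passing between three components. All class and method names here are assumptions for illustration; the patent only defines the direction and purpose of each instruction.

```python
class CloudServer:
    def search(self, fourth_instruction):
        # The fourth instruction asks the cloud server to search for the
        # target's associated information; the fifth instruction carries it.
        target = fourth_instruction["target"]
        return {"type": "fifth", "info": f"associated info for {target}"}

class MultimediaController:
    def __init__(self, cloud):
        self.cloud = cloud

    def on_second_instruction(self, second_instruction):
        # Fourth instruction: forward the search request to the cloud server.
        fourth = {"type": "fourth", "target": second_instruction["target"]}
        fifth = self.cloud.search(fourth)
        # Third instruction: relay the associated information back.
        return {"type": "third", "info": fifth["info"]}

class MicroProjector:
    def __init__(self, controller):
        self.controller = controller
        self.screen = None

    def select_target(self, target):
        # First operation: the user selects a target object in the
        # recognition result, which triggers the second instruction.
        second = {"type": "second", "target": target}
        third = self.controller.on_second_instruction(second)
        self.screen = third["info"]  # display the associated information
```

Under this sketch, selecting a target on the micro-projection device round-trips through the multimedia controller and cloud server before the associated information appears on its display.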
Further, when the target object is a person, the associated information of the target object includes introduction information of the person; when the target object is a channel, the associated information of the target object includes at least one of the following: program information currently played by the channel, and a program forecast of the channel; when the target object is an article, the associated information of the target object includes at least one of the following: same-style commodity information of the article, and purchase link information for the same-style commodity of the article.
In a fourth aspect, an embodiment of the present application provides a display method, including:
a graph search instruction is received.
And responding to the image searching instruction, determining the image currently displayed by the laser projection equipment, and acquiring the identification result of the image.
And sending a first instruction to a micro-projection device, wherein the first instruction is used for instructing the micro-projection device to display the identification result.
Further, the method further comprises: receiving a second instruction from the micro-projection device; responding to the second instruction, and sending a fourth instruction to a cloud server, wherein the fourth instruction is used for instructing the cloud server to search for the associated information of a target object, the target object being a selected object in the recognition result of the image; receiving a fifth instruction from the cloud server, wherein the fifth instruction is used for indicating the associated information of the target object; and determining a third instruction according to the fifth instruction, and sending the third instruction to the micro-projection device.
In a fifth aspect, an embodiment of the present application provides a display method, including:
receiving a first instruction from the multimedia controller; the first instruction is used for instructing the micro-projection equipment to display an identification result; and the identification result is the identification result of the image currently displayed by the laser projection equipment, which is determined by the multimedia controller according to the image searching instruction.
And responding to the first instruction, and displaying the recognition result.
Further, the method further comprises: determining a first operation, wherein the first operation is used for selecting a target object in the recognition result; in response to the first operation, sending a second instruction to the multimedia controller; the second instruction is used for indicating the target object; receiving a third instruction from the multimedia controller, wherein the third instruction is used for indicating the associated information of the target object; and responding to the third instruction, and displaying the associated information of the target object.
In a sixth aspect, an embodiment of the present application provides a computing device, including:
a memory for storing program instructions.
And the processor is used for calling the program instructions stored in the memory and executing the method of the second aspect according to the obtained program.
In a seventh aspect, an embodiment of the present application provides a computing device, including:
a memory for storing program instructions.
And the processor is used for calling the program instructions stored in the memory and executing the method of the fourth aspect according to the obtained program.
In an eighth aspect, an embodiment of the present application provides a computing device, including:
a memory for storing program instructions.
And the processor is used for calling the program instructions stored in the memory and executing the method of the fifth aspect according to the obtained program.
Based on the above technical scheme, when a user watches a video program on the laser projection device and uses the picture search function, the multimedia controller displays the recognition result of the image currently shown by the laser projection device on the micro-projection device. The recognition result therefore does not cover the picture displayed by the laser projection device, so the user can view that picture without occlusion while also viewing the picture search result, which greatly improves the user experience.
Drawings
To more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of a display system provided in an embodiment of the present application;
fig. 2 is an interaction flow among a multimedia controller, a laser projection device, and a micro-projection device according to an embodiment of the present disclosure;
fig. 3 is a schematic interface diagram of a front surface of a laser projection apparatus and a front surface of a micro-projection apparatus according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a rear surface of a laser projection apparatus and a rear surface of a micro-projection apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic view of an application scenario in which a display system interacts with a control device and a server according to an embodiment of the present application;
fig. 6 is a block diagram of a configuration of a control device 400 according to an embodiment of the present application;
fig. 7 is a system architecture diagram of a display system according to an embodiment of the present application;
fig. 8 is a schematic hardware structure diagram of a laser projection apparatus provided in an embodiment of the present application;
fig. 9 is a schematic application layer diagram of a laser projection apparatus according to an embodiment of the present disclosure;
figs. 10-14 are schematic diagrams of user interfaces interacting with a user in a display system according to an exemplary embodiment;
fig. 15 is a flowchart of a display method according to an embodiment of the present application;
fig. 16 is a flowchart of another display method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following descriptions of the concepts are only for the purpose of facilitating understanding of the contents of the present application, and do not represent limitations on the scope of the present application.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the various embodiments of the present application refers to a component of an electronic device (e.g., a multimedia controller, a laser projection television, or a micro-projection device as disclosed herein) that can wirelessly control the electronic device, typically over a relatively short range. The component is generally connected to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as wireless fidelity (Wi-Fi), Universal Serial Bus (USB), Bluetooth, and motion sensors. For example, a hand-held touch remote controller replaces most of the physical built-in hard keys of a conventional remote control device with a user interface on a touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
As shown in fig. 1, a display system 10 provided in the embodiment of the present application includes: a multimedia controller 100, a laser projection device 200, and a micro-projection device 300.
Multimedia controller 100 is communicatively coupled to laser projection device 200 and micro-projection device 300, respectively.
The multimedia controller 100 is configured to analyze and process multimedia information, classify the different types of multimedia information, and determine whether each type of multimedia information is to be displayed by the laser projection device 200 or the micro-projection device 300. After splitting the multimedia information, the multimedia controller 100 sends the multimedia information to be displayed by the laser projection device 200 to the laser projection device 200, and sends the multimedia information to be displayed by the micro-projection device 300 to the micro-projection device 300.
The laser projection device 200 comprises a third controller 201 and a third display 202. The third controller 201 is configured to receive multimedia information sent by the multimedia controller 100, and drive and control the third display 202 to display the multimedia information. The third display 202 is for displaying multimedia information in response to control driving of the third controller 201.
The micro-projection device 300 includes a fourth controller 301 and a fourth display 302. The fourth controller 301 is configured to receive multimedia information sent by the multimedia controller 100, and drive and control the fourth display 302 to display the multimedia information. The fourth display 302 is used to display multimedia information in response to the control driving of the fourth controller 301.
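The topology described above can be sketched as follows: each projection device is a controller/display pair, and the multimedia controller routes each piece of multimedia information to one of them. The class names and the `route()` rule are illustrative assumptions, not taken from the patent.

```python
class ProjectionDevice:
    """A controller/display pair: the controller receives multimedia
    information and drives the display to show it."""
    def __init__(self, name):
        self.name = name
        self.display = []          # what the display currently shows

    def receive(self, info):
        self.display.append(info)  # controller drives the display

class MultimediaController:
    def __init__(self, laser, micro):
        self.laser = laser         # laser projection device 200
        self.micro = micro         # micro-projection device 300

    def route(self, info):
        # Video-type information goes to the laser projection device;
        # everything else goes to the micro-projection device.
        if info.get("video"):
            self.laser.receive(info)
        else:
            self.micro.receive(info)
```

For example, routing a television program lands it on the laser projection device's display, while a weather notification lands on the micro-projection device's display, leaving the main picture unobstructed.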
Wherein the third display 202 and the fourth display 302 may be used to display different display screens. For example, the third display 202 may be used to display a screen for a television program and the fourth display 302 may be used to display a screen for notification-like messages, voice assistants, and the like.
Alternatively, the content displayed by the third display 202 and the content displayed by the fourth display 302 may be independent of each other and not affected by each other. For example, while the third display 202 is playing a television program, the fourth display 302 may display information such as time, weather, temperature, reminder messages, etc. that are not related to the television program.
Optionally, there may also be an association between the content displayed by the third display 202 and the content displayed by the fourth display 302. For example, when the third display 202 plays the main screen of a video chat, the fourth display 302 may display information such as the head portrait, the chat duration, and the like of the user currently accessing the video chat.
Optionally, some or all of the content displayed by the fourth display 302 may be adjusted to be displayed by the third display 202. For example, the time, weather, temperature, reminder messages, etc. displayed on the fourth display 302 may be moved to the third display 202, while other information remains on the fourth display 302.
In addition, the third display 202 can display a multi-party interactive picture while displaying a traditional television program picture, without the multi-party interactive picture blocking the traditional television program picture. The present application does not limit how the traditional television program picture and the multi-party interactive picture are displayed. For example, their positions and sizes can be set according to their priorities.
Taking the case where the priority of the traditional television program picture is higher than that of the multi-party interactive picture as an example, the area of the traditional television program picture is larger than that of the multi-party interactive picture, and the multi-party interactive picture can be positioned at one side of the traditional television program picture or float at one corner of the traditional television program picture.
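A minimal layout sketch under the assumption stated above: the higher-priority picture takes the full screen, and the lower-priority picture floats at a corner. The quarter-size ratio and the bottom-right corner placement are illustrative choices, not taken from the patent.

```python
def layout(screen_w, screen_h, tv_priority=1, chat_priority=0):
    """Return (tv_rect, chat_rect) as (x, y, w, h) tuples."""
    # Higher priority gets the full-screen main picture.
    if tv_priority >= chat_priority:
        big, small = "tv", "chat"
    else:
        big, small = "chat", "tv"
    big_rect = (0, 0, screen_w, screen_h)
    # Lower priority floats as a quarter-size window in the bottom-right corner.
    small_w, small_h = screen_w // 4, screen_h // 4
    small_rect = (screen_w - small_w, screen_h - small_h, small_w, small_h)
    rects = {big: big_rect, small: small_rect}
    return rects["tv"], rects["chat"]
```

On a 1920x1080 screen with the default priorities, the television picture fills the screen and the interactive picture occupies a 480x270 floating window at (1440, 810).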
The third display 202 and the fourth display 302 referred to in this application are projection display devices, and the specific display device types, sizes, resolutions, etc. of the third display 202 and the fourth display 302 are not limited, and those skilled in the art will understand that the third display 202 and the fourth display 302 may have some changes in performance and configuration as needed.
As shown in fig. 1, a camera may be connected or disposed on the third display 202, and is used for presenting a picture taken by the camera on a display interface of a laser projection device, a micro-projection device, or other display devices, so as to implement an interactive chat between users. Specifically, the picture shot by the camera may be displayed on the laser projection device in a full screen, a half screen, or any selectable area.
As an optional connection mode, the camera is connected with the rear shell of the laser projection device through the connecting plate, and is fixedly installed in the middle of the upper side of the rear shell of the laser projection device.
As another optional connection mode, the camera is connected to the rear housing of the laser projection device through a connection board or another suitable connector. The camera can be raised and lowered, with a lifting motor built into the connector: when the user or an application needs the camera, it rises out of the laser projection device, and when it is not needed, it retracts into the rear housing. This protects the camera from damage and protects the user's privacy.
As an embodiment, the camera adopted in the present application may have 16 million pixels, so as to achieve ultra-high-definition display. In actual use, cameras with more or fewer than 16 million pixels may also be used.
After the camera is installed on the laser projection equipment, the contents displayed by the laser projection equipment in different application scenes can be fused in various different modes, so that the function which cannot be realized by the traditional laser projection equipment is achieved.
Illustratively, a user may conduct a video chat with at least one other user while watching a video program. The presentation of the video program may be as a background frame over which a window for video chat is displayed. The function is called 'chat while watching'.
Optionally, in a scene of "chat while watching", at least one video chat is performed across terminals while watching a live video or a network video.
In another example, a user can conduct a video chat with at least one other user while entering the educational application for learning. For example, a student may interact remotely with a teacher while learning content in an educational application. Vividly, this function can be called "chatting while learning".
In another example, a user conducts a video chat with a player entering a card game while playing the game. For example, a player may enable remote interaction with other players when entering a gaming application to participate in a game. Figuratively, this function may be referred to as "watch while playing".
Optionally, the game scene is fused with the video picture: the portrait in the video picture is matted out and displayed in the game picture, improving the user experience.
Optionally, in motion-sensing games (such as ball games, boxing, running and dancing), the camera captures human posture and motion, performs limb detection and tracking, and detects key points of the human skeleton. These data are then fused with the animation in the game, realizing games in scenes such as sports and dancing.
In another example, a user may interact with at least one other user in a karaoke application in video and voice. Vividly, this function can be called "sing while watching". Optionally, when at least one user enters the application in a chat scenario, multiple users may jointly complete recording of a song.
In another example, a user may turn on the camera locally to take pictures and record videos. Figuratively, this function may be called "looking into the mirror".
In other examples, more or fewer functions may be provided. The present application does not particularly limit the functions of the laser projection device.
It should be noted that fig. 1 only illustrates the camera being disposed on the housing of the third display; in a specific implementation, the position of the camera may be determined according to actual requirements. For example, the camera may be disposed on a housing of the third controller, a housing of the third display, a housing of the fourth controller, or a housing of the fourth display, or may be disposed independently, which is not limited in this application.
In one specific implementation, as shown in fig. 2, the interaction flow among the multimedia controller 100, the laser projection device 200, and the micro-projection device 300 includes the following steps:
S201, the multimedia controller acquires multimedia information.
In one possible implementation, the multimedia information includes at least one of: video information, text information, system notification information, system push information, window information corresponding to a one-key image search function, streaming media information, and video call information.
S202, the multimedia controller classifies the multimedia information and determines first information and second information.
Wherein the first information is information displayed by the laser projection device. The second information is information played by the micro-projection device.
In one example, the first information is video-type information. Such as live television programs, video-on-demand network programs, live video network programs, etc.
The second information is the information in the multimedia information other than video-type information, for example, weather, time, text news, system notification information, system push information, window information corresponding to the one-key image search function, streaming media information, video call information, and the like.
S203, the multimedia controller sends the first information to the third controller. Accordingly, the third controller receives the first information from the multimedia controller.
S204, the third controller controls and drives the third display to display the first information.
S205, the multimedia controller sends the second information to the fourth controller. Accordingly, the fourth controller receives the second information from the multimedia controller.
S206, the fourth controller controls and drives the fourth display to display the second information.
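The classification-and-dispatch flow of S201-S206 can be sketched as follows. This is a minimal illustration only; the type tags, the `classify`/`dispatch` names, and the send callbacks are hypothetical rather than taken from the patent:

```python
# Hypothetical sketch of S201-S206: the multimedia controller classifies
# incoming multimedia information and routes it to the two displays.

VIDEO_TYPES = {"live_tv", "network_vod", "network_live"}  # assumed tags

def classify(items):
    """S202: split multimedia information into first information (video-type,
    for the laser projection device) and second information (everything
    else, for the micro-projection device)."""
    first = [it for it in items if it["type"] in VIDEO_TYPES]
    second = [it for it in items if it["type"] not in VIDEO_TYPES]
    return first, second

def dispatch(items, send_to_third, send_to_fourth):
    """S203-S206: send each class of information to its controller."""
    first, second = classify(items)
    for it in first:
        send_to_third(it)   # third controller drives the third display
    for it in second:
        send_to_fourth(it)  # fourth controller drives the fourth display
```

The two-way split mirrors S202; any richer classification (e.g. per-window routing) would extend `classify` without changing the dispatch step.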
Referring to fig. 1, as shown in fig. 3, the multimedia controller 100 and the third controller 201 may be provided in the laser projection apparatus host 203. The laser projection device host is typically placed directly in front of the third display 202.
The fourth display 302 may be disposed at a side of the third display 202, for example, at any position around the third display 202, such as above, below, to the left of, or to the right of it.
It should be noted that the display system described herein may include a plurality of micro-projection devices. For example, the display system includes two micro-projection devices respectively located at left and right sides of the laser projection device.
The present application only uses one micro-projection device as an example for description, and when the display system includes a plurality of micro-projection devices, the specific implementation manner is similar to that when the display system includes one micro-projection device, which is not described herein again.
Referring to fig. 1, as shown in fig. 4, a micro-projection device host 303 may be disposed at the back of the display screen 202 of the laser projection device.
As shown in fig. 5, fig. 5 is a schematic diagram of an application scenario in which the display system 10 in the embodiment of the present application interacts with the control device 400 and the server 500.
The control device 400 may be a remote controller 400A, which may communicate with the multimedia controller 100 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-range communication, and is used to control the multimedia controller 100 in a wireless manner or other wired manners. The user may input user commands through keys on the remote controller 400A, voice input, control panel input, and the like to control the multimedia controller 100. For example, the user may input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power on/off key on the remote controller 400A to control the functions of the multimedia controller 100.
The control device 400 may also be an intelligent device, such as a mobile terminal 400B, a tablet computer, a notebook computer, etc., which may communicate with the multimedia controller 100 through a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), or other networks, and implement control of the multimedia controller 100 through an application program corresponding to the multimedia controller 100. Multimedia controller 100 is controlled, for example, using an application running on a smart device. The application may provide various controls to the User through an intuitive User Interface (UI) on a screen associated with the smart device.
As an example, the mobile terminal 400B and the multimedia controller 100 may each be installed with a software application, so that the connection communication between the two can be realized through a network communication protocol, thereby achieving the purpose of one-to-one control operation and data communication. Such as: a control command protocol may be established between mobile terminal 400B and multimedia controller 100, synchronizing a remote control keypad to mobile terminal 400B, and controlling the functionality of multimedia controller 100 by controlling a user interface on mobile terminal 400B; the audio and video contents displayed on the mobile terminal 400B may also be transmitted to the multimedia controller 100, so as to implement a synchronous display function.
The server 500 may be a video server, an Electronic Program Guide (EPG) server, a cloud server, or the like.
Multimedia controller 100 may be in data communication with server 500 via a variety of communication means. In various embodiments of the present application, multimedia controller 100 may be permitted to make a wired or wireless communication connection with server 500 via a local area network, wireless local area network, or other network. Server 500 may provide various content and interactions to multimedia controller 100.
Illustratively, the multimedia controller 100 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through EPG interactions. The server 500 may be one group or multiple groups of servers, and may include one or more types of servers. The server 500 also provides other web service content such as video on demand and advertisement services.
Fig. 6 is a block diagram schematically showing the configuration of the control device 400 according to the exemplary embodiment. As shown in fig. 6, the control device 400 includes a controller 410, a communicator 430, a user input/output interface 440, a memory 490, and a power supply 480.
The control device 400 is configured to control the multimedia controller 100: it receives input operation commands from the user, converts the operation commands into commands that the multimedia controller 100 can recognize and respond to, and mediates the interaction between the user and the multimedia controller 100. For example, the multimedia controller 100 responds to a channel up/down operation when the user operates the channel up/down keys on the control device 400.
In some embodiments, the control device 400 may be a smart device. For example, the control device 400 may install various applications that control the multimedia controller 100 according to user requirements.
In some embodiments, the mobile terminal 400B or another intelligent electronic device may perform a function similar to that of the control device 400 after an application that operates the multimedia controller 100 is installed. For example, by installing applications, the user may use various function keys or virtual buttons of a graphical user interface available on the mobile terminal 400B or the other intelligent electronic device to implement the functions of the physical keys of the control device 400.
The controller 410 includes a processor 412, a RAM 413, a ROM 414, a communication interface, and a communication bus. The controller 410 is used to control the operation of the control device 400, the communication and coordination among its internal components, and the external and internal data processing functions.
The communicator 430 enables communication of control signals and data signals with the multimedia controller 100 under the control of the controller 410. Such as: the received user input signal is transmitted to multimedia controller 100. The communicator 430 may include at least one of a WIFI module 431, a bluetooth module 432, a Near Field Communication (NFC) module 433, and the like.
A user input/output interface 440, wherein the input interface includes at least one of a microphone 441, a touch pad 442, a sensor 443, keys 444, a camera 445, and the like. For example, the user may implement a user command input function through voice, touch, gesture, pressing, and the like; the input interface converts a received analog signal into a digital signal, converts the digital signal into a corresponding command signal, and sends it to the multimedia controller 100.
The output interface includes an interface that transmits the received user command to the multimedia controller 100. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, a user input command needs to be converted into an infrared control signal according to an infrared control protocol and sent to the multimedia controller 100 through an infrared sending module. For another example, when the radio frequency signal interface is used, a user input command needs to be converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then sent to the multimedia controller 100 through a radio frequency sending terminal.
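The infrared path described above can be illustrated with a small sketch. The patent does not specify a concrete infrared control protocol; the NEC-style frame layout (address, inverted address, command, inverted command) and the key codes below are assumptions for illustration only:

```python
# Hypothetical sketch of the infrared output interface: a user key press is
# converted into an infrared control frame before transmission to the
# multimedia controller. The NEC-style layout and key codes are assumed.

KEY_CODES = {"volume_up": 0x10, "volume_down": 0x11, "channel_up": 0x20}

def encode_ir_frame(key, address=0x00):
    """Build a 4-byte NEC-style frame: address, ~address, command, ~command.
    The inverted bytes let the receiver detect corrupted frames."""
    cmd = KEY_CODES[key]
    return bytes([address, address ^ 0xFF, cmd, cmd ^ 0xFF])
```

A radio-frequency interface would replace this encoding step with digitization plus modulation per the RF control signal modulation protocol, as described above.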
In some embodiments, the control device 400 includes at least one of a communicator 430 and an output interface. The control device 400 is configured with a communicator 430, such as: the modules of WIFI, bluetooth, NFC, etc. may encode the user input command via a WIFI protocol, or a bluetooth protocol, or an NFC protocol, and send the encoded command to multimedia controller 100.
A memory 490 for storing various operation programs, data and applications for driving and controlling the control apparatus 400 under the control of the controller 410. The memory 490 may store various control signal commands input by a user.
And a power supply 480 for providing operation power support for each electrical component of the control device 400 under the control of the controller 410. The power supply 480 may be implemented using a battery and associated control circuitry.
Next, the system architecture of the display system according to the present application is further described with reference to fig. 7, and it should be noted that fig. 7 is only an exemplary illustration of the system architecture of the display system according to the present application, and does not represent a limitation to the present application. In actual implementation, the display system may contain more or less hardware or interfaces as desired.
As shown in fig. 7, a system architecture diagram of a display system provided in the embodiment of the present application includes:
the system comprises a first multimedia unit 701, a first projection display control unit 702, a first control unit 703, an eye protection plate control unit 704, a first imaging projection display unit 705, a second multimedia unit 706, a second control unit 707, a second projection display control unit 708, a second imaging projection display unit 709, a power supply module 710, a light source driving unit 711 and an audio processing unit 712.
Corresponding to fig. 1, the multimedia control unit 100 comprises a first multimedia unit 701.
The laser projection apparatus 200 includes: a first projection display control unit 702, a first control unit 703, an eye-protection plate control unit 704, and a first imaging projection display unit 705.
The micro-projection device 300 includes: a second multimedia unit 706, a second control unit 707, a second projection display control unit 708, and a second imaging projection display unit 709.
The power module 710, the light source driving unit 711, the audio processing unit 712, and the like may be integrated in the host of the laser projection apparatus 200 or the host of the micro-projection apparatus 300.
The functions of the unit modules are described in detail below:
the first multimedia unit 701 is configured to receive an external input signal and send the corresponding input signal to the first projection display control unit 702, the first control unit 703, and/or the second multimedia unit 706, respectively. The first multimedia unit 701 may specifically include at least one of the following modules, and may receive the external input signal through at least one of them.
In addition, the first multimedia unit 701 is further configured to communicate with the first control unit 703 through an integrated circuit bus (I2C); to communicate with the second multimedia unit 706 through I2C communication, serial communication, USB communication, or wireless communication; and to supply the VB1 video signal to the first projection display control unit 702.
The first projection display control unit 702 is configured to send a Low-Voltage Differential Signaling (LVDS) or High-Speed Serial Interface (HSSI) video signal and a control signal to the first imaging projection display unit 705, so as to drive and control the first imaging projection display unit 705 to display a video picture and to control its operating state. The first projection display control unit 702 is also configured to provide a Pulse Width Modulation (PWM) signal and a Duty signal to the light source driving unit 711, and to communicate with the first control unit 703 through I2C.
The first control unit 703 is configured to control a working state of a heat dissipation device of the laser projection apparatus; monitoring an ambient temperature and a laser temperature; controlling the rotating speed of the speckle eliminating wheel; and controlling the power-on and power-off of the light source driving unit 711. The first control unit 703 is further configured to perform I2C communication or serial port communication with the eye protection plate.
And an eye protection plate control unit 704 for controlling the working mode of the eye protection plate of the laser projection device.
The first imaging projection display unit 705 includes a lens portion of the laser projection device, or an optical system composed of a light valve, an illumination lens and a projection lens of the laser projection device. The first imaging projection display unit 705 is configured to receive the LVDS or HSSI video signal or the like from the first projection display control unit 702 and project the video signal onto the third display.
The second multimedia unit 706 is configured to receive the multimedia signal from the first multimedia unit 701 and send the corresponding multimedia signal to the second control unit 707 and the second projection display control unit 708. The second multimedia unit 706 is also configured to communicate I2C with the second control unit and to send LVDS video signals to the second projection display control unit 708.
A second control unit 707, configured to control the operating state of the heat dissipation device of the micro-projection device; to monitor the ambient temperature and the laser temperature; and to perform I2C communication with the second projection display control unit 708.
A second projection display control unit 708 for sending the LVDS video signal and the control signal to the second imaging projection display 709.
And a second imaging projection display unit 709, including a lens portion of the micro-projection device, or an optical system composed of a light valve, an illumination lens, and a projection lens of the micro-projection device. The second imaging projection display unit 709 is configured to receive the LVDS video signal from the second projection display control unit 708 and project the video signal onto the fourth display.
And a power supply module 710 for supplying power to the display system. For example, the power module 710 may provide 12V power for the first control unit; providing a 12V power supply for the first projection display control unit; an 18V power supply is provided for the audio processing unit 712 and the first multimedia unit.
The light source driving unit 711 is used to provide an energy source for the laser.
The lasers include a blue laser 716, a green laser 717, and a red laser 718, which provide laser light in the three primary colors. The three lasers provide the laser light sources for the laser projection device.
Fig. 8 is a schematic diagram illustrating a hardware structure of a laser projection apparatus provided in an embodiment of the present application, and as shown in fig. 8, the laser projection apparatus includes: a Television (TV) board 810, a display panel 820, a light source 830, and a light source driving circuit 840.
Hereinafter, each device related to fig. 8 will be described in detail:
the TV board 810 is mainly used to receive and decode external audio and video signals. The TV board 810 is provided with a System on Chip (SoC) that can decode data of different data formats into a normalized format and transmit the data in the normalized format to the display panel 820 through, for example, a connector.
The display panel 820 may be provided with a Field Programmable Gate Array (FPGA) 821, a control processing module 822, and an optical modulation device 823.
The FPGA 821 is used to process the input video image signal, for example, to perform Motion Estimation and Motion Compensation (MEMC) frame-rate multiplication processing, or to implement image enhancement functions such as image correction.
The control processing module 822 is connected to the FPGA 821 and is used to receive the processed video image signal data as the image data to be projected. The control processing module 822 outputs a current PWM brightness adjustment signal and an enable control signal according to the image data to be projected, and implements the timing and lighting control of the light source 830 through the light source driving circuit 840.
The optical modulation device 823 may receive the video image signal output by the TV board 810 and parse the regional brightness signal and the image components of the video image. Alternatively, the optical modulation device 823 may receive the image signal to be projected output by the FPGA 821, where the image signal to be projected may include the parsed image brightness signal and image components.
The light source 830 includes a red light source, a blue light source and a green light source, and the light sources of the three colors can emit light simultaneously or in a time sequence. The light source 830 is driven to light up according to the timing of image display indicated by the control instruction of the control processing module 822.
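The brightness-driven control described for the control processing module 822 and the light source driving circuit 840 can be illustrated as follows. The linear mapping from average pixel brightness to a PWM duty percentage is an assumed simplification for illustration, not the patent's actual algorithm:

```python
# Hypothetical sketch of the PWM brightness adjustment signal: derive a duty
# cycle from the average brightness of the image data to be projected, which
# the light source driving circuit would use to drive the light source.

def pwm_duty(frame, max_duty=100):
    """Map average pixel brightness (0-255) to a PWM duty percentage.
    `frame` is a flat sequence of 8-bit luminance samples."""
    avg = sum(frame) / len(frame)
    return round(avg / 255 * max_duty)
```

A real implementation would compute this per display region and per color, in step with the image-display timing that the control processing module indicates to the light source 830.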
As shown in fig. 9, the application layer of the laser projection device contains various applications that may be executed at the laser projection device 200.
The application layer 1912 of the laser projection device 200 may include, but is not limited to, one or more applications such as: live television applications, video-on-demand applications, media center applications, application centers, gaming applications, and the like.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the laser projection device 200.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from a server side of the cloud storage, from a local hard disk storage containing stored video programs.
The media center application program can provide various applications for playing multimedia contents. For example, a media center, which may be other than live television or video on demand, may provide services that a user may access to various images or audio through a media center application.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on a display device. The application center may obtain these applications from different sources, store them in local storage, and then be operational on the laser projection device 200.
In the embodiment of the present application, a user may input an image search instruction to instruct the laser projection device to perform an image search. The image search instruction may be input by the user pressing an image search key on a remote controller of the multimedia controller, or by the user uttering a specific voice command such as "please search for a picture". In the embodiment of the present application, an image search means that the multimedia controller, according to the image search instruction input by the user, intercepts the image currently displayed by the third display and sends the image to the cloud server for identification; the cloud server identifies the persons, television channel identifiers, articles, and the like in the image; the multimedia controller displays the information identified by the cloud server on the fourth display; and after the user selects a certain identification result, the associated information of that identification result is displayed on the fourth display. The following embodiments describe in detail the interfaces of the third display and the fourth display, and the interaction and processing among the multimedia controller, the third display, the fourth controller, the fourth display, and the cloud server after the multimedia controller receives the image search instruction.
Fig. 10-14 illustrate schematic diagrams of a multimedia controller, a laser projection device, and a user interface of a micro-projection device interacting with a user according to an exemplary embodiment.
As shown in fig. 10, the third display is currently playing a program of the television channel "AATV". At this time, if the user inputs an image search instruction, for example, presses the image search key on the remote controller of the multimedia controller or utters the voice "please search for a picture", the multimedia controller displays the interface illustrated in fig. 11. As shown in fig. 11, the fourth display displays the recognition results of the intercepted image, including the faces of two persons, one television channel, and three articles. Optionally, if many articles are recognized and they cannot all be displayed on the fourth display, part of the articles may be displayed first, together with a moving icon, for example, the leftward-moving icon shown in fig. 11; after the user selects and confirms this icon, the remaining recognition results are displayed in sequence on the fourth display. When the user selects and confirms one recognition result, the fourth display may display the associated information of the recognition result selected by the user, as illustrated in figs. 12, 13, and 14. As illustrated in fig. 12, when the user selects a face icon on the fourth display of fig. 11, the associated information of the person corresponding to the face, such as the person's name, occupation, year and month of birth, native place, and representative works, is displayed on the fourth display. As illustrated in fig. 13, when the user selects the icon of a television channel on the fourth display of fig. 11, the name of the television channel, the name of the program currently being shown, a preview of the programs for one or more future time slots, and the like are displayed on the fourth display. As illustrated in fig. 14, when the user selects the icon of an article on the fourth display of fig. 11, icons of same-style items of that article are displayed on the fourth display, together with a purchase link (i.e., the two-dimensional code in fig. 14) for the currently selected same-style item. The user can purchase the same-style goods by scanning the code. In addition to the manner of displaying same-style item information illustrated in fig. 14, in another manner, at the stage illustrated in fig. 11, that is, the stage of displaying the recognition results, the user pressing a specific remote controller key or uttering a specific voice may trigger the multimedia controller to display, on the fourth display, the same-style items of all the articles in the recognition results together with their purchase links.
In the fourth display illustrated in figs. 12 to 14, when the associated information cannot be completely displayed, part of the associated information may be displayed first; after the user selects and confirms the moving icon on the fourth display, the remaining associated information is displayed in sequence.
Fig. 15 is a flowchart of a display method provided in an embodiment of the present application, where the display method may be applied to the display system shown in fig. 1, and as shown in fig. 15, the display method includes:
S1501, the multimedia controller receives an image search instruction.
In a possible implementation manner, the image search instruction is sent to the multimedia controller by the control device.
Specifically, when the user is interested in a person or commodity appearing in a video while watching a video program played by the laser projection device, the user may send an image search instruction to the multimedia controller by using the image search function on the control device.
In one example, the user presses the image search key on the remote controller of the multimedia controller; after detecting the pressing operation, the remote controller generates an image search instruction and sends it to the multimedia controller.
In another example, the user inputs the content "image search" by voice through the voice control function of the remote controller; after detecting the voice input, the remote controller generates an image search instruction and sends it to the multimedia controller.
S1502, in response to the image search instruction, the multimedia controller determines the image currently displayed by the laser projection device and acquires the recognition result of the image.
The image currently displayed by the laser projection equipment is a screenshot of a video interface currently played by the laser projection equipment. The recognition result of the image includes a person, a television channel, an article, a building, an animal, a plant, and the like in the image.
S1503, the multimedia controller sends a first instruction to the micro-projection device. Accordingly, the micro-projection device receives a first instruction from the multimedia controller.
The first instruction is used for instructing the micro-projection equipment to display the identification result.
S1504, in response to the first instruction, the micro-projection device displays the recognition result.
In one possible implementation, the micro-projection device may display the respective recognition results in the form of icons. The micro-projection device may sort the various types of recognition results and display them in a preset order.
In another possible implementation manner, the micro-projection device directly displays the image, and the image is subjected to region segmentation according to each recognition result. The user selects the corresponding recognition result by selecting the area corresponding to the recognition result.
Based on the above technical solution, when the user uses the image search function while watching a video program on the laser projection device, the multimedia controller can display the recognition result of the image currently displayed by the laser projection device on the micro-projection device. In this way, the recognition result does not cover the picture displayed by the laser projection device, so that the user can view the picture displayed by the laser projection device without occlusion while viewing the image search result, which greatly improves the user experience.
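One possible form of the icon display in S1504 (sorting recognition results by category and paging them behind the moving icon of fig. 11) can be sketched as follows. The category order and page size are assumptions for illustration; the patent only states that results are sorted and displayed in a preset order:

```python
# Hypothetical sketch of S1504: sort recognition results by a preset
# category order, then split them into pages; pages after the first are
# reached via the moving icon described for fig. 11.

CATEGORY_ORDER = ["person", "tv_channel", "item"]  # assumed preset order

def order_results(results, page_size=6):
    """Return a list of pages, each a list of results in preset order.
    Python's sort is stable, so results within a category keep their
    original relative order."""
    ranked = sorted(results, key=lambda r: CATEGORY_ORDER.index(r["category"]))
    return [ranked[i:i + page_size] for i in range(0, len(ranked), page_size)]
```

The alternative implementation described above (region-segmented image) would skip the sorting step and instead map each selectable region directly to a recognition result.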
In a possible implementation manner, referring to fig. 15, as shown in fig. 16, the above S1502 may be specifically implemented through the following S1502a-S1502d.
S1502a, in response to the image search instruction, the multimedia controller determines the image currently displayed by the laser projection device.
In one possible implementation, the image displayed by the laser projection device is transmitted to the laser projection device by the multimedia controller. Therefore, after receiving the image search instruction, the multimedia controller determines the image that it has sent to the laser projection device at the current moment and takes that image as the image currently displayed by the laser projection device.
In another possible implementation manner, the laser projection device has a screen capture function: after receiving the image search instruction, the multimedia controller sends it to the laser projection device, and the laser projection device, upon receiving the instruction, captures the currently displayed image and sends it to the multimedia controller.
S1502b, the multimedia controller sends the currently displayed image of the laser projection device to the cloud server. Accordingly, the cloud server receives images from the multimedia controller.
S1502c, the cloud server searches for the image and determines an image recognition result.
Specifically, after receiving the image, the cloud server performs face recognition and article recognition on the image, and determines a recognition result of the image.
S1502d, the cloud server sends the image recognition result to the multimedia controller. Correspondingly, the multimedia controller receives the image recognition result from the cloud server.
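The round trip of S1502a-S1502d can be condensed into a small sketch; `capture_screen` and `recognize` are hypothetical stand-ins for the screenshot mechanism and the cloud server's face/article recognition, neither of which is named in the patent:

```python
# Hypothetical sketch of S1502a-S1502d: determine the current image, send it
# to the cloud server, and receive the recognition result back.

def handle_image_search(capture_screen, recognize):
    image = capture_screen()   # S1502a: image currently displayed
    result = recognize(image)  # S1502b/S1502c: upload and recognize
    return result              # S1502d: recognition result to the controller
```

Passing the two steps in as callables keeps the sketch neutral between the two implementations of S1502a described above (controller-side copy of the sent image vs. device-side screen capture).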
In another possible implementation, with reference to fig. 15 and as shown in fig. 16, after S1504 the method provided in the embodiment of the present application further includes the following S1505-S1512, which are described below.
S1505, the micro-projection device determines a first operation.
The first operation is used for selecting a target object in the recognition result.
The first operation may be determined directly by the micro-projection device, or determined by the multimedia controller and forwarded to the micro-projection device.
The first operation may be input by the user pressing a key on the control device, or input by the user's voice.
It should be noted that the target object may be one object in the recognition result, or a plurality of objects in the recognition result.
S1506, in response to the first operation, the micro-projection device sends a second instruction to the multimedia controller. Correspondingly, the multimedia controller receives the second instruction from the micro-projection device.
The second instruction is used for indicating the target object.
S1507, in response to the second instruction, the multimedia controller sends a fourth instruction to the cloud server. Correspondingly, the cloud server receives the fourth instruction from the multimedia controller.
The fourth instruction is used for instructing the cloud server to search for the associated information of the target object.
The associated information may have different meanings depending on the category of the target object in the recognition result.
In a first example, if the target object in the recognition result is a person, the associated information may be the person's name, occupation, date of birth, birthplace, representative works, and the like.
In a second example, if the target object in the recognition result is a television channel, the associated information may be the name of the television channel, the program being played, a program guide, and the like.
In a third example, if the target object in the recognition result is an item, the associated information may be same-style product information of the item, a purchase link, and the like.
In a fourth example, if the target object in the recognition result is a building, the associated information may be the location, features, and the like of the building.
In a fifth example, if the target object in the recognition result is a plant, the associated information may be the name, variety, introduction, and the like of the plant.
In a sixth example, if the target object in the recognition result is an animal, the associated information may be the name, habitat, habits, and the like of the animal.
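The category-dependent associated information in the six examples above can be captured with a simple dispatch table; the field names below are assumptions chosen to mirror the examples, and `lookup` is a hypothetical backend query.

```python
# Fields of associated information per target-object category, following
# the six examples above; the field names are illustrative assumptions.
ASSOCIATED_FIELDS = {
    "person": ["name", "occupation", "birth_date", "birthplace",
               "representative_works"],
    "channel": ["channel_name", "current_program", "program_guide"],
    "item": ["same_style_products", "purchase_link"],
    "building": ["location", "features"],
    "plant": ["name", "variety", "introduction"],
    "animal": ["name", "habitat", "habits"],
}


def associated_info(category, lookup):
    """Return the associated information for a recognized target object.

    `lookup` maps a field name to a value (a stand-in for the cloud
    server's search); only fields defined for the category are requested.
    """
    fields = ASSOCIATED_FIELDS.get(category, [])
    return {field: lookup(field) for field in fields}
```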
S1508, the cloud server determines the associated information of the target object and generates a fifth instruction according to the associated information of the target object.
The fifth instruction is used for indicating the associated information of the target object.
In a possible implementation, the fourth instruction sent by the multimedia controller to the cloud server includes an identifier (for example, an icon) of each target object. The cloud server determines each target object according to its identifier and searches for the associated information of each target object.
S1509, the cloud server sends the fifth instruction to the multimedia controller. Correspondingly, the multimedia controller receives the fifth instruction from the cloud server.
S1510, the multimedia controller generates a third instruction according to the fifth instruction.
S1511, the multimedia controller sends the third instruction to the micro-projection device. Correspondingly, the micro-projection device receives the third instruction from the multimedia controller.
S1512, in response to the third instruction, the micro-projection device displays the associated information of the target object.
Specifically, when the target object is a person, the micro-projection device acquires first associated information of the person and displays it, where the first associated information includes introduction information of the person.
When the target object is a channel, the micro-projection device acquires second associated information of the channel and displays it, where the second associated information includes the program information currently played by the channel and a program guide.
When the target object is an item, the micro-projection device acquires third associated information of the item and displays it, where the third associated information includes same-style product information and purchase link information of the item.
After the micro-projection device displays the associated information of the target object, the user can further select which associated information to view.
For example, the user may select one item at a time or several items at a time. If one item is selected, the micro-projection device displays the same-style product information and purchase link information of that item. If several items are selected, the micro-projection device displays the same-style product information and purchase link information of each of those items.
For example, as illustrated in fig. 14, the user may select the icon of an item and confirm, to indicate viewing the same-style product information and purchase link information of that one item, or press a specific key (for example, the left key) on the remote controller of the multimedia controller, to indicate viewing the same-style product information and purchase link information of several items.
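Resolving the user's selection (S1505-S1506) amounts to filtering the recognition result by the chosen identifiers. The data shapes in this sketch are illustrative assumptions.

```python
def selected_targets(recognition_result, selected_ids):
    """Sketch of S1505-S1506: resolve the user's first operation into target
    objects. `selected_ids` holds the identifiers the user chose (via remote
    keys or voice); one object or several may be selected at once."""
    return [obj for obj in recognition_result["objects"]
            if obj["id"] in selected_ids]
```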
Based on the above technical solution, after the micro-projection device displays the target objects, it can further display the associated information of each target object according to the received instruction. The user can thus obtain more information about a target object while normally watching the picture of the laser projection device, which further improves the user experience.
Optionally, in the above interaction flow, information among the multimedia controller, the laser projection device, and the micro-projection device may be transmitted through a serial port.
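If the instructions travel over a serial port, each one needs some framing so the receiver can find message boundaries. The length-prefixed JSON format below is purely an illustrative assumption, not the protocol described in the patent.

```python
import json
import struct


def frame_instruction(instruction):
    """Pack an instruction dict into a length-prefixed byte frame for a
    serial link. The wire format (4-byte big-endian length + UTF-8 JSON)
    is an illustrative assumption."""
    payload = json.dumps(instruction).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload


def unframe_instruction(frame):
    """Inverse of frame_instruction: recover the instruction dict."""
    (length,) = struct.unpack(">I", frame[:4])
    return json.loads(frame[4:4 + length].decode("utf-8"))
```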
Optionally, an embodiment of the present application further provides a storage medium storing instructions that, when run on a computer, cause the computer to execute the method in the above method embodiments.
Optionally, an embodiment of the present application further provides a chip for executing instructions, where the chip is configured to execute the method in the above method embodiments.
An embodiment of the present application further provides a program product, where the program product includes a computer program stored in a storage medium; at least one processor may read the computer program from the storage medium, and when the at least one processor executes the computer program, the method in the above method embodiments is implemented.
In the embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship; in a formula, the character "/" indicates that the associated objects before and after it are in a "division" relationship. "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for convenience of description and distinction and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the embodiments of the present application, the sequence numbers of the above processes do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (19)

1. A display system, comprising:
a first display, a second display, a first controller, and a second controller;
the first controller is configured to: receive a picture search instruction input by a user; and in response to the picture search instruction, capture an image currently displayed by the first display, acquire a recognition result of the image, and send a first instruction to the second controller, wherein the first instruction is used for instructing the second controller to display the recognition result;
the second controller is configured to: display the recognition result on the second display in response to the first instruction.
2. The display system of claim 1, wherein the second controller is configured to:
receive a viewing instruction input by a user, wherein the viewing instruction is used for indicating to view associated information of a target object in the recognition result;
and in response to the viewing instruction, acquire the associated information of the target object, and display the associated information of the target object on the second display.
3. The display system of claim 2, wherein the second controller is configured to:
when the target object is a person, acquire first associated information of the person, and display the first associated information on the second display, wherein the first associated information comprises introduction information of the person.
4. The display system of claim 2, wherein the second controller is configured to:
when the target object is a channel, acquire second associated information of the channel, and display the second associated information on the second display, wherein the second associated information comprises program information currently played by the channel and a program guide of the channel.
5. The display system of claim 2, wherein the second controller is configured to:
when the target object is an item, acquire third associated information of the item, and display the third associated information on the second display, wherein the third associated information comprises same-style product information and purchase link information of the item.
6. A display method, comprising:
receiving a first instruction, wherein the first instruction is sent by a first controller after the first controller receives a picture search instruction input by a user, captures an image currently displayed by a first display, and acquires a recognition result of the image, and the first instruction is used for indicating to display the recognition result;
displaying the recognition result on a second display in response to the first instruction.
7. The method of claim 6, further comprising:
receiving a viewing instruction input by a user, wherein the viewing instruction is used for indicating to view associated information of a target object in the recognition result;
and in response to the viewing instruction, acquiring the associated information of the target object, and displaying the associated information of the target object on the second display.
8. The method of claim 7, wherein the obtaining the associated information of the target object and displaying the associated information of the target object on the second display comprises:
when the target object is a person, acquiring first associated information of the person, and displaying the first associated information on the second display, wherein the first associated information comprises introduction information of the person.
9. The method of claim 7, wherein the obtaining the associated information of the target object and displaying the associated information of the target object on the second display comprises:
when the target object is a channel, acquiring second associated information of the channel, and displaying the second associated information on the second display, wherein the second associated information comprises program information currently played by the channel and a program guide of the channel.
10. The method of claim 7, wherein the obtaining the associated information of the target object and displaying the associated information of the target object on the second display comprises:
when the target object is an item, acquiring third associated information of the item, and displaying the third associated information on the second display, wherein the third associated information comprises same-style product information and purchase link information of the item.
11. A display system, comprising: a multimedia controller, a laser projection device, and a micro-projection device;
wherein the multimedia controller is configured to: receive a picture search instruction; in response to the picture search instruction, determine an image currently displayed by the laser projection device, acquire a recognition result of the image, and send a first instruction to the micro-projection device, wherein the first instruction is used for instructing the micro-projection device to display the recognition result;
the micro-projection device is configured to: receive the first instruction from the multimedia controller, and display the recognition result in response to the first instruction.
12. The system of claim 11, wherein the micro-projection device is configured to:
determine a first operation, wherein the first operation is used for selecting a target object in the recognition result;
in response to the first operation, send a second instruction to the multimedia controller, wherein the second instruction is used for indicating the target object;
receive a third instruction from the multimedia controller, wherein the third instruction is used for indicating associated information of the target object;
and in response to the third instruction, display the associated information of the target object.
13. The system of claim 12, wherein the multimedia controller is configured to:
receive the second instruction from the micro-projection device;
in response to the second instruction, send a fourth instruction to a cloud server, wherein the fourth instruction is used for instructing the cloud server to search for the associated information of the target object;
receive a fifth instruction from the cloud server, wherein the fifth instruction is used for indicating the associated information of the target object;
and determine the third instruction according to the fifth instruction, and send the third instruction to the micro-projection device.
14. The system according to claim 12 or 13, wherein when the target object is a person, the associated information of the target object comprises introduction information of the person;
when the target object is a channel, the associated information of the target object comprises at least one of the following: program information currently played by the channel, and a program guide of the channel;
when the target object is an item, the associated information of the target object comprises at least one of the following: same-style product information of the item, and purchase link information for the same-style products of the item.
15. A display method, for use in a multimedia controller as claimed in any one of claims 11 to 14, the method comprising:
receiving a picture search instruction;
in response to the picture search instruction, determining an image currently displayed by the laser projection device, and acquiring a recognition result of the image;
and sending a first instruction to the micro-projection device, wherein the first instruction is used for instructing the micro-projection device to display the recognition result.
16. The method of claim 15, further comprising:
receiving a second instruction from the micro-projection device;
in response to the second instruction, sending a fourth instruction to a cloud server, wherein the fourth instruction is used for instructing the cloud server to search for associated information of a target object, and the target object is an object selected from the recognition result of the image;
receiving a fifth instruction from the cloud server, wherein the fifth instruction is used for indicating the associated information of the target object;
and determining a third instruction according to the fifth instruction, and sending the third instruction to the micro-projection device.
17. A display method applied to the micro-projection device according to any one of claims 11 to 14, the method comprising:
receiving a first instruction from a multimedia controller, wherein the first instruction is used for instructing the micro-projection device to display a recognition result, and the recognition result is obtained by the multimedia controller by recognizing, according to a picture search instruction, an image currently displayed by the laser projection device;
and in response to the first instruction, displaying the recognition result.
18. The method of claim 17, further comprising:
determining a first operation, wherein the first operation is used for selecting a target object in the recognition result;
in response to the first operation, sending a second instruction to the multimedia controller, wherein the second instruction is used for indicating the target object;
receiving a third instruction from the multimedia controller, wherein the third instruction is used for indicating associated information of the target object;
and in response to the third instruction, displaying the associated information of the target object.
19. A computing device, comprising:
a memory for storing program instructions;
a processor, configured to call the program instructions stored in the memory and, according to the obtained program instructions, execute the method of any one of claims 6 to 10;
or a processor, configured to call the program instructions stored in the memory and, according to the obtained program instructions, execute the method of claim 15 or 16;
or a processor, configured to call the program instructions stored in the memory and, according to the obtained program instructions, execute the method of claim 17 or 18.
CN202011217873.XA 2019-11-04 2020-11-04 Display system, display method and computing device Active CN112269553B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911067968 2019-11-04
CN2019110679685 2019-11-04

Publications (2)

Publication Number Publication Date
CN112269553A true CN112269553A (en) 2021-01-26
CN112269553B CN112269553B (en) 2023-11-07

Family

ID=74344300

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010015561.4A Pending CN112784137A (en) 2019-11-04 2020-01-07 Display device, display method and computing device
CN202011217873.XA Active CN112269553B (en) 2019-11-04 2020-11-04 Display system, display method and computing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202010015561.4A Pending CN112784137A (en) 2019-11-04 2020-01-07 Display device, display method and computing device

Country Status (2)

Country Link
CN (2) CN112784137A (en)
WO (1) WO2021088889A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139492A (en) * 2021-04-30 2021-07-20 百度在线网络技术(北京)有限公司 Article identification method, article identification device, electronic equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104769957A (en) * 2012-09-19 2015-07-08 谷歌公司 Identification and presentation of internet-accessible content associated with currently playing television programs
CN107533360A (en) * 2015-12-07 2018-01-02 华为技术有限公司 A kind of method for showing, handling and relevant apparatus
CN107851270A (en) * 2015-10-09 2018-03-27 谷歌有限责任公司 The method of the media content of advertisement, system and medium on the second screen equipment are presented on using main equipment
CN108600846A (en) * 2018-03-15 2018-09-28 聚好看科技股份有限公司 Mobile terminal and the method for facilitating the search for virtual goods information
CN109034115A (en) * 2018-08-22 2018-12-18 Oppo广东移动通信有限公司 Video knows drawing method, device, terminal and storage medium
CN109922363A (en) * 2019-03-15 2019-06-21 青岛海信电器股份有限公司 A kind of graphical user interface method and display equipment of display screen shot

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9015139B2 (en) * 2010-05-14 2015-04-21 Rovi Guides, Inc. Systems and methods for performing a search based on a media content snapshot image
KR101799443B1 (en) * 2011-05-02 2017-11-20 삼성전자주식회사 Method for surveying watching of video content, Broadcasting receiving apparatus and Server thereof
CN105763930B (en) * 2014-12-17 2019-11-12 乐金电子(中国)研究开发中心有限公司 A kind of method, intelligent TV set and set-top box pushing TV programme two dimensional code
CN107105340A (en) * 2017-03-21 2017-08-29 百度在线网络技术(北京)有限公司 People information methods, devices and systems are shown in video based on artificial intelligence
CN107480236B (en) * 2017-08-08 2021-03-26 深圳创维数字技术有限公司 Information query method, device, equipment and medium


Also Published As

Publication number Publication date
CN112269553B (en) 2023-11-07
CN112784137A (en) 2021-05-11
WO2021088889A1 (en) 2021-05-14


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant