CN112399263A - Interaction method, display device and mobile terminal - Google Patents

Interaction method, display device and mobile terminal

Info

Publication number
CN112399263A
Authority
CN
China
Prior art keywords
message
display device
interactive message
mobile terminal
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010224068.3A
Other languages
Chinese (zh)
Inventor
Wang Jintong (王金童)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Media Network Technology Co Ltd
Juhaokan Technology Co Ltd
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Media Network Technology Co Ltd filed Critical Qingdao Hisense Media Network Technology Co Ltd
Priority to CN202080024297.9A (publication CN113661715B)
Priority to PCT/CN2020/108503 (publication WO2021031940A1)
Publication of CN112399263A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/4788: Supplemental services communicating with other users, e.g. chatting
    • H04N21/4122: Peripherals receiving signals from specially adapted client devices; additional display device, e.g. video projector
    • H04N21/4307: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/4312: Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/8586: Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot, by using a URL

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present application provides an interaction method, a display device, and a mobile terminal. When the display device plays video in an auditorium service, it receives an interactive message pushed by a server, the interactive message having been sent to the server by a mobile terminal. If the interactive message was generated by the mobile terminal according to a preset message, the display device displays it on the video playing interface; if the interactive message was not generated by the mobile terminal according to a preset message, it is not displayed on the video playing interface. By combining the display device with the mobile terminal to realize social interaction on the display device, and by leveraging advantages of the mobile terminal such as its touch-screen interaction mode, the interaction method, display device, and mobile terminal improve the convenience of inputting social information through a display device.

Description

Interaction method, display device and mobile terminal
Technical Field
The present application relates to the field of internet technology, and in particular to an interaction method, a display device, and a mobile terminal.
Background
With the growth of internet applications, people's habits have changed: the purely virtual, anonymous interaction of the traditional internet can no longer satisfy users, and real-world elements are gradually being integrated into the network. People from all over the country gather online around shared school ties and friendships, common interests and hobbies, and similar professions and jobs, forming their own online communities. As a result, the habits and ways in which people use the internet are changing profoundly and are gradually influencing telecommunication networks and broadcast television networks.
In the era of three-network convergence, televisions already have interactive functions, and the interaction is no longer limited to interaction between viewers and television stations but also includes interaction between viewers and television content. Consequently, the demand for future televisions and television distribution networks with social capabilities keeps growing. Television also inherently has social attributes, and television-based interaction can largely satisfy people's needs, for example by providing a real-time online communication platform for people with common interests, so that viewers can comment on a movie and share their feelings with other audiences while watching it.
Although television technology continues to develop, interaction between the television and the user is still realized through a remote control system. When a television is used for interaction, the interactive content generally has to be input through a remote controller, and human-machine information interaction is then realized through the remote control system. A remote controller cannot input information conveniently and quickly, which restricts the timeliness of interaction, and this remote-control-based human-machine interaction constrains the development of television social features. Therefore, how to conveniently realize social interaction through a television is a technical problem that those skilled in the art urgently need to solve.
Disclosure of Invention
The present application provides an interaction method, a display device, and a mobile terminal, improving the social convenience of display devices.
In a first aspect, the present application provides an interactive method for a display device, the method comprising:
when video is being played in an auditorium service, receiving an interactive message pushed by a server, wherein the interactive message is sent to the server by a mobile terminal;
if the interactive message is generated by the mobile terminal according to a preset message, displaying the interactive message on the video playing interface;
and if the interactive message is not generated by the mobile terminal according to a preset message, not displaying the interactive message on the video playing interface.
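The display-device logic of the first aspect amounts to a filter on server-pushed messages. The sketch below is illustrative only, not the patent's implementation; the `is_preset` flag and the dictionary message structure are assumptions standing in for however the system marks messages generated from a preset message.

```python
def handle_pushed_message(message: dict, overlay: list) -> bool:
    """Show a server-pushed interactive message on the video playing
    interface only if the mobile terminal generated it from a preset
    message; otherwise leave the video overlay untouched."""
    if message.get("is_preset"):            # assumed marker for preset-generated messages
        overlay.append(message["text"])     # render on the video playing interface
        return True
    return False                            # non-preset messages are not displayed on the TV
```

For example, a preset greeting would be appended to the overlay and displayed, while a freely typed message would be silently dropped by the display device (it is still shown on mobile terminals, per the second aspect).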
In a second aspect, the present application provides an interaction method for a mobile terminal, the method including:
scanning a graphic code on a display device, wherein the graphic code is generated according to an identifier of an auditorium service when the display device plays video in the auditorium service;
loading an interactive message editing interface according to a URL address obtained by parsing the graphic code;
receiving a user's selection of a preset message to generate a first interactive message on the interactive message editing interface, wherein the first interactive message is to be displayed on the mobile terminal and the display device corresponding to the auditorium service;
and receiving a user's input of edited characters to generate a second interactive message on the interactive message editing interface, wherein the second interactive message is to be displayed only on the mobile terminal corresponding to the auditorium service and not on the display device corresponding to the auditorium service.
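The two message types of the second aspect can be sketched as a small packaging function run on the mobile terminal before upload. The field names (`room`, `text`, `is_preset`, `targets`) are illustrative assumptions; the patent does not specify a concrete message format.

```python
def build_interactive_message(content: str, from_preset: bool, room_id: str) -> dict:
    """Package an interactive message for upload to the server.

    A first interactive message (selected from presets) is displayed on
    both the mobile terminal and the display device of the auditorium
    service; a second interactive message (freely edited characters) is
    displayed only on the mobile terminal.
    """
    return {
        "room": room_id,        # identifier of the auditorium service
        "text": content,
        "is_preset": from_preset,
        "targets": ["display", "mobile"] if from_preset else ["mobile"],
    }
```

The server would then push the message to the endpoints listed in `targets`, which is how a first interactive message reaches the television screen while a second one stays on phones.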
In a third aspect, the present application provides a display device comprising:
a display configured to display a user interface and a video playing interface;
a controller communicatively connected with the display, the controller being configured to:
when video is being played in the auditorium service, receive an interactive message pushed by a server, wherein the interactive message is sent to the server by the mobile terminal;
if the interactive message is generated by the mobile terminal according to a preset message, display the interactive message on the video playing interface;
and if the interactive message is not generated by the mobile terminal according to a preset message, not display the interactive message on the video playing interface.
In a fourth aspect, the present application provides a mobile terminal, comprising:
a display configured to display a user interface and a playing screen;
a controller communicatively connected with the display, the controller being configured to:
scan a graphic code on a display device, wherein the graphic code is generated according to an identifier of an auditorium service when the display device plays video in the auditorium service;
load an interactive message editing interface according to a URL address obtained by parsing the graphic code;
receive a user's selection of a preset message to generate a first interactive message on the interactive message editing interface, wherein the first interactive message is to be displayed on the mobile terminal and the display device corresponding to the auditorium service;
and receive a user's input of edited characters to generate a second interactive message on the interactive message editing interface, wherein the second interactive message is to be displayed only on the mobile terminal corresponding to the auditorium service and not on the display device corresponding to the auditorium service.
The present application provides an interaction method, a display device, and a mobile terminal. When the display device plays video in an auditorium service, it generates a graphic code according to the identifier of the auditorium service. The mobile terminal scans the graphic code on the display device, parses it to obtain a URL address, and loads an interactive message editing interface according to the URL address. On the interactive message editing interface, a user can select a preset message to send a first interactive message, or input edited characters to send a second interactive message. The server pushes the interactive messages so that the first interactive message is displayed on both the mobile terminal and the display device corresponding to the auditorium service, while the second interactive message is displayed only on the mobile terminal corresponding to the auditorium service, making it convenient for the user to socialize through the combination of the display device and the mobile terminal. In this way, the interaction method, display device, and mobile terminal combine the display device with the mobile terminal to realize social interaction on the display device, and leverage advantages of the mobile terminal such as its touch-screen interaction mode, improving the convenience of inputting social information through a display device. Compared with television social interaction through a remote controller, the display device social interaction realized by this method is more convenient.
Drawings
In order to explain the technical solutions of the present application more clearly, the drawings needed in the embodiments are briefly described below. It is obvious that those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus in some embodiments;
fig. 2 is a block diagram illustrating a hardware configuration of the control apparatus 100 in some embodiments;
a block diagram of the hardware configuration of the display device 200 in some embodiments is illustrated in fig. 3;
a block diagram of the hardware architecture of the display device 200 of fig. 3 is exemplarily shown in fig. 4;
fig. 5 is a diagram schematically illustrating a functional configuration of the display device 200 in some embodiments;
fig. 6a schematically illustrates a software configuration in the display device 200 in some embodiments;
FIG. 6b is a schematic diagram illustrating the configuration of applications in display device 200 in some embodiments;
FIG. 7 is a schematic diagram illustrating a user interface in the display device 200 in some embodiments;
FIG. 8 illustrates a display device display interface diagram one in some embodiments;
FIG. 9 illustrates a display device display interface diagram two in some embodiments;
FIG. 10 illustrates a display device display interface diagram three in some embodiments;
FIG. 11 is a timing diagram illustrating interaction with a display device in some embodiments;
FIG. 12 illustrates a display device display interface diagram four in some embodiments;
FIG. 13 illustrates a display device display interface diagram five in some embodiments;
FIG. 14 is a diagram illustrating a mobile terminal display interface in some embodiments;
FIG. 15 is a timing diagram that illustrates an interaction method in some embodiments;
a timing diagram of another interaction method in some embodiments is illustrated in fig. 16.
Detailed Description
To make the objects, technical solutions, and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments are described clearly and completely below with reference to the drawings. Obviously, the described exemplary embodiments are only some of the embodiments of the present application, not all of them.
The present application relates to a display device including at least two systems-on-chip. For ease of understanding, a display device with a multi-chip structure is described herein.
For user convenience, display devices usually provide various external device interfaces so that different peripherals or cables can be connected to implement corresponding functions. If a high-definition camera is connected to an interface of the display device but the hardware system of the display device has no hardware interface for receiving the source code of a high-pixel camera, the data received by the camera cannot be presented on the display screen of the display device.
Furthermore, due to hardware limitations, the hardware system of a conventional display device supports only one channel of hard decoding resources and usually supports video decoding with a maximum resolution of 4K. Therefore, when a user wants to video chat while watching a network television program, the hard decoding resources (usually the GPU in the hardware system) must be used to decode the network video in order not to reduce its definition. In this case, the video chat picture can only be processed by soft decoding on a general-purpose processor (such as the CPU) in the hardware system.
Using soft decoding to process the video chat picture greatly increases the data processing burden on the CPU, and when that burden is too heavy, the picture may stutter or freeze. Moreover, limited by the CPU's data processing capability, soft decoding of the video chat picture generally cannot support multi-channel video calls, so when a user wants to video chat with multiple other users in the same chat scene, access is blocked.
In view of the above, to overcome the above drawbacks, the present application discloses a dual hardware system architecture to implement multiple channels of video chat data (including at least one channel of local video).
The concepts involved in the present application are first explained below with reference to the drawings. It should be noted that the following descriptions of these concepts are intended only to facilitate understanding of the contents of the present application, and do not limit its scope.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the present application) that can wirelessly control the electronic device, typically over a short distance. The component may typically be connected to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a hand-held touch remote controller replaces most of the physical built-in hard keys of a common remote control device with a user interface on a touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
A schematic diagram of an operation scenario between a display device and a control apparatus in some embodiments is illustrated in fig. 1. As shown in fig. 1, a user may operate the display apparatus 200 through the control device 100.
The control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-range communication, and is used to control the display device 200 in a wireless or wired manner. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller to control the display device 200.
The control apparatus 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a notebook computer, etc., which may communicate with the display device 200 through a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), or other networks, and implement control of the display device 200 through an application program corresponding to the display device 200.
For example, the mobile terminal 100B and the display device 200 may each have a software application installed thereon, so that connection communication between the two can be realized through a network communication protocol, and the purpose of one-to-one control operation and data communication can be further realized. Such as: a control instruction protocol can be established between the mobile terminal 100B and the display device 200, a remote control keyboard is synchronized to the mobile terminal 100B, and the function of controlling the display device 200 is realized by controlling a user interface on the mobile terminal 100B; the audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
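The control instruction protocol established between the mobile terminal 100B and the display device 200 is not specified in this application; as one hedged illustration, a companion app could serialize key presses as small JSON messages sent over the local network. The message schema (`type`, `key`) is purely an assumption for this sketch.

```python
import json


def make_control_instruction(key: str) -> bytes:
    """Serialize a remote-key press as a JSON control instruction that a
    companion app might send to the display device over the network.
    The schema is illustrative; the patent defines no concrete protocol."""
    return json.dumps({"type": "key_event", "key": key}).encode("utf-8")
```

The display device would decode such a message and handle it exactly as it handles the corresponding physical remote key, which is what allows the synchronized remote keyboard on the mobile terminal to control the display device.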
As shown in fig. 1, the display apparatus 200 may also perform data communication with the server 300 through various communication means. In various embodiments of the present application, the display device 200 may be communicatively coupled to the server 300 via a local area network, a wireless local area network, or another network. The server 300 may provide various contents and interactions to the display apparatus 200.
Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through Electronic Program Guide (EPG) interactions. The server 300 may be one group or multiple groups of servers, and may be of one or more types. Other web service contents, such as video on demand and advertisement services, are provided through the server 300.
The display device 200 may be a liquid crystal display, an OLED (Organic Light-Emitting Diode) display, a projection display device, or a smart TV. The specific display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the performance and configuration of the display device 200 may be changed as needed.
The display apparatus 200 may additionally provide a smart network TV function that offers computer support functions in addition to the broadcast-receiving TV function, for example, a web TV, a smart TV, an Internet Protocol TV (IPTV), and the like.
As shown in fig. 1, the display device may be connected to or provided with a camera, and is configured to present the picture taken by the camera on the display interface of this or another display device, so as to implement interactive chat between users. In some embodiments, the picture captured by the camera may be displayed on the display device in full screen, in half screen, or in any selectable area.
As an optional connection mode, the camera is connected to the display rear shell through a connection board and is fixedly installed in the middle of the upper side of the display rear shell. Alternatively, it can be fixedly installed at any position of the display rear shell, as long as the image acquisition area is not blocked by the rear shell, for example, with the image acquisition area facing the same direction as the display device.
As another alternative connection mode, the camera is connected to the display rear shell through a connection board or another suitable connector that allows the camera to be raised and lowered. The connector is provided with a lifting motor: when a user or an application wants to use the camera, it is raised above the display, and when the camera is not needed, it can be retracted into the rear shell to protect it from damage.
As an embodiment, the camera used in the present application may have 16 megapixels, to achieve ultra-high-definition display. In actual use, cameras with more or fewer than 16 megapixels may also be used.
After the camera is installed on the display device, the content displayed in different application scenarios of the display device can be fused in various ways, thereby achieving functions that traditional display devices cannot realize.
Illustratively, a user may conduct a video chat with at least one other user while watching a video program. The video program may be presented as a background frame over which the window for the video chat is displayed. Vividly, this function can be called "chat while watching".
Optionally, in a scene of "chat while watching", at least one video chat is performed across terminals while watching a live video or a network video.
In another example, a user can conduct a video chat with at least one other user while entering the educational application for learning. For example, a student may interact remotely with a teacher while learning content in an educational application. Vividly, this function can be called "chatting while learning".
In another example, while playing a card game, a user may conduct a video chat with other players in the game. For example, a player may interact remotely with other players when entering a gaming application to participate in a game. Figuratively, this function may be referred to as "watch while playing".
Optionally, the game scene is fused with the video picture: the portrait in the video picture is cut out and displayed in the game picture, improving the user experience.
Optionally, in motion-sensing games (such as ball games, boxing, running, and dancing), human postures and motions, limb detection and tracking, and human skeleton key-point data are obtained through the camera and then fused with the animations in the game, realizing games in scenarios such as sports and dancing.
In another example, a user may interact with at least one other user through video and voice in a karaoke application. Vividly, this function can be called "sing while watching". Preferably, when at least one user enters the application in a chat scenario, multiple users can jointly complete the recording of a song.
In another example, a user may turn on the camera locally to take pictures and videos; figuratively, this function may be referred to as "looking in the mirror".
In other examples, more or less functionality may be added. The function of the display device is not particularly limited in the present application.
Fig. 2 is a block diagram schematically showing the configuration of the control apparatus 100 according to the exemplary embodiment. As shown in fig. 2, the control device 100 includes at least one of a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200: it receives input operation instructions from the user and converts the operation instructions into instructions that the display device 200 can recognize and respond to, mediating the interaction between the user and the display device 200. For example, when the user operates the channel up/down keys on the control device 100, the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 100B or another intelligent electronic device may perform a function similar to that of the control apparatus 100 after installing an application for manipulating the display device 200. For example, by installing the application, the user can use various function keys or virtual buttons of the graphical user interface on the mobile terminal 100B or another intelligent electronic device to implement the functions of the physical keys of the control apparatus 100.
The controller 110 includes a processor 112, a RAM 113, a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the running and operation of the control device 100, the communication and coordination among its internal components, and external and internal data processing functions.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communicator 130 may include at least one of a WIFI module 131, a bluetooth module 132, an NFC module 133, and the like.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, it may be an infrared interface or a radio frequency (RF) interface. For example: when the infrared interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, which is then sent to the display device 200 through the infrared sending module. As another example: when the RF interface is used, the user input instruction needs to be converted into a digital signal, which is then modulated according to the RF control signal modulation protocol and transmitted to the display device 200 through the RF transmitting terminal.
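As an illustrative sketch of the infrared path described above, the following converts a user input instruction into a sequence of mark/space durations, assuming the widely used NEC pulse-distance protocol. The timing constants, address, and command code are illustrative assumptions and are not taken from the present application.

```python
# Sketch only: NEC-style infrared encoding of a user command.
# All constants are assumptions for illustration.

NEC_HDR_MARK = 9000    # leading burst, microseconds
NEC_HDR_SPACE = 4500   # header space
NEC_BIT_MARK = 562     # mark before every bit
NEC_ZERO_SPACE = 562   # space encoding a 0 bit
NEC_ONE_SPACE = 1687   # space encoding a 1 bit

def nec_encode(address: int, command: int) -> list:
    """Return alternating mark/space durations (in us) for one NEC frame:
    header, address, inverted address, command, inverted command, stop bit."""
    payload = [address, address ^ 0xFF, command, command ^ 0xFF]
    durations = [NEC_HDR_MARK, NEC_HDR_SPACE]
    for byte in payload:
        for i in range(8):                      # bits are sent LSB first
            bit = (byte >> i) & 1
            durations.append(NEC_BIT_MARK)
            durations.append(NEC_ONE_SPACE if bit else NEC_ZERO_SPACE)
    durations.append(NEC_BIT_MARK)              # final stop mark
    return durations

# e.g. a hypothetical "channel up" key code
frame = nec_encode(address=0x00, command=0x18)
```

A remote controller's infrared sending module would drive its LED with these durations, and the infrared receiver on the display device decodes them back into the command byte.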
In some embodiments, the control device 100 includes at least one of the communicator 130 and an output interface. When the communicator 130 is configured in the control device 100, for example as a WIFI, Bluetooth, or NFC module, the user input command may be sent to the display device 200 encoded by the WIFI protocol, the Bluetooth protocol, or the NFC protocol.
And a memory 190 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 190 may store various control signal commands input by a user.
And a power supply 180 for providing operational power support to the components of the control device 100 under the control of the controller 110. The power supply 180 may include a battery and its associated control circuitry.
A hardware configuration block diagram of a hardware system in the display apparatus 200 according to an exemplary embodiment is exemplarily shown in fig. 3.
When a dual hardware system architecture is adopted, the relationship between the hardware systems can be as shown in fig. 3. For convenience of description, one hardware system in the dual hardware system architecture is hereinafter referred to as the first hardware system, A-system, or A-chip, and the other hardware system as the second hardware system, N-system, or N-chip. The first hardware system comprises the controller of the A-chip and various interfaces, and the second hardware system comprises the controller of the N-chip and various interfaces. A separate operating system may be installed in each of the A-chip and the N-chip, so that there are two separate but interrelated subsystems in the display device 200.
As shown in fig. 3, the A-chip and the N-chip may be connected, communicate, and be powered through a plurality of different types of interfaces. The interfaces between the A-chip and the N-chip may include at least one of a general-purpose input/output (GPIO) interface, a USB interface, an HDMI interface, a UART interface, and the like. One or more of these interfaces may be used for communication or power transfer between the A-chip and the N-chip. For example, as shown in fig. 3, in the dual hardware system architecture, the N-chip may be powered by an external power source (power), while the A-chip may be powered by the N-chip rather than by the external power source.
In addition to the interface for connecting with the N-chip, the A-chip may further include interfaces for connecting other devices or components, such as the MIPI interface for connecting a camera (Camera) shown in fig. 3, a Bluetooth interface, and the like.
Similarly, in addition to the interface for connecting with the N-chip, the N-chip may further include a V-by-One interface for connecting with the display screen timing controller (TCON), and an I2S interface for connecting with a power amplifier (AMP) and a speaker (Speaker); as well as at least one of an IR/Key interface, a USB interface, a WiFi interface, a Bluetooth interface, an HDMI interface, a Tuner interface, and the like.
The dual hardware system architecture of the present application is further described below with reference to fig. 4. It should be noted that fig. 4 is only an exemplary illustration of the dual hardware system architecture of the present application and does not limit the present application. In practice, both hardware systems may contain more or fewer hardware components or interfaces as required.
A block diagram of the hardware architecture of the display device 200 according to fig. 3 is exemplarily shown in fig. 4. As shown in fig. 4, the display device 200 may include an a chip (second controller) and an N chip (first controller), and a module connected to the a chip or the N chip through various interfaces.
The N-chip may include at least one of a tuner demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 270, and a power supply. The N-chip may also include more or fewer modules in other embodiments.
The tuner demodulator 220 is configured to perform processing such as amplification, mixing, and resonance on a broadcast television signal received in a wired or wireless manner, so as to demodulate, from among a plurality of wireless or wired broadcast television signals, the audio/video signal carried at the frequency of the television channel selected by the user, as well as additional information (e.g., an EPG data signal). Depending on the broadcast system of the television signal, the signal path of the tuner demodulator 220 may vary, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, or internet broadcasting; depending on the modulation type, the modulation mode of the signal may be digital or analog; and depending on the type of television signal being received, the tuner demodulator 220 may demodulate analog and/or digital signals.
The tuner demodulator 220 is also operative to respond, according to the user's selection and under the control of the controller 210, to the television channel frequency selected by the user and the television signal carried by that frequency.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the external device interface 250.
The communicator 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 230 may include a WIFI module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The display apparatus 200 may establish a connection of a control signal and a data signal with an external control apparatus or a content providing apparatus through the communicator 230. For example, the communicator may receive a control signal of the remote controller 100A according to the control of the controller.
The external device interface 250 is a component that provides data transmission between the controller 210 of the N-chip and the A-chip, as well as other external devices. The external device interface may be connected with external apparatuses such as a set-top box, a game device, or a notebook computer in a wired/wireless manner, and may receive data such as a video signal (e.g., moving images), an audio signal (e.g., music), and additional information (e.g., an EPG) from the external apparatus.
The external device interface 250 may include: a High Definition Multimedia Interface (HDMI) terminal 251, a Composite Video Blanking Sync (CVBS) terminal 252, an analog or digital component terminal 253, a Universal Serial Bus (USB) terminal 254, a red, green, blue (RGB) terminal (not shown), and the like. The number and type of external device interfaces are not limited by this application.
The first controller 210 controls the operation of the display apparatus 200 and responds to the operation of the user by running various software control programs (e.g., an operating system and/or various application programs) stored on the memory 290.
As shown in fig. 4, the first controller 210 includes at least one of a random access memory RAM 214, a read-only memory ROM 213, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus. The RAM 214, the ROM 213, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected via the bus.
A ROM 213 for storing instructions for various system boots. When the display device 200 is powered on upon receipt of a power-on signal, the CPU processor 212 executes the system boot instructions in the ROM and copies the temporary data of the operating system stored in the memory 290 to the RAM 214 in order to start running the operating system. After the operating system has started, the CPU processor 212 copies the temporary data of the various application programs in the memory 290 to the RAM 214, and then starts running the various application programs.
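The two-stage start-up described above (the operating system is copied into RAM and started first, then the applications) can be sketched as follows; the data structures and names are illustrative, not the device's actual firmware.

```python
def boot(storage: dict, ram: dict) -> list:
    """Simulate the start-up sequence: copy the OS image from storage
    into RAM and start it, then copy and start each application."""
    ram["os"] = storage["os"]          # stage 1: OS temporary data -> RAM
    started = ["os"]                   # the operating system runs first
    for name, image in storage["apps"].items():
        ram[name] = image              # stage 2: application data -> RAM
        started.append(name)
    return started

storage = {"os": "kernel-image", "apps": {"browser": "app-image"}}
ram = {}
boot_order = boot(storage, ram)        # ["os", "browser"]
```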
A graphics processor 216 for generating various graphics objects, such as icons, operation menus, and graphics for displaying user input instructions. It comprises an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the various objects produced by the arithmetic unit and displays the rendered result on the display 280.
A CPU processor 212 for executing operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include a main processor and one or more sub-processors. The main processor is used to perform some operations of the display device 200 in a pre-power-up mode and/or to display a screen in the normal mode. The one or more sub-processors are used to perform operations in a standby mode or the like.
The communication interfaces may include a first interface 218-1 through an nth interface 218-n. These interfaces may be network interfaces that are connected to external devices via a network.
The first controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of the selectable objects, such as a hyperlink or an icon. Operations related to the selected object include, for example: displaying the page, document, or image linked to by a hyperlink, or launching the program corresponding to an icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display device 200, or a voice command corresponding to a voice spoken by the user.
The memory 290 includes a memory for storing various software modules for driving and controlling the display apparatus 200. Such as: various software modules stored in memory 290, including: at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The basic module is a bottom layer software module for signal communication between hardware in the display device 200 and sending processing and control signals to an upper layer module. The detection module is a management module used for collecting various information from various sensors or user input interfaces, and performing digital-to-analog conversion and analysis management.
For example, the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display 280 to display image content, and may be used to play multimedia image content, UI interfaces, and other information. The communication module is used for control and data communication with external devices. The browser module is used for performing data communication with browsing servers. The service module is a module for providing various services and various application programs.
Meanwhile, the memory 290 is also used to store received external data and user data, images of the various items in the various user interfaces, visual effect maps of the focus object, and the like.
A user input interface for transmitting an input signal of the user to the first controller 210, or for transmitting a signal output by the controller to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may send an input signal entered by the user, such as a power switch signal, a channel selection signal, or a volume adjustment signal, to the user input interface, which then forwards it to the controller. Alternatively, the control device may receive an output signal such as audio, video, or data that the controller outputs via the user input interface, and display the received output signal or output it in audio or vibration form.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 280.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used to demultiplex the input audio/video data stream. For example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal and an audio signal.
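As a hedged sketch of this demultiplexing step, the following splits an MPEG-2 transport stream into video and audio packets by packet identifier (PID); the PID values in the usage example are assumptions, and a real demultiplexer would also parse the PES payloads, which is omitted here.

```python
def demux_ts(stream: bytes, video_pid: int, audio_pid: int):
    """Split an MPEG-2 transport stream into video and audio packet lists.
    TS packets are 188 bytes long, start with sync byte 0x47, and carry a
    13-bit packet identifier (PID) in bytes 1-2."""
    video, audio = [], []
    for off in range(0, len(stream) - 187, 188):
        pkt = stream[off:off + 188]
        if pkt[0] != 0x47:                       # skip if sync is lost
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]    # extract the 13-bit PID
        if pid == video_pid:
            video.append(pkt)
        elif pid == audio_pid:
            audio.append(pkt)
    return video, audio
```

A real demultiplexer would go on to reassemble the PES packets and feed them to the video decoding and audio decoding modules.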
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used to superimpose and mix the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting a 24 Hz, 25 Hz, 30 Hz, or 60 Hz video to a frame rate of 60 Hz, 120 Hz, or 240 Hz, where the input frame rate may be related to the source video stream and the output frame rate may be related to the refresh rate of the display. Frame rate conversion is usually implemented by inserting frames into the input video stream.
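As a minimal sketch of such frame insertion, the classic 3:2 cadence converts 24 Hz material to 60 Hz by emitting alternately three and two copies of each input frame. A real display device typically interpolates new intermediate frames rather than repeating existing ones; only the cadence is shown here.

```python
def convert_24_to_60(frames: list) -> list:
    """24 -> 60 fps by 3:2 frame repetition: each pair of input frames
    yields five output frames (3 copies, then 2 copies)."""
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

second_of_video = list(range(24))      # one second of 24 Hz input frames
output = convert_24_to_60(second_of_video)   # 60 output frames
```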
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
And a display 280 for receiving the image signal input from the video processor 260-1, and for displaying video content, images, and the menu manipulation interface. The display 280 includes a display component for presenting the picture and a driving component for driving the display of images. The displayed video content may come from the video in the broadcast signal received by the tuner demodulator 220, or from video content input through the communicator or the external device interface. The display 280 also displays a user manipulation interface (UI) generated in the display device 200 and used to control the display device 200.
And, a driving component for driving the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The audio processor 260-2 is configured to receive an audio signal, decompress and decode the audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification and other audio data processing to obtain an audio signal that can be played in the speaker 272.
An audio output interface 270 for receiving the audio signal output by the audio processor 260-2 under the control of the controller 210. The audio output interface may include a speaker 272, or an external sound output terminal 274 for output to a sound-generating device of an external apparatus, such as an external sound terminal or an earphone output terminal.
In other exemplary embodiments, video processor 260-1 may comprise one or more chip components. The audio processor 260-2 may also include one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated in one or more chips with the controller 210.
And a power supply for supplying, under the control of the controller 210, power input from an external power source to support the display device 200. The power supply may include a built-in power supply circuit installed inside the display device 200, or a power supply installed outside the display device 200, such as a power supply interface in the display device 200 that provides an external power supply.
Similar to the N-chip, as shown in fig. 4, the a-chip may include a second controller 310, a communicator 330, a detector 340, and a memory 390. At least one of a user input interface, a video processor, an audio processor, a display, an audio output interface may also be included in some embodiments. In some embodiments, there may also be a power supply that independently powers the A-chip.
The communicator 330 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 330 may include a WIFI module 331, a bluetooth communication protocol module 332, a wired ethernet communication protocol module 333, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The communicator 330 of the A-chip and the communicator 230 of the N-chip also interact with each other. For example, the WiFi module 231 of the N-chip is used to connect to the external network and to communicate with external servers and the like, whereas the WiFi module 331 of the A-chip is used to connect to the WiFi module 231 of the N-chip and does not make a direct connection with the external network. Therefore, to the user, a display device as in the above embodiment presents a single WiFi account to the outside.
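The relayed network path can be sketched as below: the A-chip's WiFi module forwards every request to the N-chip, which alone holds the external connection, so only one WiFi identity is visible from the outside. The class and method names are illustrative, not the device's actual software.

```python
class NChipNetwork:
    """Stand-in for the N-chip WiFi module 231, which holds the
    actual external network connection."""
    def fetch(self, url: str) -> str:
        # a real device would perform the network request here
        return "response for " + url

class AChipNetwork:
    """Stand-in for the A-chip WiFi module 331: it connects only to the
    N-chip and relays traffic instead of reaching the network directly."""
    def __init__(self, n_chip: NChipNetwork):
        self.n_chip = n_chip
    def fetch(self, url: str) -> str:
        return self.n_chip.fetch(url)   # relay; no direct external link

a_chip = AChipNetwork(NChipNetwork())
reply = a_chip.fetch("http://example.com/epg")
```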
The detector 340 is a component of the A-chip of the display device for collecting signals from the external environment or for interacting with the outside. The detector 340 may include a light receiver 342, a sensor for collecting the intensity of ambient light, which may be used to adapt display parameters to changes in lighting, and so on. It may further include an image collector 341, such as a camera or video camera, which may be used to collect external environment scenes, to collect attributes of the user or gestures for interacting with the user, to adaptively change display parameters, and to recognize user gestures, so as to implement the function of interaction with the user.
An external device interface 350, which provides a component for data transmission between the second controller 310 and the N-chip or other external devices. The external device interface may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner.
The second controller 310 controls the operation of the display apparatus 200 and responds to the user's operation by running various software control programs stored on the memory 390 (e.g., using installed third party applications, etc.), and interacting with the N-chip.
As shown in fig. 4, the second controller 310 includes at least one of a read only memory ROM313, a random access memory RAM314, a graphic processor 316, a CPU processor 312, a communication interface 318, and a communication bus. The ROM313 and the RAM314, the graphic processor 316, the CPU processor 312, and the communication interface 318 are connected via a bus.
A ROM 313 for storing instructions for various system boots. The CPU processor 312 executes the system boot instructions in the ROM and copies the operating system stored in the memory 390 to the RAM 314 to begin running the operating system. After the operating system has started, the CPU processor 312 copies the various application programs in the memory 390 to the RAM 314, and then starts running the various application programs.
The CPU processor 312 is used for executing the operating system and application program instructions stored in the memory 390, communicating with the N chip, transmitting and interacting signals, data, instructions, etc., and executing various application programs, data and contents according to various interaction instructions received from the outside, so as to finally display and play various audio and video contents.
The communication interface 318 may include a first interface 318-1 through an nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or may be network interfaces connected to the N-chip via a network.
The second controller 310 may control the overall operation of the display device 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the second controller 310 may perform an operation related to the object selected by the user command.
A graphics processor 316 for generating various graphics objects, such as icons, operation menus, and graphics for displaying user input instructions. It comprises an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the various objects produced by the arithmetic unit and displays the rendered result on the display 280.
Both the graphics processor 316 of the A-chip and the graphics processor 216 of the N-chip are capable of generating various graphics objects. The difference is that, if application 1 is installed on the A-chip and application 2 is installed on the N-chip, then when the user inputs a command in the interface of application 1, the graphics object is generated by the graphics processor 316 of the A-chip; when the user inputs a command in the interface of application 2, the graphics object is generated by the graphics processor 216 of the N-chip.
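This routing rule can be sketched as a small dispatcher; the application names and install locations below are hypothetical.

```python
# Hypothetical install locations, mirroring the example above.
APP_HOST = {"application_1": "A", "application_2": "N"}

GPU_BY_CHIP = {"A": "graphics_processor_316", "N": "graphics_processor_216"}

def gpu_for(app_name: str) -> str:
    """Return which chip's graphics processor generates the graphics
    object for a command input in the given application's interface."""
    return GPU_BY_CHIP[APP_HOST[app_name]]
```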
Fig. 5 is a diagram schematically illustrating a functional configuration of a display device according to an exemplary embodiment.
As shown in fig. 5, the memory 390 of the a-chip and the memory 290 of the N-chip are used to store an operating system, an application program, contents, user data, and the like, respectively, and perform system operations for driving the display device 200 and various operations in response to a user under the control of the second controller 310 of the a-chip and the controller 210 of the N-chip. The A-chip memory 390 and the N-chip memory 290 may include volatile and/or non-volatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and store various applications installed in the display device 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an Operating System (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the video processor 260-1 and the audio processor 260-2, the display 280, the communication interface 230, the tuner demodulator 220, the input/output interface, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes at least one of a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: the system comprises a broadcast television signal receiving and demodulating function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction identification function, a communication control function, an optical signal receiving function, an electric power control function, a software control platform supporting various functions, a browser function and other various functions.
The memory 390 includes a memory storing various software modules for driving and controlling the display apparatus 200. Such as: various software modules stored in memory 390, including: at least one of a base module, a detection module, a communication module, a display control module, a browser module, and various service modules, etc. Since the functions of the memory 390 and the memory 290 are similar, reference may be made to the memory 290 for relevant points, and thus, detailed description thereof is omitted here.
The memory 390, for example, includes at least one of an image control module 3904, an audio control module 3906, an external instruction recognition module 3907, a communication control module 3908, a light receiving module 3909, an operating system 3911, other application programs 3912, a browser module, and the like. The second controller 310 performs functions such as: an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal receiving function, a power control function, a software control platform supporting various functions, a browser function, and other functions.
Differently, the external instruction recognition module 2907 of the N-chip and the external instruction recognition module 3907 of the a-chip can recognize different instructions.
Illustratively, since an image receiving device such as a camera is connected to the A-chip, the external instruction recognition module 3907 of the A-chip may include an image recognition module 3907-1. A graphic database is stored in the image recognition module 3907-1; when the camera receives an external graphic instruction, the instruction is matched against the entries in the graphic database so as to perform instruction control on the display device. Since the voice receiving device and the remote controller are connected to the N-chip, the external instruction recognition module 2907 of the N-chip may include a voice recognition module 2907-2. A voice database is stored in the voice recognition module 2907-2; when the voice receiving device or the like receives an external voice instruction, the instruction is matched against the entries in the voice database so as to perform instruction control on the display device. Similarly, a control apparatus 100 such as a remote controller is connected to the N-chip, and a key instruction recognition module performs instruction interaction with the control apparatus 100.
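The database matching step common to both recognition modules can be sketched as a lookup; the gesture and voice entries below are illustrative assumptions, not the actual database contents.

```python
# Illustrative databases; a real device would store many more entries.
GRAPHIC_DB = {"palm_open": "pause", "thumbs_up": "volume_up"}
VOICE_DB = {"turn up the volume": "volume_up", "next channel": "channel_up"}

def recognize(raw_input, database):
    """Match a recognized external input (a gesture label or transcribed
    voice phrase) against an instruction database; None means no match,
    and no instruction control is performed."""
    return database.get(raw_input)

instruction = recognize("palm_open", GRAPHIC_DB)    # "pause"
```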
A block diagram of a configuration of a software system in a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 6 a.
For the N-chip, as shown in fig. 6a, the operating system 2911 includes operating software for handling various basic system services and for performing hardware-related tasks, and serves as an intermediary for data processing between application programs and hardware components.
In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controlling process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application 2912. In some embodiments, it is implemented partly within the operating system 2911 and partly within the application 2912. It is used for listening for various user input events and, in response to the recognition of various types of events or sub-events, executing one or more sets of predefined operations.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-2 is configured to hold event definitions for the various user input interfaces, identify the various events or sub-events, and transmit them to the process that executes their corresponding one or more sets of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (e.g., the control apparatus 100), such as various sub-events of voice input, gesture sub-events of gesture recognition, or remote-control key command input from a control device. Illustratively, the one or more sub-events from the remote control may take a variety of forms, including but not limited to one or a combination of up/down/left/right key presses, the OK key, and held key presses, as well as non-physical key operations such as move, hold, and release.
The interface layout management module 2913 directly or indirectly receives the input events or sub-events monitored by the event transmission system 2914 and updates the layout of the user interface accordingly, including but not limited to the position of each control or sub-control in the interface and the size, position, and level of the containers that make up the interface layout.
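The listen/identify/dispatch flow described above can be sketched minimally as follows. This is an illustrative model only, not the actual implementation; all class and method names here are hypothetical.

```python
class EventTransmissionSystem:
    """Minimal sketch: handlers are registered per event type (monitoring),
    an incoming event is matched to its type (identification), and the
    matched set of handlers is executed (dispatch)."""

    def __init__(self):
        self.handlers = {}  # event type -> list of handler callables

    def register(self, event_type, handler):
        # Register one handler in the set of predefined operations
        # to run when this event or sub-event is recognized.
        self.handlers.setdefault(event_type, []).append(handler)

    def dispatch(self, event_type, payload=None):
        # Identify the event type and run each corresponding handler,
        # collecting their results.
        return [h(payload) for h in self.handlers.get(event_type, [])]
```

A handler for a remote-control "OK key" sub-event would be registered once and then invoked each time that sub-event is dispatched; unknown event types simply match no handlers.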
Since the functions of the operating system 3911 of the a chip are similar to those of the operating system 2911 of the N chip, reference may be made to the operating system 2911 for relevant points, and details are not repeated here.
As shown by the application icons in fig. 6b, the application layer of the display device contains various applications that can be executed on the display device 200.
The N-chip application layer 2912 may include, but is not limited to, one or more applications such as a video-on-demand application, an application center, a game application, and the like. The application layer 3912 of the a-chip may include, but is not limited to, one or more applications such as live television applications, media center applications, and the like. It should be noted that which applications are contained on the a-chip and which on the N-chip is determined by the operating system and other design choices; some embodiments of the present application do not need to specifically define and divide the applications contained on the two chips.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from the server side of a cloud storage, or from a local hard disk storage containing stored video programs.
The media center application can provide various applications for playing multimedia content. For example, a media center may provide services other than live television or video on demand, allowing a user to access various image or audio content through a media center application.
The application center can provide and store various applications. An application may be a game, a utility, or some other application associated with a computer system or another device that can run on a display device. The application center may obtain these applications from different sources and store them in local storage, after which they can be run on the display device 200.
A schematic diagram of a user interface in a display device 200 according to an exemplary embodiment is illustrated in fig. 7. As shown in fig. 7, the user interface includes a plurality of view display areas, illustratively a first view display area 201 and a play screen 202, wherein the play screen includes a layout of one or more different items. The user interface also includes a selector indicating that an item is selected; the position of the selector can be moved by user input to change the selection of different items.
It should be noted that the multiple view display areas may present display screens of different hierarchies. For example, a first view display area may present video chat project content and a second view display area may present application layer project content (e.g., web page video, VOD presentations, application screens, etc.).
Optionally, the different view display areas are presented with different priorities, and view display areas with different priorities differ in display priority. For example, if the priority of the system layer is higher than that of the application layer, then when the user operates the selector and switches pictures in the application layer, the picture display of the system layer's view display area is not blocked; and when the size and position of the application layer's view display area change according to the user's selection, the size and position of the system layer's view display area are not affected.
Display frames of the same hierarchy can also be presented. In this case, the selector can switch between the first view display area and the second view display area, and when the size and position of the first view display area change, the size and position of the second view display area can change along with them.
Since the a-chip and the N-chip may each have an independent operating system installed, there are two independent but interrelated subsystems in the display device 200. For example, Android and various APPs can be independently installed on the a-chip and the N-chip, so that each chip can realize certain functions on its own, and the a-chip and the N-chip can cooperatively realize a function together.
Some embodiments of the present application provide a display device 200 that is primarily intended for televisions, particularly social televisions. Based on the display device 200 provided in some embodiments of the present application, and in order to improve the social convenience of the television, some embodiments of the present application provide an interaction method that realizes social interaction by combining the display device with a mobile terminal.
In some embodiments, the method provided by the present application is not only applicable to the display device provided by the above embodiments, but also applicable to other non-dual-chip display devices. The device provided by the application is not limited to being in a dual-chip architecture.
According to the interaction method provided by some embodiments of the application, interaction between users of display devices is achieved through the display device and the mobile terminal. The application scene of the interaction method includes a plurality of display devices, a plurality of mobile terminals, and a server. According to their specific functions, the display devices are divided into a first display device and a plurality of second display devices. The mobile terminal may be a mobile phone, a tablet computer, or the like; in some embodiments of the application, the mobile terminal is a mobile phone. In some embodiments of the present application, the first display device and the second display device are relative concepts: the first display device refers to the display device that creates the auditorium and chat room, and the display devices within the auditorium other than the first display device are collectively referred to as second display devices.
FIG. 8 illustrates a user interface diagram of a first display device in some embodiments of the present application. When the user operates the remote control device to select the "new auditorium" control in fig. 8 and the first display device receives the instruction to create a new auditorium, the first display device, provided it meets the condition for creating an auditorium, enters the auditorium creation interface. Fig. 9 illustrates the auditorium creation interface of the first display device in some embodiments of the present application. As shown in fig. 9, the auditorium type is selected, such as "public auditorium". Fig. 10 illustrates the creation of an auditorium with the first display device in some embodiments of the present application. After the auditorium type is selected and the interface of fig. 10 is entered, the user can set or modify the auditorium name, add a video to play, invite friends, and perform other operations through the interface illustrated in fig. 10.
Fig. 11 is a flow chart illustrating the first display device creating an auditorium and interacting within it according to some embodiments of the present application. As shown in fig. 11, the first display device creates the auditorium through the server, which includes a room service, a message push module, a video play service, and an IM system. The room service is responsible for controlling service limits (including the maximum number of rooms created by one user and the maximum number of online users in one room) and provides room creation, joining, dismissal, message interaction, and the like; the message push module pushes invitation messages in the friend-invitation scene and the like; and the IM system is used for interactive message forwarding. The auditorium enables synchronous viewing on the first display device and the second display devices, with interaction through the mobile terminal.
Optionally, the first display device sends a create-auditorium request to the server to create the auditorium. The create-auditorium request sent to the server by the first display device includes a user ID and auditorium information (such as the name of the auditorium, the house owner's customerId, and a list of media IDs, including movie IDs and TV drama IDs).
The server receives the create-auditorium request sent by the first display device and verifies whether the first display device meets the condition for creating an auditorium. If the first display device meets the condition, the server generates an auditorium ID for the request, associates the auditorium ID with the create-auditorium request, and returns it to the first display device. If the first display device does not meet the condition, the server returns a service error code to the first display device to remind the user that creation failed, or gives a prompt and guidance on the first display device.
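The server-side handling of a create-auditorium request can be sketched as follows. The per-user room limit, error code string, and ID scheme are assumptions for illustration; the text only says the room service enforces a maximum number of rooms per user.

```python
import itertools

MAX_ROOMS_PER_USER = 1  # hypothetical limit enforced by the room service

_room_id_counter = itertools.count(1000)
_rooms_by_owner = {}  # owner user ID -> list of auditorium IDs

def create_auditorium(user_id, room_info):
    """Verify the creation condition, then either generate and return a
    new auditorium ID associated with the request, or return a service
    error code."""
    owned = _rooms_by_owner.setdefault(user_id, [])
    if len(owned) >= MAX_ROOMS_PER_USER:
        # Condition not met: return a service error code to the device.
        return {"error": "ROOM_LIMIT_EXCEEDED"}
    room_id = next(_room_id_counter)
    owned.append(room_id)
    return {"auditorium_id": room_id, "info": room_info}
```

A second creation attempt by the same user beyond the quota would then come back with the error code rather than a new ID.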
In some embodiments of the present application, when a first display device user creates a movie theater through a first display device, the first display device user operates a remote control device to add a movie to be played and invite a friend.
When the first display device receives an operation instruction of the user adding a film through the remote control device, the first display device sends a media asset request to the server. The server returns a media asset list according to the request, and the first display device displays the media assets according to the received list. When the user selects a certain media asset through the remote control device, the first display device receives the corresponding operation instruction and sends a request to play that media asset to the server. The server receives the play request and issues the video stream of the media asset to the first display device accordingly.
When the first display equipment receives an operation instruction of a user for inviting friends through the remote control device, the first display equipment sends a friend inviting request to the server, the server generates an invitation message according to the received friend inviting request, and sends the invitation message to second display equipment corresponding to the inviting friends, wherein the invitation message comprises the auditorium ID. In some embodiments of the present application, the request for inviting friends may include one friend, or may include a plurality of friends; the first display device may send a request for inviting friends once or multiple times, which is not specifically limited herein. For convenience of description, the display devices corresponding to the invited buddies are all referred to as second display devices.
In some embodiments of the present application, a create-auditorium request is sent to the server, an auditorium service is created, and a chat room service is then created based on the identification of the created auditorium service, the auditorium service and the chat room service corresponding one-to-one. Optionally, the chat room is created during the creation of the auditorium. After the creation of the auditorium service is completed, an encoded graphic is displayed on the video playing interface of the first display device; the encoded graphic enables the mobile terminal to access the chat room service by scanning the code. Optionally, the encoded graphic may be a two-dimensional code, but is not limited to a two-dimensional code.
In some embodiments of the present application, while video is playing in the auditorium service, an instruction of the user to start the chat room service is received; the chat room service information is obtained according to the unique identifier of the auditorium service; and an encoded graphic representing the chat room service address is generated and displayed according to the chat room service information. In some embodiments of the present application, after the first display device successfully establishes the chat room, the encoded graphic is generated according to the chat room information. The chat room information includes a URL address, device information of the first display device, and a chat room ID; the device information includes the display device ID of the first display device, which is used to bind the applet account with the display device.
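The payload carried by the encoded graphic can be sketched as a small structure holding the three pieces of chat room information named above. The JSON encoding, field names, and the example URL are assumptions; the text does not specify the payload format.

```python
import json

def build_chatroom_code_payload(url, device_id, chat_room_id):
    """Assemble the information encoded into the graphic: the mobile
    terminal scans the graphic, parses this payload, loads the
    interactive message editing interface from the URL, and uses the
    device ID to bind the applet account with the display device."""
    return json.dumps({
        "url": url,                  # interactive-message editing page
        "deviceId": device_id,       # display device ID of the first device
        "chatRoomId": chat_room_id,  # identifies the chat room service
    })

def parse_chatroom_code_payload(payload):
    """What the mobile terminal does after scanning: recover the fields."""
    return json.loads(payload)
```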
When the second display device receives the invitation information and the invitation is accepted, the second display device sends a request to join the auditorium to the server; the request includes the auditorium ID. When the server receives the join request, it determines whether the second display device meets the condition for joining the auditorium. In some embodiments of the present application, a user may join only a limited number of auditoriums at a time, such as one, and the number of users allowed online at the same time in an auditorium is limited, e.g., to 200. The conditions for joining thus include the number of auditoriums currently joined and whether the auditorium to be joined is already full. If the number of auditoriums currently joined at the second display device is within the allowed range and the auditorium to be joined is not full, the second display device meets the condition for joining. When the second display device meets the condition, the server allows it to join the auditorium; otherwise, the server directly returns a service error code to the second display device and gives the second display device user a corresponding prompt and guidance.
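The join-condition check can be expressed compactly. The limits below use the example values from the text (one auditorium at a time, 200 online users); the dismissal check anticipates the verification described later for dismissed auditoriums.

```python
MAX_JOINED_ROOMS = 1    # a user may join a limited number of auditoriums
MAX_ONLINE_USERS = 200  # users allowed online at once in an auditorium

def may_join_auditorium(joined_count, online_count, dismissed=False):
    """Return True only if the auditorium still exists, the device has
    not exceeded its joined-room quota, and the room is not full."""
    if dismissed:
        return False
    if joined_count >= MAX_JOINED_ROOMS:
        return False
    return online_count < MAX_ONLINE_USERS
```

When this returns False, the server would return the service error code instead of admitting the device.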
When the second display device accepts the invitation of the first display device and successfully joins the auditorium, the server sends the video stream to the second display device according to the video playing progress of the media asset on the first display device. The second display device receives the video stream and plays the video synchronously with the first display device, so that the users at both ends watch the video synchronously. Optionally, the first display device sends its video playing progress to the server. For example, the first display device sends the playing progress to the server periodically, and the server corrects the video playing progress of all the second display devices in the auditorium, so that all display devices in the auditorium play the video synchronously and the users in the auditorium watch it synchronously.
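The server's progress correction can be sketched as follows: given the first display device's reported position, the server decides which second display devices have drifted far enough to need a seek. The tolerance value is an assumption; the text only says the server "corrects" the progress of the second display devices.

```python
def correct_progress(host_position, member_positions, tolerance=2.0):
    """Compare each second display device's playback position (seconds)
    against the first display device's reported position and return the
    seek commands needed to bring drifted devices back in sync."""
    commands = {}
    for device_id, pos in member_positions.items():
        if abs(pos - host_position) > tolerance:
            commands[device_id] = host_position  # seek to host position
    return commands
```

Devices already within the tolerance are left alone, which avoids constant micro-seeks on every periodic progress report.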
In some embodiments of the present application, the first display device sends a heartbeat request to the server to tell the server that it is currently online, and the server receives the heartbeat request. Optionally, the first display device sends heartbeat requests to the server periodically; when the server does not receive a heartbeat request from the first display device within a predetermined time, the first display device is considered offline. Optionally, after receiving a heartbeat request from the first display device, the server returns the next heartbeat sending time to the first display device, requiring it to send its next heartbeat request at that time, and the first display device sends the next heartbeat request accordingly. If the server does not receive a heartbeat request from the first display device at the next heartbeat sending time, the first display device is considered offline. In this way, the first display device and the server monitor the online state of the first display device through the interaction of heartbeat requests.
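The server side of this heartbeat exchange can be sketched with a small monitor: each heartbeat yields the next due time, and the device counts as online only until that deadline passes. The 30-second interval is an assumption; the text does not state one.

```python
class HeartbeatMonitor:
    """Server-side view of one device's online state. The server tells
    the device when the next heartbeat is due; if none arrives by then,
    the device is considered offline."""

    def __init__(self, interval=30.0):
        self.interval = interval  # seconds between heartbeats (assumed)
        self.next_due = None      # deadline for the next heartbeat

    def on_heartbeat(self, now):
        # Record the heartbeat and return the next sending time,
        # which the server sends back to the display device.
        self.next_due = now + self.interval
        return self.next_due

    def is_online(self, now):
        # Online only if a heartbeat was received and the deadline
        # for the next one has not yet passed.
        return self.next_due is not None and now <= self.next_due
```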
Optionally, after the server determines that the first display device is offline, the server by default treats the auditorium as passively dismissed: it sends an auditorium dismissal notification to all the second display devices in the auditorium and stops the interaction between the display devices in the auditorium, that is, the server no longer forwards the interactive content of the display devices. In addition, the video being played in the original auditorium continues to play to its end; the playback is not affected by the dismissal of the auditorium.
Additionally, in some embodiments of the present application, the first display device may actively send a auditorium dismissal request to the server, and when the server receives the auditorium dismissal request sent by the first display device, the server dismisses the auditorium and sends an auditorium dismissal notification to all second display devices in the auditorium. The server stops the interaction at the display device side in the auditorium, i.e. the server does not forward the interactive content of the display device any more.
Therefore, when a request to join the auditorium is received from the second display device corresponding to an invited friend, it is also necessary to verify whether the current auditorium has been dismissed; only when it is determined that the auditorium has not been dismissed and the second display device meets the condition for joining does the server allow the second display device to join the auditorium.
While the first display device and the second display device are playing video in the auditorium service, a user's selection of a preset message is received; a first interactive message is generated according to the preset message selected by the user and a preset message identification character; and the first interactive message is sent to the server so that the server forwards it to all devices corresponding to the chat room service. When the first display device successfully creates the auditorium service, it receives the preset message set sent by the server; when the second display device joins the auditorium service, it likewise receives the preset message set sent by the server. Optionally, the preset message set includes preset emoticon messages and preset text messages, both of which carry the preset message identification character.
In some embodiments of the present application, the video playing interfaces of the first display device and the second display device include preset message controls. The first display device user and the second display device user can select a preset message control through the control device; the display device generates an interactive message according to the selected preset message, sends it to the server, and the server then forwards it to the other display devices. For example, an interactive message sent by the first display device is issued by the server to the second display devices, which display it; an interactive message sent by a second display device is issued by the server to the first display device and the other second display devices, which display it.
In some embodiments of the application, while video is playing in the auditorium service, the first display device and the second display device receive a user operation and display a bullet screen control on the video playing interface; the bullet screen control receives the user's operation on it to turn the display of interactive messages on the video playing interface of the display device on or off.
In some embodiments of the present application, after the mobile terminal is bound to the display device through the applet account, other operations such as remote control and image transfer may be performed on the display device. The first display device user uses the mobile terminal to scan the encoded graphic, parses the information obtained from it, and loads the interactive message editing interface according to the URL address in the two-dimensional code information. In addition, after the second display device accepts the invitation of the first display device and successfully joins the auditorium, the encoded graphic is also displayed on the video playing interface of the second display device. The second display device user can then use the mobile terminal to scan the encoded graphic, parse the information obtained from it, and load the interactive message editing interface according to the URL address in the two-dimensional code information.
The interactive message editing interface is used by the mobile terminal to send and receive interactive messages. To enable the mobile terminal to send interactive messages, the interactive message editing interface optionally includes a preset message control and a character editing control. By operating the mobile terminal, the user can select a preset message or input edited characters to send an interactive message. Optionally, in the interactive message editing interface, a first interactive message generated by the user selecting a preset message and a second interactive message generated by the user inputting edited characters are received and then sent to the server, which forwards them to the display devices and the other mobile terminals. The mobile terminal in turn receives, through the server, the first and second interactive messages sent by the display devices and the other mobile terminals.
In addition, the interactive message editing interface also includes the name of the auditorium, an introduction to the media asset being played in the auditorium, and the like. Optionally, the upper part of the interactive message editing interface displays the auditorium name and the media introduction, and the lower part displays the interactive content together with the preset message control and the character editing control, such as an interactive content editing and selection window, for viewing and sending interactive messages.
In some embodiments of the present application, the display device displays, on its video playing interface, the interactive message it receives from the server's push; and the mobile terminal receives the interactive message pushed by the server and displays it on the mobile terminal's display interface. Optionally, the display device receives the interactive message generated by the mobile terminal and pushed by the server; if the interactive message was generated by the mobile terminal from a preset message, it is displayed on the video playing interface of the display device; if not, it is not displayed there. That is, a first interactive message generated by the mobile terminal is displayed on the video playing interface of the display device, while a second interactive message generated by the mobile terminal is not.
Specifically, when a user wants to send an interactive message through the display device, the user selects a preset message through the remote control device to generate the interactive message and gives the display device an interactive message sending instruction. When the display device receives the instruction, it sends the interactive message to the server. The server receives the interactive message sent by the display device and broadcasts it, pushing it to the other display devices and the mobile terminals in the auditorium.
When a user wants to send an interactive message through the mobile terminal, the user operates the mobile terminal to select a preset message or input edited characters to generate the interactive message, and gives the mobile terminal an interactive message sending instruction. When the mobile terminal receives the instruction, it sends the corresponding message to the server. The server receives the interactive message sent by the mobile terminal and broadcasts it, for example pushing it to the display devices and the other mobile terminals in the auditorium.
When the user operates the mobile terminal to generate an interactive message by editing character input, a character input keyboard is loaded in response to the user's selection of the information input box; the character string input by the user is displayed in the information input box according to the user's selection of the virtual keys on the character input keyboard; and an interactive message is generated from the character string according to the user's selection of the send control, where the character in the identification field of this interactive message is not the preset message identification character.
Optionally, in some embodiments of the present application, the interactive message editing interface includes a first control, and the first control has a first state and a second state. When the first control is in the first state, the character input keyboard is loaded, and the character string input by the user is presented in the information input box according to the user's selection of the virtual keys on the keyboard; an interactive message is generated from the character string according to the user's selection of the send control, and the character in the identification field of this interactive message is not the preset message identification character. When the first control is in the second state, the preset messages are loaded and displayed; the preset message selected by the user is displayed in the information input box; and an interactive message is generated from the preset message according to the user's selection of the send control, with the character in the identification field of this interactive message being the preset message identification character.
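The two generation paths above differ only in what ends up in the identification field. A minimal sketch, in which the flag characters and the dictionary layout are hypothetical (the patent does not disclose the actual identification character or wire format):

```python
PRESET_FLAG = "P"     # hypothetical preset message identification character
USER_EDIT_FLAG = "U"  # hypothetical marker for self-edited messages

def make_interactive_message(body, from_preset):
    """Build an interactive message whose identification field records
    whether it came from a preset message (first control in its second
    state) or from free character editing (first state)."""
    flag = PRESET_FLAG if from_preset else USER_EDIT_FLAG
    return {"flag": flag, "body": body}
```

Everything downstream (server auditing, the display device's show/drop decision) can then branch on this single field.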
The preset messages include preset emoticon messages and preset text messages. An interactive message generated by editing character input refers to a message produced from the characters the user types. In some embodiments of the application, the server receives the interactive message sent by the mobile terminal and can audit it; in particular, the server audits messages generated by character editing according to user input, so as to avoid illicit or unsafe information such as sensitive words appearing in the messages.
Optionally, when the display device receives an interactive message sent by the server, it determines whether the received interactive message is a preset message or a message generated by character editing according to user input. If the received interactive message is a preset message, it is displayed on the video playing interface of the display device, for example in bullet screen form. If the received interactive message is a message generated by character editing according to user input, the display device receives it but does not display it on its screen; optionally, the display device deletes such messages. When the mobile terminal receives an interactive message pushed by the server, the received message is displayed on the mobile terminal's screen regardless of whether it is a preset message or a message generated by character editing. A message generated by character editing according to user input mainly refers to text and similar content the user enters through the mobile terminal. Dividing interactive messages into preset messages and character-edited messages both makes sending interactive messages convenient and keeps the display of interactive messages on the device side controllable.
In some embodiments of the present application, a message generated by character editing according to user input refers to a self-edited message entered by the user through human-machine interaction with the mobile terminal. When the mobile terminal receives an instruction to send such a message, it sends the corresponding self-edited message to the server. To maintain a healthy and safe network environment, when the server receives a self-edited message from the mobile terminal, it audits the message. If the message contains non-compliant or unsafe content such as sensitive words, the server does not push it, and the message is displayed only on the mobile terminal that sent it. Further, when the server finds that an audited self-edited message contains such content, it may send a prompt or guidance to the mobile terminal.
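The audit step can be sketched as a simple lexicon check. This is an assumption-laden sketch: a production service would use a maintained sensitive-word lexicon or a moderation service, not a hard-coded set.

```python
# Placeholder lexicon; a real deployment would load a maintained list.
SENSITIVE_WORDS = {"sensitiveword"}

def audit_self_edited(text: str) -> bool:
    """Return True if the self-edited message passes the audit
    and may be pushed to the other terminals in the chat room."""
    return not any(word in text for word in SENSITIVE_WORDS)
```

When the audit fails, the server would skip the push and return a prompt to the sending mobile terminal instead.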
To enable the display device to quickly identify whether a received interactive message is a preset message, in some embodiments of the present application a preset message identification field is optionally set in all preset messages, while messages generated by character editing carry no such field. When the display device receives an interactive message pushed by the server, it obtains the identification field of the message and judges whether it is the preset message identification field. If it is, the message is regarded as a preset message and the display device displays it on its screen.
In some embodiments, a message generated by character editing has a self-editing identifier set in its identification field, while the identification field of a preset message may be empty or set to a preset message identifier. In other embodiments, the preset message carries a preset message identifier, the self-edited message does not, and the identification field of the self-edited message may be empty or carry an identifier characterizing self-editing. In some embodiments, the server may filter messages by the identification field and audit the filtered self-edited messages, while the display device may filter by the identification field so as to display only preset messages.
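The identification-field filtering on both ends can be sketched as below, assuming the variant in which preset messages carry a marker and self-edited messages do not. The field name `id_field` and marker value `"PRESET"` are illustrative assumptions.

```python
PRESET_MARKER = "PRESET"  # assumed preset message identification value

def needs_audit(message: dict) -> bool:
    """Server side: only self-edited messages (no preset marker) are audited."""
    return message.get("id_field") != PRESET_MARKER

def displayable_on_tv(message: dict) -> bool:
    """Display-device side: only messages carrying the marker are displayed."""
    return message.get("id_field") == PRESET_MARKER
```

Both ends thus make their decision from the same field, so no message body inspection is needed on the display device.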
Accordingly, in some embodiments of the present application, the user may send only preset messages via the display device. Optionally, when the display device receives an interactive message sending instruction, it obtains the message corresponding to the instruction, adds the preset message identification field, and sends the tagged message to the server. Optionally, the preset messages in effect on the display device already carry the preset message identification field.
In some embodiments of the present application, the server is configured to maintain the preset messages; that is, after the auditorium is successfully created, the server is responsible for managing the preset messages, and after the mobile terminal joins the chat room, the server returns the preset message set to the mobile terminal. After the mobile terminal receives the preset message set sent by the server, the user can interact by selecting a preset message from the set.
Based on the display device provided by some embodiments of the present application, some embodiments of the present application further provide an interaction method, which is used for the display device.
Some embodiments provide an interaction method, comprising:
when video playing is carried out in an auditorium service, receiving an interactive message pushed by a server, wherein the interactive message is sent to the server by a mobile terminal;
if the interactive message is generated by the mobile terminal according to a preset message, displaying the interactive message on the video playing interface;
and if the interactive message is not generated by the mobile terminal according to a preset message, not displaying the interactive message on the video playing interface.
Some embodiments provide an interaction method, wherein before receiving an interactive message pushed by a server, the method further comprises:
sending a request for creating an auditorium to a server and creating an auditorium service, wherein the auditorium service is used for simultaneously sending video data of the same video to different display devices so that the different display devices play the video synchronously;
and creating a chat room service based on the identifier of the created auditorium service, wherein the auditorium service and the chat room service are in one-to-one correspondence. The auditorium service and the chat room service may be located on different servers; since the chat room service is created at the same time as the auditorium service, the user does not perceive the creation of services on two servers and does not need to send separate creation commands, which helps improve the user experience.
Some embodiments provide an interaction method, wherein creating a chat room service based on the identifier of the created auditorium service comprises:
receiving, during video playing in the auditorium service, a user instruction to start the chat room service;
obtaining the information of the chat room service according to the unique identifier of the auditorium service;
and generating and displaying a coded graph for representing the chat room service address according to the chat room service information, wherein the coded graph is used for enabling the mobile terminal to access the chat room service in a code scanning mode.
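The chat-room service address that the coded graph encodes can be sketched as a URL assembled from the identifiers above. The base URL and parameter names here are assumptions; an actual implementation would render this string as a QR code image for the mobile terminal to scan.

```python
from urllib.parse import urlencode

def chat_room_url(base: str, auditorium_id: str,
                  chat_room_id: str, device_id: str) -> str:
    """Build the address a QR code would encode for the chat room service."""
    query = urlencode({"auditoriumId": auditorium_id,
                       "chatRoomId": chat_room_id,
                       "deviceId": device_id})
    return f"{base}?{query}"
```

Scanning the rendered code then gives the mobile terminal everything it needs to request to join the chat room.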
In some embodiments, the user may turn the bullet screen on or off through a control on the display (a bullet screen control). After the bullet screen is turned on, the television displays bullet-screen messages.
In some embodiments, if an instruction from the user to display the chat room's two-dimensional code is received, the two-dimensional code can be generated according to the address of the chat room service so that the user can access the chat room service through a mobile phone; at this time the television end may have the bullet screen either open or closed.
In the interaction method provided in some embodiments, an interactive message generated by the mobile terminal according to a preset message is one generated by the user selecting a preset message on the mobile terminal, and an interactive message not generated according to a preset message is one generated by the user editing characters on the mobile terminal.
In some embodiments, the user may select a preset text, picture, or other preset information, where the information itself does not include a character identifying whether it is preset; if the user opens a text editing box to edit, an interactive message that the television can conveniently identify needs to be generated according to the received edited text and a preset character.
Some embodiments provide an interaction method, where the interaction message is provided with an identification field, and after receiving the interaction message pushed by the server, the method further includes:
and determining whether the interactive message is generated by the mobile terminal according to the preset message according to the characters of the identification field and the preset message identification characters.
Some embodiments provide an interaction method, further comprising: receiving a user's selection of a preset message during video playing in the auditorium service;
generating a first interactive message according to a preset message selected by a user and a preset message identification character;
and sending the first interactive message to the server so that the server forwards the first interactive message to all the devices corresponding to the chat room service.
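The forwarding step above, in which the server pushes the first interactive message to all devices corresponding to the chat room service, can be sketched with an in-memory registry. The registry structure (room id mapping to per-device mailboxes) is an assumption for illustration.

```python
def broadcast(registry: dict, room_id: str, message: str) -> int:
    """Push the message to every device mailbox registered in the room.

    Returns the number of devices the message was pushed to.
    """
    count = 0
    for mailbox in registry.get(room_id, {}).values():
        mailbox.append(message)
        count += 1
    return count
```

A real server would push over persistent connections (for example a long-lived socket per device) rather than append to lists, but the fan-out logic is the same.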
Some embodiments provide an interaction method, further comprising: receiving a user operation and displaying a bullet screen control on the video playing interface, wherein the bullet screen control is used for receiving the user's operation on it so as to turn the display of interactive messages on the display on or off.
Based on the display device provided by some embodiments of the present application, some embodiments of the present application further provide an interaction method, and the interaction method is used for a mobile terminal.
Some embodiments provide an interaction method comprising: scanning a coded graph on a display device, wherein the coded graph is generated according to an identifier of the auditorium service when the display device plays a video in the auditorium service;
loading an interactive message editing interface according to the URL address obtained by analyzing the coded graph;
receiving a user's selection of a preset message on the interactive message editing interface to generate a first interactive message, wherein the first interactive message is displayed on the mobile terminals and display devices corresponding to the auditorium service;
and receiving user input of edited characters on the interactive message editing interface to generate a second interactive message, wherein the second interactive message is displayed only on the mobile terminals corresponding to the auditorium service and not on the display devices corresponding to the auditorium service.
In some embodiments, in the interaction method provided in the embodiments, the identification field of the first interaction message is a preset message identification character, and the identification field of the second interaction message is not a preset message identification character; the identification field is used for enabling the display device to identify whether the received interactive message is generated by selecting a preset message on the mobile terminal by a user.
Some embodiments provide an interaction method, further comprising: sending a request to the server for the auditorium message according to the information obtained by parsing the coded graph;
and receiving the auditorium message returned by the server and displaying the auditorium message on a display screen of the mobile terminal.
Some embodiments provide an interaction method, further comprising: receiving a preset message set sent by the server, wherein the preset message set comprises preset emoticon messages and preset text messages, both of which carry the preset message identification characters.
Some embodiments provide an interaction method, wherein receiving an interaction message generated by a user operating on the mobile terminal comprises:
responding to the selection of the user on the information input box, and loading a character input keyboard;
displaying a character string input by a user in an information input box according to the selection of the virtual key on the character input keyboard by the user;
and generating an interactive message according to the character string according to the selection of the sending control by the user, wherein the character bit of the identification field in the interactive message is not a preset message identification character.
Some embodiments provide an interaction method, further comprising: responding to the selection of a user on the first control, switching the first control from the second state to the first state, and loading a character input keyboard;
displaying a character string input by a user in an information input box according to the selection of the virtual key on the character input keyboard by the user;
and generating an interactive message according to the character string according to the selection of the sending control by the user, wherein the character bit of the identification field in the interactive message is not a preset message identification character.
Some embodiments provide an interaction method, further comprising: responding to the selection of a user on the first control, switching the first control from the first state to the second state, and loading and displaying a preset message, wherein the preset message comprises at least one of a preset picture and a preset character string;
displaying the preset message in the information input box according to the selection of the user on the preset message;
and generating an interactive message according to the preset message according to the selection of the user on the sending control, wherein the character bit of the identification field in the interactive message is the identification character of the preset message.
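The two mobile-terminal send paths above can be sketched in one constructor: a selected preset message carries the preset message identification character in its identification field, while a self-edited string does not. The marker value `"#P"` and the field names are assumptions for illustration.

```python
PRESET_ID_CHAR = "#P"  # assumed preset message identification character

def build_message(content: str, from_preset: bool) -> dict:
    """Build an outgoing interactive message for the mobile terminal.

    The identification field is the preset identification character only
    when the content was chosen from the preset message set.
    """
    id_field = PRESET_ID_CHAR if from_preset else ""
    return {"id_field": id_field, "content": content}
```

The display device can then decide whether to show the message by comparing the identification field with the preset identification character, without parsing the content.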
Based on the interaction method provided by the above embodiments, some embodiments of the present application further provide a mobile terminal. The mobile terminal provided by these embodiments comprises:
a display configured to display a user interface and a play screen;
a controller for communicative connection with the display, the controller configured to:
scanning a coded graph on a display device, wherein the coded graph is generated according to an identifier of the auditorium service when the display device plays a video in the auditorium service;
loading an interactive message editing interface according to the URL address obtained by analyzing the coded graph;
receiving a selection of a user on a preset message to generate a first interactive message on the interactive message editing interface, wherein the first interactive message is used for displaying on a mobile terminal and a display device corresponding to the auditorium service;
and receiving user input of edited characters on the interactive message editing interface to generate a second interactive message, wherein the second interactive message is displayed only on the mobile terminals corresponding to the auditorium service and not on the display devices corresponding to the auditorium service.
To illustrate the interaction method between the display device and the mobile terminal provided in some embodiments of the present application, the following describes a specific usage scenario.
For example, the auditorium established by the first display device selects the drama "Chu Qiao Zhuan" for playback. The user clicks "Interact", and the display interface pops up several preset messages for the user to select as the interactive message to send. Referring to fig. 12, which mainly shows preset emoticon messages, the preset messages may further include preset text, such as cheering phrases for the characters or actors. The video playing interface of a second display device that has joined the auditorium is the same as that of the first display device, and its video playback is synchronized with the first display device. The user of the second display device clicks "Interact" through the remote control device, and several preset messages pop up on the display interface for selection. When the users of the first and second display devices click "Bullet screen" and turn it on, the interactive messages are displayed on the screen, as shown in fig. 13.
By scanning the coded graph (two-dimensional code) shown in fig. 12 with WeChat, the mobile terminal invokes a chat-room WeChat applet and enters the interactive message editing interface of the chat room service corresponding to the auditorium service; this interface is bound to the large screen of the display device. As shown in fig. 14, the upper part of the mobile terminal's screen displays the auditorium name and an introduction to the video being played, and the lower part displays the interactive content together with an editing and selection window for viewing and sending interactive content.
A mobile terminal user can tap a preset message control on the mobile terminal's screen, such as the "smiling face" button in the left diagram of fig. 14, to show preset emoticon messages, and can choose to send one to interact. As shown in the right diagram of fig. 14, the user can also tap a character editing control to edit a message through the interactive content editing window. When the mobile terminal interacts with a preset message and the display device's "bullet screen" is on, the display device shows the message on its screen upon receipt. When the interaction uses a message generated by the user's character editing, the self-edited message is not displayed on the display device but only on the local and peer mobile terminals; that is, the display device screens for preset messages. The server also audits self-edited messages sent by the mobile terminal: when such a message contains non-compliant or unsafe content such as sensitive words, it is displayed only on the local terminal and is not pushed to other mobile terminals, and a prompt or guidance is given on the local mobile terminal, as shown in the left diagram of fig. 14.
Thus, in the interaction method, display device, and mobile terminal provided in some embodiments of the present application, when the display device plays video in an auditorium service, a coded graph is generated according to the identifier of the auditorium service; the mobile terminal scans the coded graph on the display device, parses it to obtain a URL address, and loads an interactive message editing interface according to that address. On this interface the user can select a preset message to send a first interactive message, or input edited characters to send a second interactive message. The server pushes both kinds of interactive message, so that the first interactive message is displayed on the mobile terminals and display devices corresponding to the auditorium service, while the second is displayed only on the corresponding mobile terminals, allowing the user to socialize conveniently through the combination of display device and mobile terminal. By combining the display device with the mobile terminal and exploiting the mobile terminal's strengths, such as touch-screen interaction, the method improves the convenience of entering social information through the display device. Compared with television social interaction through a remote controller alone, the social interaction achieved by this method is more convenient.
In some embodiments, fig. 15 is a timing diagram of an interaction method provided in some embodiments of the present application. As shown in fig. 15, according to this interaction method, interaction between users of display devices is achieved through the display devices and the mobile terminals. The application scenario includes several display devices, several mobile terminals, and a server. The display devices are divided, according to their functions, into a first display device and several second display devices; the mobile terminal may be a mobile phone, tablet computer, or the like, and in some embodiments of the application is a mobile phone. The first and second display devices are relative concepts: the first display device is the one that creates the auditorium and the chat room, and all other display devices in the auditorium are collectively referred to as second display devices.
As shown in fig. 15, an interaction method provided by some embodiments of the present application includes:
and the display equipment terminal generates the chat room two-dimensional code according to the chat room information. The chat room information comprises a URL address, equipment information of a display equipment end and a chat room ID, the equipment information comprises the display equipment ID of the display equipment end, and the display equipment ID is used for binding the applet account with the display equipment. Optionally, after the mobile terminal is bound to the display device through the applet account, the mobile terminal may perform other operations such as remote control and image transmission on the display device.
The user scans the chat room two-dimensional code with the mobile terminal to obtain its information, parses that information, and sends a request to join the chat room to the server according to the URL address contained in it.
The display device receives the interactive message pushed by the server and displays it on its screen; the mobile terminal likewise receives the interactive message pushed by the server and displays it on its own screen.
When the user wants to send an interactive message through the display device, an interactive message sending instruction is given to the display device. Upon receiving the instruction, the display device sends the corresponding message to the server. The server receives the interactive message sent by the display device and broadcasts it, pushing it to the other display devices and the mobile terminals in the auditorium.
When the user wants to send an interactive message through the mobile terminal, an interactive message sending instruction is given to the mobile terminal. Upon receiving the instruction, the mobile terminal sends the corresponding message to the server. The server receives the interactive message sent by the mobile terminal and broadcasts it, pushing it to the display devices and the other mobile terminals in the auditorium.
FIG. 16 is a timing diagram of another interaction method according to some embodiments of the present application. As shown in fig. 16, the interaction method further includes: the first display device sends a request for creating an auditorium to the server, creates the auditorium, and creates a chat room based on the created auditorium. The details are as follows.
The first display device sends a request for creating an auditorium to the server, and the auditorium is created. In some embodiments of the present application, the request includes a user ID and auditorium information, such as the auditorium name, the host ID (client code), and a media ID list (including movie IDs and TV series IDs).
The server receives the auditorium creation request sent by the first display device and verifies whether the first display device meets the conditions for creating an auditorium. If it does, an auditorium ID is generated for the request and returned to the first display device in association with the request. If it does not, a service error code is returned to the first display device to remind the user that creation failed, or to give a prompt and guidance.
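The server-side creation flow can be sketched as below. The validation rule (a user ID and a non-empty media ID list), the error code value, and the ID format are all illustrative assumptions.

```python
import itertools

_ids = itertools.count(1)  # illustrative auditorium ID generator

def create_auditorium(request: dict) -> dict:
    """Validate an auditorium creation request, then mint and return an ID.

    On failure, return a service error code instead of an ID.
    """
    if not request.get("user_id") or not request.get("media_ids"):
        return {"error": "SERVICE_ERROR"}  # creation condition not met
    return {"auditorium_id": f"aud-{next(_ids)}", "request": request}
```

The returned ID is what later invitation messages and join requests carry to identify the auditorium.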
The first display device sends a friend invitation request to the server; the server generates an invitation message according to the request and sends it to the second display devices corresponding to the invited friends, the invitation message including the auditorium ID. In some embodiments of the present application, an invitation request may include one friend or several, and the first display device may send invitation requests once or multiple times, which is not specifically limited here. For convenience of description, the display devices corresponding to invited friends are all referred to as second display devices.
When a second display device receives the invitation and accepts it, it sends a request to join the auditorium to the server, the request including the auditorium ID. On receiving the request, the server determines whether the second display device meets the conditions for joining. In some embodiments of the present application, a user may join only a limited number of auditoriums at a time, such as one, and the number of users allowed online simultaneously in an auditorium is limited, for example to 200. The joining conditions therefore cover the number of auditoriums currently joined and whether the target auditorium's online count is full: if the second display device's currently joined auditoriums are within the allowed range and the target auditorium is not full, the conditions are met and the server allows it to join; otherwise the server returns a service error code directly and gives the user of the second display device a corresponding prompt and guidance.
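The join check reduces to two limits, using the example values from the text (one auditorium per user, 200 users online per auditorium); both limits are configurable in practice.

```python
MAX_JOINED = 1    # example: a user may be in one auditorium at a time
MAX_ONLINE = 200  # example: online cap per auditorium

def may_join(currently_joined: int, room_online: int) -> bool:
    """Server-side join condition: under the per-user auditorium limit
    and the target auditorium is not yet full."""
    return currently_joined < MAX_JOINED and room_online < MAX_ONLINE
```

A failed check corresponds to the service error code path described above.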
The server sends the video stream to the second display devices that accepted the invitation according to the video playing progress, and those devices play the video synchronously with the first display device, so that the users of the first and second display devices watch together. Furthermore, the first display device periodically sends its playing progress to the server, and the server corrects the playing progress of all second display devices in the auditorium, so that all display devices in the auditorium play the video synchronously.
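The periodic progress correction can be sketched as follows: the first display device's reported position serves as the reference, and devices that have drifted beyond a tolerance are told to seek. The tolerance value and the return shape are assumptions.

```python
def corrections(reference_pos: float, positions: dict,
                tolerance: float = 2.0) -> dict:
    """Map device id -> position to seek to, for devices whose playback
    position drifts from the reference by more than the tolerance (seconds)."""
    return {dev: reference_pos
            for dev, pos in positions.items()
            if abs(pos - reference_pos) > tolerance}
```

Devices within tolerance receive no correction, which avoids constant small seeks that would themselves disturb playback.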
In some embodiments of the present application, the first display device sends heartbeat requests to the server to indicate that it is currently online, and the server receives them. Optionally, the first display device sends heartbeat requests periodically, and if the server does not receive one within a predetermined time, the first display device is considered offline. Optionally, after receiving a heartbeat request, the server returns the time at which the next heartbeat request should be sent, and the first display device sends its next request at that time; if the server does not receive a heartbeat request at that time, the first display device is considered offline. In this way, the first display device and the server monitor the device's online state through heartbeat interaction.
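The heartbeat-based presence check can be sketched as below: the server records the time of the last heartbeat and treats the device as offline once the agreed interval (plus an optional grace period, which is an assumption here) has elapsed without a new request.

```python
def is_online(last_heartbeat: float, now: float,
              interval: float, grace: float = 0.0) -> bool:
    """True while the time since the last heartbeat is within the agreed
    send interval (plus an optional grace period)."""
    return now - last_heartbeat <= interval + grace
```

When this returns False for the first display device, the server proceeds to the passive auditorium dismissal described next.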
When the server determines that the first display device is offline, it treats the auditorium as passively dismissed: it sends a dismissal notification to all second display devices in the auditorium and stops the interaction of the display devices in the auditorium, that is, it no longer forwards their interactive content. The video already playing in the auditorium, however, plays to its end and is unaffected by the dismissal.
In addition, in some embodiments of the present application, the first display device may actively send an auditorium dismissal request to the server. When the server receives such a request, it dismisses the auditorium and sends a dismissal notification to all second display devices in it. The server stops the interaction of the display devices in the auditorium, that is, it no longer forwards their interactive content.
Therefore, when a request to join the auditorium is received from a second display device corresponding to an invited friend, the server first verifies whether the current auditorium has been dismissed. Only when the auditorium has not been dismissed and the second display device meets the conditions for joining does the server allow the second display device to join the auditorium.
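The dismissal and join-verification logic above can be sketched as follows. The capacity check stands in for whatever "conditions for joining" an implementation enforces; the class name, the capacity limit, and the member set are assumptions, not details from the patent.

```python
class AuditoriumService:
    """Server-side auditorium lifecycle: dismissal and join verification."""

    def __init__(self, capacity=50):  # capacity is one assumed joining condition
        self.dismissed = False
        self.members = set()
        self.capacity = capacity

    def dismiss(self):
        """Dismiss the auditorium; interactive content is no longer forwarded."""
        self.dismissed = True

    def try_join(self, device_id):
        """Allow a second display device to join only if the auditorium is not
        dismissed and the joining conditions (here, capacity) are met."""
        if self.dismissed:
            return False
        if len(self.members) >= self.capacity:  # example joining condition
            return False
        self.members.add(device_id)
        return True
```

A join attempt after either an active dismissal request or a heartbeat-timeout (passive) dismissal takes the same `dismissed` branch, which matches the text's requirement to check dismissal first.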
According to the interaction method provided by some embodiments of the application, a user can complete social interaction with a mobile terminal through the display device. By combining the display device with the mobile terminal and exploiting the mobile terminal's advantages, such as its touch-screen interaction mode, the method improves the convenience of entering social information through the display device.
In some embodiments of the application, the mobile terminal parses the two-dimensional code of the chat room and, based on the parsed information, sends a request to the server for the auditorium message; the server returns the auditorium message to the mobile terminal, which receives it and displays it on its screen. The auditorium message presents the basic situation of the auditorium, such as the name of the auditorium's creator, an introduction to the video assets being played in the auditorium, and the current number of people online in the auditorium.
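A minimal sketch of the mobile-terminal side of this step, assuming the two-dimensional code decodes to a URL that carries the auditorium identifier in a query parameter. The parameter name `hall_id` and the request dictionary shape are hypothetical; the patent only says the parsed information is used to request the auditorium message.

```python
from urllib.parse import urlparse, parse_qs

def parse_chatroom_qr(qr_text):
    """Extract the auditorium identifier from the decoded QR-code URL.
    The URL layout (query parameter 'hall_id') is an assumption."""
    query = parse_qs(urlparse(qr_text).query)
    hall_id = query.get("hall_id", [None])[0]
    if hall_id is None:
        raise ValueError("QR code does not carry an auditorium identifier")
    return hall_id

def build_hall_info_request(hall_id):
    """Build the (hypothetical) request the mobile terminal sends to the server
    to obtain the auditorium message for display."""
    return {"type": "get_hall_info", "hall_id": hall_id}
```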
In some embodiments of the present application, the interactive messages comprise preset messages and user self-edited messages. When the display device receives an interactive message pushed by the server, it determines whether the message is a preset message or a user self-edited message. If the received interactive message is a preset message, the display device shows it on its screen, that is, displays it in bullet-screen form; if the received interactive message is a user self-edited message, the display device receives it but does not show it on its screen. Optionally, the display device deletes the user self-edited interactive message. When the mobile terminal receives an interactive message pushed by the server, it displays the message on its screen regardless of whether it is a preset message or a user self-edited message. A user self-edited message mainly refers to text and similar content entered by the user through the mobile terminal. Dividing interactive messages into preset messages and user self-edited messages both makes interactive messages convenient to send and keeps the display of interactive messages on the device side controllable.
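The display-device handling just described can be sketched as a single dispatch function. The field name `kind` and its values are assumptions standing in for however an implementation marks the message type; the identification-field embodiment described later is one concrete way to encode it.

```python
def handle_pushed_message(message, danmaku_queue):
    """Display-device handling of a pushed interactive message: preset
    messages are queued for bullet-screen display, user self-edited
    messages are received but discarded without being displayed."""
    if message.get("kind") == "preset":   # field name 'kind' is an assumption
        danmaku_queue.append(message["text"])
        return True    # displayed on the display device
    return False       # received but not displayed (optionally deleted)
```

The mobile terminal, by contrast, would display the message unconditionally, so it needs no such branch.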
In some embodiments of the present application, a user self-edited message is a message the user composes through human-machine interaction with the mobile terminal. When the mobile terminal receives a send instruction for a self-edited message, it sends the corresponding message to the server. To promote a healthy and secure network environment, the server audits every self-edited message it receives from a mobile terminal. When a self-edited message contains improper or unsafe content, such as sensitive words, the server does not push it, and the message is displayed only on the mobile terminal that sent it. Further, when the audit finds such content, the server may send a prompt or guidance back to the mobile terminal.
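The server-side audit can be sketched as follows. The word list is a placeholder, and the `scope`/`prompt` result shape is hypothetical; real auditing systems use far more sophisticated matching than substring search, so this is only a minimal illustration of the routing decision.

```python
SENSITIVE_WORDS = {"badword", "spoiler"}   # placeholder list; the real list is server policy

def audit_self_edited(text):
    """Return True if the self-edited message passes the audit."""
    lowered = text.lower()
    return not any(word in lowered for word in SENSITIVE_WORDS)

def route_self_edited(text):
    """Decide how far a self-edited message travels: broadcast to the chat
    room, or kept sender-only with a prompt back to the sending terminal."""
    if audit_self_edited(text):
        return {"scope": "broadcast"}
    return {"scope": "sender_only", "prompt": "Message contains restricted content"}
```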
To let the display device quickly identify whether received interactive information is preset information, in some embodiments of the present application a preset message identification field is optionally set in all preset information. When the display device receives an interactive message pushed by the server, it reads the message's identification field and checks whether it is the preset message identification field. When the identification field of the interactive message is the preset message identification field, the message is treated as a preset message, and the display device shows it on its screen.
In some embodiments, the preset message carries a preset message identifier while the user self-edited message does not; the identification field of a self-edited message may be empty or may carry an identifier characterizing self-editing. In other embodiments, the self-edited message carries a self-editing identifier while the identification field of the preset message may be empty or set to the preset message identifier. In some embodiments, the server filters messages by the identification field and audits the filtered self-edited messages, and the display device filters messages by the identification field so as to display only preset messages.
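Under the first convention above (preset messages carry the identifier, self-edited messages leave the field empty or absent), the identification-field check can be sketched as:

```python
PRESET_FLAG = "PRESET"   # assumed value of the preset-message identification field

def is_preset(message):
    """Classify a message by its identification field: a message whose field
    equals the preset identifier is a preset message; an empty, different,
    or missing field marks a user self-edited message."""
    return message.get("id_field") == PRESET_FLAG

def filter_for_display_device(messages):
    """Display-device-side filtering: keep only preset messages for display."""
    return [m for m in messages if is_preset(m)]
```

The field name `id_field` and the flag value are illustrative assumptions; under the alternative convention, the same functions would instead test for the self-editing identifier and invert the filter.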
Accordingly, in some embodiments of the present application, the user may send only preset information through the display device. The preset information includes preset expression information and preset text information. Optionally, when the display device receives an interactive message send instruction, it obtains the message corresponding to the instruction, adds the preset information identification field, and sends the resulting message to the server. Optionally, the preset information that is valid on the display device already carries the preset information identification field.
In addition, in some embodiments of the application, the server is responsible for putting the preset information into effect: after the auditorium is successfully created, the server manages the preset information, and after the mobile terminal joins the chat room, the server returns the set of preset information to the mobile terminal. Having received the set, the user can interact by selecting preset information from it. All other embodiments obtained by a person skilled in the art from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure herein is presented in terms of one or more exemplary examples, it is to be understood that each aspect of the disclosure may be utilized independently and separately from the other aspects.
It should be understood that the terms "first," "second," "third," and the like in the description, in the claims, and in the drawings of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It is to be understood that data so used are interchangeable under appropriate circumstances, so that, for example, some embodiments of the application can be implemented in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, so that a product or device comprising a list of elements is not necessarily limited to those elements explicitly listed but may include other elements not expressly listed or inherent to such a product or device.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. An interactive method for a display device, the method comprising:
when video playing is carried out in an auditorium service, receiving an interactive message pushed by a server, wherein the interactive message is sent to the server by a mobile terminal;
if the interactive message is generated by the mobile terminal according to a preset message, displaying the interactive message on the video playing interface;
and if the interactive message is an interactive message which is not generated by the mobile terminal according to a preset message, not displaying the interactive message on the video playing interface.
2. The method of claim 1, wherein before receiving the interactive message pushed by the server, the method further comprises:
sending a request for creating an auditorium to a server to create an auditorium service, wherein the auditorium service is used for simultaneously sending video data of the same video to different display devices so that the different display devices play the same video synchronously;
and creating a chat room service based on the created identifier of the auditorium service, wherein the auditorium service and the chat room service are in one-to-one correspondence.
3. The method of claim 2, wherein creating a chat room service based on the created identification of the auditorium service comprises:
when the video playing is carried out in the auditorium service, an instruction of starting the chat room service by a user is received;
obtaining the information of the chat room service according to the unique identifier of the auditorium service;
and generating and displaying a coded graph for representing the chat room service address according to the chat room service information, wherein the coded graph is used for enabling the mobile terminal to access the chat room service in a code scanning mode.
4. The method of claim 1, wherein
the interactive message generated by the mobile terminal according to a preset message is an interactive message generated according to the user's selection of a preset message on the mobile terminal, and
the interactive message not generated by the mobile terminal according to a preset message is an interactive message generated according to characters edited by the user on the mobile terminal.
5. The method of claim 1, wherein the interactive message is provided with an identification field, and after receiving the interactive message pushed by the server, the method further comprises:
and determining whether the interactive message is generated by the mobile terminal according to the preset message according to the characters of the identification field and the preset message identification characters.
6. An interaction method for a mobile terminal, the method comprising:
scanning a coding graph on a display device, wherein the coding graph is generated according to an identifier of a video showing hall service when the display device plays a video in the video showing hall service;
loading an interactive message editing interface according to the URL address obtained by analyzing the coded graph;
receiving, on the interactive message editing interface, a selection by a user of a preset message to generate a first interactive message, wherein the first interactive message is used for being displayed both on the mobile terminal and on the display device corresponding to the auditorium service;
and receiving, on the interactive message editing interface, input of characters edited by the user to generate a second interactive message, wherein the second interactive message is used for being displayed only on the mobile terminal corresponding to the auditorium service and not on the display device corresponding to the auditorium service.
7. The method of claim 6, wherein the identification field of the first interactive message is a preset message identification character and the identification field of the second interactive message is not a preset message identification character; the identification field is used for enabling the display device to identify whether the received interactive message is generated by selecting a preset message on the mobile terminal by a user.
8. A display device, comprising:
a display configured to display a user interface and a video playback interface;
a controller for communicative connection with the display, the controller configured to:
when video playing is carried out in the auditorium service, receiving an interactive message pushed by a server, wherein the interactive message is sent to the server by a mobile terminal;
if the interactive message is generated by the mobile terminal according to a preset message, displaying the interactive message on the video playing interface;
and if the interactive message is an interactive message which is not generated by the mobile terminal according to a preset message, not displaying the interactive message on the video playing interface.
9. The display device of claim 8, wherein before receiving the server-pushed interactive message, the controller is further configured to:
sending a request for creating an auditorium to a server to create an auditorium service, wherein the auditorium service is used for simultaneously sending video data of the same video to different display devices so that the different display devices play the same video synchronously;
and creating a chat room service based on the created identifier of the auditorium service, wherein the auditorium service and the chat room service are in one-to-one correspondence.
10. The display device of claim 9, wherein creating a chat room service based on the identification of the created auditorium service comprises:
when the video playing is carried out in the auditorium service, an instruction of starting the chat room service by a user is received;
obtaining the information of the chat room service according to the unique identifier of the auditorium service;
and generating and displaying a coded graph for representing the chat room service address according to the chat room service information, wherein the coded graph is used for enabling the mobile terminal to access the chat room service in a code scanning mode.
11. The display device according to claim 8, wherein the interactive message generated by the mobile terminal according to a preset message is an interactive message generated according to the user's selection of a preset message on the mobile terminal, and
the interactive message not generated by the mobile terminal according to a preset message is an interactive message generated according to characters edited by the user on the mobile terminal.
12. The display device according to claim 8, wherein the interactive message is provided with an identification field, and after receiving the interactive message pushed by the server, the controller is configured to:
and determining whether the interactive message is generated by the mobile terminal according to the preset message according to the characters of the identification field and the preset message identification characters.
13. A mobile terminal, comprising:
a display configured to display a user interface and a play screen;
a controller for communicative connection with the display, the controller configured to:
scanning a coding graph on a display device, wherein the coding graph is generated according to an identifier of a video showing hall service when the display device plays a video in the video showing hall service;
loading an interactive message editing interface according to the URL address obtained by analyzing the coded graph;
receiving, on the interactive message editing interface, a selection by a user of a preset message to generate a first interactive message, wherein the first interactive message is used for being displayed both on the mobile terminal and on the display device corresponding to the auditorium service;
and receiving, on the interactive message editing interface, input of characters edited by the user to generate a second interactive message, wherein the second interactive message is used for being displayed only on the mobile terminal corresponding to the auditorium service and not on the display device corresponding to the auditorium service.
CN202010224068.3A 2019-08-18 2020-03-26 Interaction method, display device and mobile terminal Pending CN112399263A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080024297.9A CN113661715B (en) 2019-08-18 2020-08-11 Service management method, interaction method, display equipment and mobile terminal for projection hall
PCT/CN2020/108503 WO2021031940A1 (en) 2019-08-18 2020-08-11 Screening room service management method, interaction method, display device, and mobile terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910761454 2019-08-18
CN2019107614543 2019-08-18

Publications (1)

Publication Number Publication Date
CN112399263A true CN112399263A (en) 2021-02-23

Family

ID=74603745

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010224068.3A Pending CN112399263A (en) 2019-08-18 2020-03-26 Interaction method, display device and mobile terminal

Country Status (1)

Country Link
CN (1) CN112399263A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105263055A (en) * 2015-09-02 2016-01-20 康佳集团股份有限公司 Interactive method and interactive system of television live broadcasting
US9253223B1 (en) * 2013-01-23 2016-02-02 Google Inc. Live interaction in persistent conversations
CN106375829A (en) * 2016-08-31 2017-02-01 腾讯科技(深圳)有限公司 Video comment method, and related device and system
CN109391850A (en) * 2017-08-02 2019-02-26 腾讯科技(深圳)有限公司 The method, apparatus and storage medium of interaction message in video page


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112689202A (en) * 2021-03-22 2021-04-20 北京达佳互联信息技术有限公司 Live broadcast room message processing method and device, server and storage medium
CN112689202B (en) * 2021-03-22 2021-06-15 北京达佳互联信息技术有限公司 Live broadcast room message processing method and device, server and storage medium
US11553256B2 (en) 2021-03-22 2023-01-10 Beijing Dajia Internet Information Technology Co., Ltd. Method and device for processing message in live broadcast room
WO2022206606A1 (en) * 2021-03-29 2022-10-06 西安青松光电技术有限公司 Control method and apparatus for led display device, storage medium, and electronic device
CN113660504A (en) * 2021-08-18 2021-11-16 北京百度网讯科技有限公司 Message display method and device, electronic equipment and storage medium
CN113660504B (en) * 2021-08-18 2024-04-16 北京百度网讯科技有限公司 Message display method, device, electronic equipment and storage medium
CN114125566A (en) * 2021-12-29 2022-03-01 阿里巴巴(中国)有限公司 Interaction method and system and electronic equipment
CN114125566B (en) * 2021-12-29 2024-03-08 阿里巴巴(中国)有限公司 Interaction method, interaction system and electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
