CN112399264A - Projection hall service management method and application - Google Patents

Projection hall service management method and application

Info

Publication number
CN112399264A
Authority
CN
China
Prior art keywords
video
auditorium
display device
service
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010380112.XA
Other languages
Chinese (zh)
Other versions
CN112399264B (en)
Inventor
张晓东
李园园
周润升
张善鹏
薛涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Media Network Technology Co Ltd
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Media Network Technology Co Ltd filed Critical Qingdao Hisense Media Network Technology Co Ltd
Priority to PCT/CN2020/108503 (WO2021031940A1)
Priority to CN202080024297.9A (CN113661715B)
Publication of CN112399264A
Application granted
Publication of CN112399264B
Legal status: Active (current)
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/4104: Peripherals receiving signals from specially adapted client devices
    • H04N21/4122: Peripherals receiving signals from specially adapted client devices; additional display device, e.g. video projector
    • H04N21/426: Internal components of the client; Characteristics thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302: Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/443: OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/47: End-user applications
    • H04N21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788: Supplemental services communicating with other users, e.g. chatting
    • H04N21/4781: Games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides an auditorium service management method and application. The method includes: sending a request for creating an auditorium to a server, wherein the request is used for enabling the server to create an auditorium service, and the auditorium service is used for enabling different display devices accessing it to simultaneously play a first video corresponding to a first video identifier; receiving an identifier of the auditorium service; accessing the auditorium service according to the identifier to receive video data of the first video fed back by the server according to the first video identifier, and playing the first video according to the video data; and starting a camera of the first display device to acquire local video, and displaying it in a local video window on the playing interface for playing the first video. With this auditorium service management method and application, watching a video and making a video call can be completed in one scene without switching back and forth; during video watching, other friends can be invited to join smoothly without interrupting video playback.

Description

Projection hall service management method and application
Technical Field
The application relates to the technical field of the internet, and in particular to an auditorium service management method and application.
Background
At present, products providing video playing capability and products providing video call capability are both available on the market. For example, with the development of smart televisions, cameras are gradually being integrated into smart televisions and video call applications are installed on them, so that a voice and video call function can be realized through the television; combined with the television's own video playing function or a video playing application on the television, a "chat while watching" function is further realized.
However, "chat while watching" as currently implemented on a television generally means that the user opens the video call application to establish a video call with a friend while watching a movie. Specifically, the user needs to open the video call application to establish the call and then open the video playing application to play the video. Because friends start the video at different times, their playback progress cannot stay consistent; keeping it consistent may require the friends to adjust the playback progress at the same time. Moreover, if another friend needs to be added to the chat, the current video playback has to be interrupted, the user has to switch to the video call application to invite the friend and then return to the video application to resume playback, so the playback progress between friends and family diverges again. What users actually need is to watch a video synchronously while carrying out a real-time video call, so the current television-based way of realizing "chat while watching" is not the real-time shared viewing and chatting that users need.
Disclosure of Invention
The application provides an auditorium service management method and application, with which a user can watch a movie synchronously with others and carry out a video call in the same scene, truly realizing "chat while watching".
In a first aspect, the present application provides an auditorium service management method, applied to a first display device, the method including:
sending a request for creating an auditorium to a server, wherein the request for creating the auditorium comprises a first video identifier, the request for creating the auditorium is used for enabling the server to create an auditorium service, and the auditorium service is used for enabling different display devices accessing the auditorium service to simultaneously play a first video corresponding to the first video identifier;
receiving an identifier of the auditorium service, wherein the identifier of the auditorium service is sent by the server after the auditorium service is successfully created;
in response to receiving the identifier of the auditorium service, accessing the auditorium service according to the identifier, so that the first display device receives video data of the first video fed back by the server according to the first video identifier and plays the first video according to the video data;
and in response to receiving the identifier of the auditorium service, starting a camera of the first display device to acquire local video data, displaying the local video data in a local video window on the playing interface for playing the first video, and sending the local video data to the server.
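For illustration only, the Kotlin sketch below models the client-side flow of the first aspect described above. The type names, the AuditoriumServer interface, and its methods are assumptions made for the sketch and are not part of the disclosure.

```kotlin
// Hypothetical sketch of the first display device's flow; all names are illustrative, not from the disclosure.
data class CreateAuditoriumRequest(val firstVideoId: String)
data class AuditoriumServiceId(val value: String)

interface AuditoriumServer {                       // assumed server API, transport unspecified
    fun createAuditorium(request: CreateAuditoriumRequest): AuditoriumServiceId
    fun fetchVideoData(serviceId: AuditoriumServiceId, videoId: String): ByteArray
    fun uploadLocalVideo(serviceId: AuditoriumServiceId, frame: ByteArray)
}

class FirstDisplayDevice(private val server: AuditoriumServer) {
    fun startAuditorium(firstVideoId: String) {
        // 1. Send the "create auditorium" request carrying the first video identifier.
        val serviceId = server.createAuditorium(CreateAuditoriumRequest(firstVideoId))

        // 2. On receiving the auditorium service identifier, access the service and
        //    play the first video from the data the server feeds back.
        val videoData = server.fetchVideoData(serviceId, firstVideoId)
        playVideo(videoData)

        // 3. Start the local camera, show the captured picture in a local video window
        //    over the playback interface, and push it to the server for other participants.
        val frame = captureCameraFrame()
        showLocalVideoWindow(frame)
        server.uploadLocalVideo(serviceId, frame)
    }

    private fun playVideo(data: ByteArray) { /* decode and render the first video */ }
    private fun captureCameraFrame(): ByteArray = ByteArray(0)          // placeholder capture
    private fun showLocalVideoWindow(frame: ByteArray) { /* overlay on the playback UI */ }
}
```

In this sketch the server is abstracted behind an interface; a real display device would carry these calls over whatever transport the server exposes.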
In a second aspect, the present application provides an auditorium service management method, applied to a server, where the method includes:
receiving a request for creating an auditorium sent by a first display device, wherein the request for creating the auditorium comprises a first video identifier;
in response to the received request for creating the auditorium, creating an auditorium service, wherein the auditorium service is used for enabling different display devices accessing the auditorium service to simultaneously play a first video corresponding to the first video identifier;
sending an identifier of the auditorium service to the first display device, wherein the identifier of the auditorium service is used for informing the first display device that the auditorium has been successfully created;
receiving access by the first display device according to the identifier of the auditorium service, and feeding back video data of the first video to the first display device according to the first video identifier;
and receiving local video data acquired by the first display device through its camera.
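As a hedged counterpart on the server side, the following Kotlin sketch shows one way the second aspect above could be organised; the in-memory registry, the identifier generation, and the relay method are assumptions made purely for illustration.

```kotlin
// Hypothetical server-side sketch; the registry, identifier generation and relay logic are assumptions.
import java.util.UUID
import java.util.concurrent.ConcurrentHashMap

data class Auditorium(val firstVideoId: String, val members: MutableSet<String> = mutableSetOf())

class AuditoriumService {
    private val auditoriums = ConcurrentHashMap<String, Auditorium>()

    // Create an auditorium for the requested video and return its identifier to the creator.
    fun createAuditorium(firstVideoId: String, creatorDeviceId: String): String {
        val serviceId = UUID.randomUUID().toString()
        auditoriums[serviceId] = Auditorium(firstVideoId, mutableSetOf(creatorDeviceId))
        return serviceId
    }

    // Feed back video data of the first video to a device that accessed the auditorium.
    fun streamFirstVideo(serviceId: String): String? =
        auditoriums[serviceId]?.let { "video-data-for:${it.firstVideoId}" }   // placeholder payload

    // Receive local (camera) video data from one member and fan it out to the other members.
    fun relayLocalVideo(serviceId: String, fromDeviceId: String, frame: ByteArray) {
        val room = auditoriums[serviceId] ?: return
        room.members.filter { it != fromDeviceId }
            .forEach { target -> sendToDevice(target, frame) }
    }

    private fun sendToDevice(deviceId: String, frame: ByteArray) { /* push the frame to that device */ }
}
```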
In a third aspect, the present application provides a display device comprising:
a display configured to display a user interface, a video playing interface, and local video data of the display device;
a controller communicatively connected with the display, the controller being configured to:
send a request for creating an auditorium to a server, wherein the request for creating the auditorium comprises a first video identifier, the request for creating the auditorium is used for enabling the server to create an auditorium service, and the auditorium service is used for enabling different display devices accessing the auditorium service to simultaneously play a first video corresponding to the first video identifier;
receive an identifier of the auditorium service, wherein the identifier of the auditorium service is sent by the server after the auditorium service is successfully created;
in response to receiving the identifier of the auditorium service, access the auditorium service according to the identifier, so that the display device receives video data of the first video fed back by the server according to the first video identifier and plays the first video according to the video data;
and in response to receiving the identifier of the auditorium service, start a camera of the display device to acquire local video data, display the local video data in a local video window on the playing interface for playing the first video, and send the local video data to the server.
The application provides an auditorium service management method and application. A first display device sends a request for creating an auditorium to a server; the server creates an auditorium service according to the received request and returns an identifier of the auditorium service to the first display device. In response to receiving the identifier, the first display device plays the first video, starts a camera to acquire local video data, displays the local video data in a local video window on the playing interface for playing the first video, and sends the local video data to the server, so that the server can forward the local video data of the first display device to a second display device. Meanwhile, the second display device can receive the video data of the first video and the local video data of the first display device by accessing the auditorium service. Therefore, with the auditorium service management method and application, watching a video and making a video call can be completed in one scene without switching back and forth; during video watching, other friends can be invited to join smoothly without interrupting video playback, realizing multi-user chatting while viewing synchronously.
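To make the multi-device flow summarised above easier to follow, here is a purely illustrative message sequence printed by a trivial Kotlin program; the message names and the transport are assumptions, not part of the disclosure.

```kotlin
// Purely illustrative message sequence for the flow summarised above; names and ordering are assumptions.
fun main() {
    val sequence = listOf(
        "DeviceA -> Server : createAuditorium(firstVideoId)",
        "Server  -> DeviceA: auditoriumServiceId",
        "DeviceA -> Server : access(auditoriumServiceId)        // first video data fed back, playback starts",
        "DeviceA -> Server : localVideoData(cameraA)            // also shown in DeviceA's local window",
        "DeviceB -> Server : access(auditoriumServiceId)        // a second display device joins mid-playback",
        "Server  -> DeviceB: firstVideoData + localVideoData(cameraA)",
        "Server  -> DeviceA: localVideoData(cameraB)"
    )
    sequence.forEach(::println)
}
```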
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; it is apparent that those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus in some embodiments;
Fig. 2 is a block diagram of the hardware configuration of the control apparatus 100 in some embodiments;
Fig. 3 is a block diagram of the hardware configuration of the display device 200 in some embodiments;
Fig. 4 is a block diagram of the hardware architecture of the display device 200 of fig. 3;
Fig. 5 is a schematic diagram of the functional configuration of the display device 200 in some embodiments;
Fig. 6a is a schematic diagram of the software configuration of the display device 200 in some embodiments;
Fig. 6b is a schematic diagram of the configuration of applications in the display device 200 in some embodiments;
Fig. 7 is a schematic diagram of a user interface in the display device 200 in some embodiments;
Fig. 8 is a first interface diagram displayed by the display device in some embodiments;
Fig. 9 is a second interface diagram displayed by the display device in some embodiments;
Fig. 10 is a third interface diagram displayed by the display device in some embodiments;
Fig. 11 is a fourth interface diagram displayed by the display device in some embodiments;
Fig. 12 is a fifth interface diagram displayed by the display device in some embodiments;
Fig. 13 is a sixth interface diagram displayed by the display device in some embodiments;
Fig. 14 is a timing diagram of an implementation of a chat-while-watching method according to an embodiment;
Fig. 15 is a seventh interface diagram displayed by the display device in some embodiments;
Fig. 16 is an eighth interface diagram displayed by the display device in some embodiments.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
The present application relates to a display device including at least two systems-on-chip; for ease of understanding, a display device with a multi-chip structure is described here.
For the convenience of users, various external device interfaces are usually provided on a display device to facilitate connecting different peripheral devices or cables to implement corresponding functions. When a high-definition camera is connected to an interface of the display device, if the hardware system of the display device does not have a hardware interface capable of receiving the source data of a high-pixel camera, the data collected by the camera cannot be presented on the display screen of the display device.
Furthermore, due to hardware limitations, the hardware system of a conventional display device supports only one path of hard decoding resources and usually supports video decoding with a resolution of at most 4K. Therefore, when a user wants to carry out a video chat while watching internet television, the hard decoding resource (usually the GPU in the hardware system) has to be used to decode the network video so as not to reduce the definition of the network video picture; in this case, the video chat picture can only be processed by soft decoding with a general-purpose processor (e.g. the CPU) in the hardware system.
Processing the video chat picture by soft decoding greatly increases the data processing burden on the CPU, and when that burden is too heavy, the picture may stutter or play unsmoothly. Further, limited by the data processing capability of the CPU, multi-channel video calls usually cannot be implemented while the CPU soft-decodes the video chat picture, so when the user wants to video chat with several other users in the same chat scene, access is blocked.
In view of the above aspects, to overcome the above drawbacks, the present application discloses a dual hardware system architecture to implement multiple channels of video chat data (at least one channel of local video).
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following descriptions of the concepts are only for the purpose of facilitating understanding of the contents of the present application, and do not represent limitations on the scope of the present application.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the present application) that can wirelessly control the electronic device, typically over a short distance. The component is generally connected to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a hand-held touch remote controller replaces most of the physical built-in hard keys of a conventional remote control device with a user interface on a touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the control device 100.
The control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-range communication, and is used to control the display device 200 in a wireless manner or another wired manner. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc. to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller to control the display device 200.
The control apparatus 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a notebook computer, etc., which may communicate with the display device 200 through a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), or other networks, and implement control of the display device 200 through an application program corresponding to the display device 200.
For example, the mobile terminal 100B and the display device 200 may each have a software application installed thereon, so that connection communication between the two can be realized through a network communication protocol, and the purpose of one-to-one control operation and data communication can be further realized. Such as: a control instruction protocol can be established between the mobile terminal 100B and the display device 200, a remote control keyboard is synchronized to the mobile terminal 100B, and the function of controlling the display device 200 is realized by controlling a user interface on the mobile terminal 100B; the audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
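As a purely illustrative aside (not taken from the disclosure), a control instruction protocol between the mobile terminal 100B and the display device 200 could be as simple as the following Kotlin sketch; the message fields and the pipe-separated wire format are assumptions.

```kotlin
// Hypothetical control instruction exchanged between mobile terminal 100B and display device 200.
data class ControlInstruction(
    val deviceId: String,        // target display device
    val key: String,             // e.g. "VOLUME_UP", "CHANNEL_DOWN", "OK"
    val payload: String? = null  // optional data, e.g. text typed on the synchronized remote keyboard
)

// Assumed pipe-separated wire format; a real application might use JSON over TCP or a WebSocket.
fun encode(instr: ControlInstruction): String =
    listOf(instr.deviceId, instr.key, instr.payload ?: "").joinToString("|")

fun decode(line: String): ControlInstruction {
    val (deviceId, key, payload) = line.split("|", limit = 3)
    return ControlInstruction(deviceId, key, payload.ifEmpty { null })
}

fun main() {
    val wire = encode(ControlInstruction(deviceId = "TV-200", key = "VOLUME_UP"))
    println(wire)              // TV-200|VOLUME_UP|
    println(decode(wire).key)  // VOLUME_UP
}
```

The point is only that key presses and keyboard input from the synchronized remote-control UI are serialized into instructions the display device can parse; the actual protocol is left open by the text.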
As shown in fig. 1, the display apparatus 200 may also perform data communication with the server 300 through various communication means. In various embodiments of the present application, the display device 200 may be allowed to be communicatively coupled to the server 300 via a local area network, a wireless local area network, or other network. The server 300 may provide various contents and interactions to the display apparatus 200.
Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through Electronic Program Guide (EPG) interactions. The servers 300 may be one group or multiple groups, and may be of one or more types. The server 300 provides video on demand, advertisement services, and other network service content.
The display device 200 may be, for example, a liquid crystal display, an OLED (Organic Light Emitting Diode) display, or a projection display device; alternatively, the display device may be a display system consisting of a smart television, or of a display and a set-top box. The specific display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be changed in performance and configuration as needed.
The display device 200 may additionally provide an intelligent network TV function that provides a computer support function in addition to the broadcast receiving TV function, for example a web TV, a smart TV, an Internet Protocol TV (IPTV), and the like. In some embodiments, the display device may not have the broadcast receiving television function.
As shown in fig. 1, the display device may be connected or provided with a camera, and is configured to present a picture taken by the camera on a display interface of the display device or other display devices, so as to implement interactive chat between users. Specifically, the picture shot by the camera can be displayed on the display device in a full screen mode, a half screen mode or any optional area.
As an optional connection mode, the camera is connected with the rear shell of the display through a connecting plate and is fixedly installed in the middle of the upper side of the rear shell; as another installable mode, it may be fixedly installed at any position of the rear shell, as long as the image acquisition area is not blocked by the rear shell, for example with the image acquisition area facing the same direction as the display device.
As another alternative connection mode, the camera is connected to the rear shell of the display through a connecting plate or another conceivable connector on which a lifting motor is provided, so that the camera can be raised and lowered: when the user or an application program wants to use the camera, it rises out of the display, and when it is not needed it can retract into the rear shell so as to be protected from damage.
As an embodiment, the camera used in the present application may have 16 million pixels (16 MP), so as to achieve ultra-high-definition display. In actual use, a camera with more or fewer than 16 million pixels may also be used.
After the camera is installed on the display device, the contents displayed in different application scenes of the display device can be fused in various modes, thereby achieving functions that a traditional display device cannot realize.
Illustratively, a user may conduct a video chat with at least one other user while watching a video program. The presentation of the video program may be as a background frame over which a window for video chat is displayed. The function is called 'chat while watching'.
Optionally, in a scene of "chat while watching", at least one video chat is performed across terminals while watching a live video or a network video.
In another example, a user can conduct a video chat with at least one other user while entering the educational application for learning. For example, a student may interact remotely with a teacher while learning content in an educational application. Vividly, this function can be called "chatting while learning".
In another example, a user conducts a video chat with a player entering a card game while playing the game. For example, a player may enable remote interaction with other players when entering a gaming application to participate in a game. Figuratively, this function may be referred to as "watch while playing".
Optionally, the game scene is fused with the video picture: the portrait in the video picture is cut out (matted) and displayed in the game picture, which improves the user experience.
Optionally, in motion-sensing games (such as ball games, boxing, running and dancing), human posture and motion, limb detection and tracking, and human skeleton key point data are obtained through the camera and then fused with the animation in the game, so that games in scenes such as sports and dancing are realized.
In another example, a user may interact with at least one other user in a karaoke application in video and voice. Vividly, this function can be called "sing while watching". Preferably, when at least one user enters the application in a chat scenario, a plurality of users can jointly complete recording of a song.
In another example, a user may turn on a camera locally to take pictures and videos, figurative, which may be referred to as "looking into the mirror".
In other examples, more or less functionality may be added. The function of the display device is not particularly limited in the present application.
Fig. 2 is a block diagram schematically showing the configuration of the control apparatus 100 according to the exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200, and to receive an input operation instruction from a user, and convert the operation instruction into an instruction recognizable and responsive by the display device 200, and to mediate interaction between the user and the display device 200. Such as: the user operates the channel up/down key on the control device 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 100B or other intelligent electronic device may function similar to the control apparatus 100 after installing an application for manipulating the display device 200. Such as: the user may implement the functions of controlling the physical keys of the apparatus 100 by installing applications, various function keys or virtual buttons of a graphical user interface available on the mobile terminal 100B or other intelligent electronic devices.
The controller 110 includes a processor 112, a RAM 113, a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the running and operation of the control device 100, including communication and coordination among the internal components and external and internal data processing functions.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communicator 130 may include at least one of a WIFI module 131, a bluetooth module 132, an NFC module 133, and the like.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. For another example, when the radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 through the radio frequency sending terminal.
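The disclosure does not name a specific infrared control protocol, so purely for illustration the sketch below packs a key command into an NEC-style 32-bit IR frame (address, inverted address, command, inverted command); the address and command values are hypothetical.

```kotlin
// Illustrative only: builds an NEC-style 32-bit IR frame; the patent does not specify this protocol.
fun necFrame(address: Int, command: Int): Long {
    require(address in 0..0xFF && command in 0..0xFF) { "address and command are single bytes" }
    val addr = address.toLong()
    val cmd = command.toLong()
    // Byte layout: address, bitwise-inverted address, command, bitwise-inverted command.
    return (addr shl 24) or ((addr.inv() and 0xFFL) shl 16) or (cmd shl 8) or (cmd.inv() and 0xFFL)
}

fun main() {
    val volumeUpCommand = 0x18            // hypothetical command byte for a "volume up" key press
    val frame = necFrame(address = 0x00, command = volumeUpCommand)
    println("IR frame: 0x%08X".format(frame))   // the value the IR sending module would modulate
}
```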
In some embodiments, the control device 100 includes at least one of a communicator 130 and an output interface. The communicator 130, such as a WiFi, Bluetooth or NFC module, is configured in the control device 100 and may send the user input command to the display device 200 encoded according to the WiFi protocol, the Bluetooth protocol, or the NFC protocol.
And a memory 190 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 190 may store various control signal commands input by a user.
And a power supply 180 for providing operational power support to the components of the control device 100 under the control of the controller 110, which may include a battery and associated control circuitry.
A hardware configuration block diagram of a hardware system in the display apparatus 200 according to an exemplary embodiment is exemplarily shown in fig. 3.
When a dual hardware system architecture is adopted, the relationship between the hardware systems can be as shown in fig. 3. For convenience of description, one hardware system in the dual hardware system architecture is referred to below as the first hardware system, A system, or A chip, and the other hardware system as the second hardware system, N system, or N chip. The A chip comprises a controller of the A chip and various modules connected to that controller through various interfaces, and the N chip comprises a controller of the N chip and various modules connected to that controller through various interfaces. The A chip and the N chip may each run a relatively independent operating system, and the two operating systems may communicate with each other through a communication protocol; for example, the framework layer of the operating system of the A chip and the framework layer of the operating system of the N chip can communicate to transmit commands and data, so that two independent but interrelated subsystems exist in the display device 200.
As shown in fig. 3, the a chip and the N chip may be connected, communicated and powered through a plurality of different types of interfaces. The interface type of the interface between the a chip and the N chip may include a General-purpose input/output (GPIO) interface, a USB interface, an HDMI interface, a UART interface, and the like. One or more of these interfaces may be used for communication or power transfer between the a-chip and the N-chip. For example, as shown in fig. 3, in the dual hardware system architecture, the N chip may be powered by an external power source (power), and the a chip may not be powered by the external power source but by the N chip.
In addition to the interface for connecting with the N chip, the a chip may further include an interface for connecting other devices or components, such as an MIPI interface for connecting a Camera (Camera) shown in fig. 3, a bluetooth interface, and the like.
Similarly, in addition to the interface for connecting with the A chip, the N chip may further include a VBY interface for connecting with the display screen TCON (timing controller), an I2S interface for connecting with a power amplifier (AMP) and a speaker, and an IR/Key interface, a USB interface, a WiFi interface, a Bluetooth interface, an HDMI interface, a Tuner interface, and the like.
The dual hardware system architecture of the present application is further described below with reference to fig. 4. It should be noted that fig. 4 is only an exemplary illustration of the dual hardware system architecture of the present application, and does not represent a limitation of the present application. In actual practice, both hardware systems may contain more or less hardware or interfaces as desired.
A block diagram of the hardware architecture of the display device 200 according to fig. 3 is exemplarily shown in fig. 4. As shown in fig. 4, the hardware system of the display device 200 may include an a chip and an N chip, and a module connected to the a chip or the N chip through various interfaces.
The N chip may include a tuner demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 270, and a power supply. The N chip may also include more or fewer modules in other embodiments.
The tuner demodulator 220 is configured to perform modulation and demodulation processing such as amplification, mixing and resonance on a broadcast television signal received in a wired or wireless manner, so as to demodulate, from a plurality of wireless or wired broadcast television signals, the audio/video signal carried in the frequency of the television channel selected by the user, as well as additional information (e.g., an EPG data signal). Depending on the broadcast system of the television signal, the signal path of the tuner demodulator 220 may be terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; depending on the modulation type, the modulation mode of the signal may be digital or analog; and depending on the type of television signal being received, the tuner demodulator 220 may demodulate analog and/or digital signals.
The tuner demodulator 220 is also operative to respond to the user-selected television channel frequency and the television signals carried thereby, in accordance with the user selection, and as controlled by the controller 210.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the external device interface 250.
The communicator 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 230 may include a WIFI module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The display apparatus 200 may establish a connection of a control signal and a data signal with an external control apparatus or a content providing apparatus through the communicator 230. For example, the communicator may receive a control signal of the remote controller 100A according to the control of the controller.
The external device interface 250 is a component for providing data transmission between the N-chip controller 210 and the a-chip and other external devices. The external device interface may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 250 may include: a High Definition Multimedia Interface (HDMI) terminal 251, a Composite Video Blanking Sync (CVBS) terminal 252, an analog or digital component terminal 253, a Universal Serial Bus (USB) terminal 254, a red, green, blue (RGB) terminal (not shown), and the like. The number and type of external device interfaces are not limited by this application.
The controller 210 controls the operation of the display device 200 and responds to the user's operation by running various software control programs (e.g., an operating system and/or various application programs) stored on the memory 290.
As shown in fig. 4, the controller 210 includes a read-only memory ROM 213, a random access memory RAM 214, a graphics processor 216, a CPU processor 212, a communication interface, and a communication bus. The ROM 213, the RAM 214, the graphics processor 216, the CPU processor 212, and the communication interface are connected via the bus.
The ROM 213 is used for storing instructions for various system boots. When the display device 200 is powered on upon receipt of the power-on signal, the CPU processor 212 executes the system boot instructions in the ROM 213 and copies the operating system stored in the memory 290 to the RAM 214 to start running the boot operating system. After the operating system has started, the CPU processor 212 copies the various application programs in the memory 290 to the RAM 214 and then starts running and launching them.
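The boot order described here (ROM boot code, then the operating system image, then the applications) can be illustrated by the following toy Kotlin simulation; the storage contents and names are invented for illustration and only mirror the reference numerals in the text.

```kotlin
// Toy simulation of the described power-on order; storage contents are invented, numerals mirror the text.
class DisplayDeviceBoot {
    private val rom213 = listOf("system-boot-instructions")          // ROM: boot code only
    private val memory290 = mapOf<String, Any>(                      // memory 290: OS image and applications
        "os" to "operating-system-image",
        "apps" to listOf("launcher", "video-player", "video-chat")
    )
    private val ram214 = mutableMapOf<String, Any>()                 // RAM: populated during boot

    fun powerOn() {
        // 1. The CPU executes the boot instructions held in ROM.
        println("executing ${rom213.first()} from ROM 213")
        // 2. The operating system image is copied from memory 290 into RAM and started.
        ram214["os"] = memory290.getValue("os")
        println("operating system running from RAM 214")
        // 3. Only after the OS is up are the applications copied into RAM and launched.
        ram214["apps"] = memory290.getValue("apps")
        println("applications started: ${ram214["apps"]}")
    }
}

fun main() = DisplayDeviceBoot().powerOn()
```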
A graphics processor 216 for generating various graphics objects, such as icons, operation menus, and graphics displayed for user input instructions. It includes an operator that performs operations by receiving the various interactive instructions input by the user and renders various objects according to display attributes, and a renderer that generates the various objects based on the operator and displays the rendered result on the display 280.
A CPU processor 212 for executing operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include a main processor and one or more sub-processors. The main processor is used to perform some operations of the display device 200 in a pre-power-up mode and/or to display a picture in the normal mode. The one or more sub-processors are used to perform operations in a standby mode or the like.
The communication interfaces may include a first interface 218-1, a second interface 218-2 through an nth interface 218-n. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
The object may be any one of the selectable objects, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying the page, document or image linked to by a hyperlink, or running the program corresponding to an icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, keyboard, touch pad, etc.) connected to the display device 200, or a voice command corresponding to speech uttered by the user.
The memory 290 includes a memory for storing various software modules for driving and controlling the display apparatus 200. Such as: various software modules stored in memory 290, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
The basic module is a bottom layer software module for signal communication between hardware in the display device 200 and sending processing and control signals to an upper layer module. The detection module is a management module used for collecting various information from various sensors or user input interfaces, and performing digital-to-analog conversion and analysis management.
For example: the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display 280 to display image content, and may be used to play information such as multimedia image content and UI interface. The communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing data communication between the browsing servers. The service module is a module for providing various services and various application programs.
Meanwhile, the memory 290 is also used to store visual effect maps and the like for receiving external data and user data, images of respective items in various user interfaces, and a focus object.
A user input interface for transmitting an input signal of a user to the controller 210 or transmitting a signal output from the controller to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may send an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by a user to the user input interface, and then the input signal is forwarded to the controller by the user input interface; alternatively, the control device may receive an output signal such as audio, video, or data output from the user input interface via the controller, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 280.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream; for example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, such as a 24 Hz, 25 Hz, 30 Hz, or 60 Hz video, into a 60 Hz, 120 Hz, or 240 Hz frame rate, where the input frame rate may be related to the source video stream and the output frame rate may be related to the refresh rate of the display. This is commonly implemented by frame insertion (frame interpolation).
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
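A highly simplified Kotlin sketch of the module chain described above (demultiplex, decode and scale, compose with the GUI, convert the frame rate, format for display) follows; the data types and the 24 Hz to 60 Hz repetition pattern are illustrative assumptions, not the implementation of the video processor 260-1.

```kotlin
// Simplified illustration of the processing chain; types and the 24->60 Hz pattern are assumptions.
data class Stream(val container: String)                          // e.g. an MPEG-2 transport stream
data class VideoFrames(val frameRateHz: Int, val frames: List<String>)

fun demultiplex(input: Stream): Pair<String, String> =            // split into video and audio elementary streams
    "video-es" to "audio-es"                                      // placeholder: ignores the actual container

fun decodeAndScale(videoEs: String): VideoFrames =
    VideoFrames(frameRateHz = 24, frames = List(24) { "frame$it" })   // pretend 24 decoded frames per second

fun composeWithGui(video: VideoFrames, gui: String): VideoFrames =
    video.copy(frames = video.frames.map { "$it+$gui" })          // overlay the GUI layer on each frame

// Frame rate conversion by frame insertion: e.g. 24 Hz -> 60 Hz repeats source frames in a 2:3-like
// pattern (a real module may interpolate instead), so every output second holds 60 frames.
fun convertFrameRate(video: VideoFrames, targetHz: Int): VideoFrames {
    val out = (0 until targetHz).map { i -> video.frames[i * video.frameRateHz / targetHz] }
    return VideoFrames(targetHz, out)
}

fun formatForDisplay(video: VideoFrames): List<String> =
    video.frames.map { "RGB($it)" }                               // convert to the display's RGB data format

fun main() {
    val (videoEs, _) = demultiplex(Stream("mpeg2-ts"))
    val output = formatForDisplay(convertFrameRate(composeWithGui(decodeAndScale(videoEs), "menu"), 60))
    println("output: ${output.size} RGB frames per second")       // 60
}
```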
And a display 280 for receiving the image signal input from the video processor 260-1, and for displaying video content, images and the menu manipulation interface. The display 280 includes a display component for presenting the picture and a driving component for driving image display. The displayed video content may come from the video in the broadcast signal received by the tuner demodulator 220, or from video content input via the communicator or the external device interface. The display 280 also displays a user manipulation interface (UI) generated in the display device 200 and used to control the display device 200.
And, a driving component for driving the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The audio processor 260-2 is configured to receive an audio signal, decompress and decode the audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification and other audio data processing to obtain an audio signal that can be played in the speaker 272.
An audio output interface 270 for receiving the audio signal output by the audio processor 260-2 under the control of the controller 210; the audio output interface may include a speaker 272, or an external sound output terminal 274 that outputs to a sound-producing device of an external apparatus, such as an external sound terminal or an earphone output terminal.
In other exemplary embodiments, video processor 260-1 may comprise one or more chip components. The audio processor 260-2 may also include one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated in one or more chips with the controller 210.
And a power supply for supplying power supply support to the display apparatus 200 from the power input from the external power source under the control of the controller 210. The power supply may include a built-in power supply circuit installed inside the display apparatus 200, or may be a power supply installed outside the display apparatus 200, such as a power supply interface for providing an external power supply in the display apparatus 200.
Similar to the N-chip, as shown in fig. 4, the a-chip may include a controller 310, a communicator 330, a detector 340, and a memory 390. A video processor 360, a user input interface, an audio processor, a display, an audio output interface may also be included in some embodiments. In some embodiments, there may also be a power supply that independently powers the A-chip.
The communicator 330 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 330 may include a WIFI module 331, a bluetooth communication protocol module 332, a wired ethernet communication protocol module 333, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The communicator 330 of the A chip and the communicator 230 of the N chip also interact with each other. For example, the WiFi module 231 in the N-chip hardware system is used to connect to an external network and to communicate with external servers and the like, whereas the WiFi module 331 in the A-chip hardware system is used to connect to the N-chip WiFi module 231 rather than directly to an external network; the A chip thus reaches the external network through the N chip. Therefore, to the user, a display device as in the above embodiment exposes a single WiFi account to the outside.
The detector 340 is a component of the A chip of the display device for collecting signals from the external environment or for interacting with the outside. The detector 340 may include a light receiver 342, a sensor for collecting the intensity of ambient light, which can be used to adapt display parameters and the like; it may further include an image collector 341, such as a camera or video camera, which may be configured to collect external environment scenes, collect attributes of the user or gestures used to interact with the user, adaptively change display parameters, and recognize user gestures, so as to realize interaction with the user.
An external device interface 350, which provides a component for data transmission between the controller 310 and the N-chip or other external devices. The external device interface may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner.
The controller 310 controls the operation of the display device 200 and responds to the user's operation by running various software control programs stored on the memory 390 (e.g., using installed third party applications, etc.), and interacting with the N-chip.
As shown in fig. 4, the controller 310 includes a read only memory ROM313, a random access memory RAM314, a graphics processor 316, a CPU processor 312, a communication interface, and a communication bus. The ROM313, the RAM314, the graphic processor 316, the CPU processor 312, and the communication interface are connected via a bus.
A ROM313 for storing instructions for various system boots. CPU processor 312 executes system boot instructions in ROM and copies the operating system stored in memory 390 to RAM314 to begin running the boot operating system. After the start of the operating system is completed, the CPU processor 312 copies various application programs in the memory 390 to the RAM314, and then starts running and starting various application programs.
The CPU processor 312 is used for executing the operating system and application program instructions stored in the memory 390, communicating with the N chip, transmitting and interacting signals, data, instructions, etc., and executing various application programs, data and contents according to various interaction instructions received from the outside, so as to finally display and play various audio and video contents.
The communication interfaces may include a first interface 318-1, a second interface 318-2 through an nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or may be network interfaces connected to the N-chip via a network.
The controller 310 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 310 may perform an operation related to the object selected by the user command.
The graphics processor 316 generates various graphics objects, such as icons, operation menus, and graphics for displaying user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which generates the various objects produced by the arithmetic unit and displays the rendered result on the display 280.
Both the A-chip graphics processor 316 and the N-chip graphics processor 216 are capable of generating various graphics objects. The difference is that, if application 1 is installed on the A-chip and application 2 is installed on the N-chip, then when the user makes a command input at the interface of application 1 and within application 1, the graphics object is generated by the A-chip graphics processor 316; when the user makes a command input at the interface of application 2 and within application 2, the graphics object is generated by the N-chip graphics processor 216.
Fig. 5 is a diagram schematically illustrating a functional configuration of a display device according to an exemplary embodiment.
As shown in fig. 5, the memory 390 of the a-chip and the memory 290 of the N-chip are used to store an operating system, an application program, contents, user data, and the like, respectively, and perform system operations for driving the display device 200 and various operations in response to a user under the control of the controller 310 of the a-chip and the controller 210 of the N-chip. The A-chip memory 390 and the N-chip memory 290 may include volatile and/or non-volatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and store various applications installed in the display device 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an Operating System (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the video processor 260-1 and the audio processor 260-2, the display 280, the communication interface 230, the tuner demodulator 220, the input/output interface, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module, a power control module 2910, an operating system 2911, and other application programs 2912, a browser module, and the like. The controller 210 performs functions such as: the system comprises a broadcast television signal receiving and demodulating function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction identification function, a communication control function, an optical signal receiving function, an electric power control function, a software control platform supporting various functions, a browser function and other various functions.
The memory 390 includes a memory storing various software modules for driving and controlling the display apparatus 200. Such as: various software modules stored in memory 390, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like. Since the functions of the memory 390 and the memory 290 are similar, reference may be made to the memory 290 for relevant points, and thus, detailed description thereof is omitted here.
Illustratively, the memory 390 includes an image control module 3904, an audio control module 3906, an external instruction recognition module 3907, a communication control module 3908, a light receiving module 3909, an operating system 3911, other application programs 3912, a browser module, and the like. The controller 310 performs functions such as: an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal receiving function, a power control function, a software control platform supporting various functions, a browser function, and other various functions.
The difference is that the external instruction recognition module 2907 of the N-chip and the external instruction recognition module 3907 of the A-chip recognize different instructions.
Illustratively, since an image receiving device such as a camera is connected to the A-chip, the external instruction recognition module 3907 of the A-chip may include an image recognition module 3907-1 in which a graphic database is stored; when the camera receives an external graphic instruction, the instruction is matched against the graphic database to perform instruction control of the display device. Since the voice receiving device and the remote controller are connected to the N-chip, the external instruction recognition module 2907 of the N-chip may include a voice recognition module 2907-2 in which a voice database is stored; when the voice receiving device receives an external voice instruction, the instruction is matched against the voice database to perform instruction control of the display device. Similarly, a control device 100 such as a remote controller is connected to the N-chip, and a key instruction recognition module 2907-3 performs instruction interaction with the control device 100.
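The matching step described above can be pictured as a simple lookup from a recognized input to a device command. The Kotlin sketch below is purely illustrative; the class, command, and phrase names are assumptions and do not correspond to the modules 2907/3907 themselves.

```kotlin
// Hypothetical sketch of the lookup performed by an instruction recognition module:
// a recognized gesture or voice phrase is matched against a small command database
// and translated into a device control action. All names are illustrative only.

enum class DeviceCommand { VOLUME_UP, VOLUME_DOWN, PAUSE_PLAYBACK, OPEN_CAMERA }

class InstructionRecognizer(private val commandDatabase: Map<String, DeviceCommand>) {

    // Returns the command associated with the recognized input, or null if unknown.
    fun resolve(recognizedInput: String): DeviceCommand? =
        commandDatabase[recognizedInput.trim().lowercase()]
}

fun main() {
    val voiceRecognizer = InstructionRecognizer(
        mapOf(
            "volume up" to DeviceCommand.VOLUME_UP,
            "pause" to DeviceCommand.PAUSE_PLAYBACK
        )
    )
    // A recognized phrase is mapped to a command; unknown phrases yield null.
    println(voiceRecognizer.resolve("Volume Up"))   // VOLUME_UP
    println(voiceRecognizer.resolve("rewind"))      // null
}
```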
A block diagram of a configuration of a software system in a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 6 a.
For the N-chip, as shown in fig. 6a, the operating system 2911 includes operating software for handling various basic system services and for performing hardware-related tasks.
In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
The control application 2911-4 is used for process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application 2912. In some embodiments it is implemented partly within the operating system 2911 and partly within the application 2912, and is used for listening to the various user input events; one or more sets of predefined operations are performed in response to the recognition of the various types of events or sub-events.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event recognition module 2914-2 is used to store the event definitions for the various user input interfaces, recognize the various events or sub-events, and forward them to the processes that execute their corresponding one or more sets of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (e.g., the control apparatus 100), such as: various sub-events of voice input, gesture sub-events of gesture recognition, and sub-events of remote-control key command input from the control device. Illustratively, the one or more sub-events from the remote control take a variety of forms, including but not limited to one or a combination of pressing the up/down/left/right keys, the ok key, and key press-and-hold, as well as non-physical key operations such as move, hold, and release.
The interface layout management module 2913 receives the input events or sub-events, directly or indirectly, from the event transmission system 2914 and updates the layout of the user interface accordingly, including but not limited to the position of each control or sub-control in the interface and the size, position, and level of the containers, as well as other operations related to the interface layout.
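As a hedged illustration of the listen-identify-dispatch flow described above (not the modules 2914-1 and 2914-2 themselves), the Kotlin sketch below registers handlers for event types and forwards each recognized event to its handler; all type and function names are assumptions.

```kotlin
// Minimal illustration of the listen-identify-dispatch pattern described above.
// Event types and handler names are assumptions, not part of the claimed system.

sealed class InputEvent {
    data class RemoteKey(val key: String) : InputEvent()
    data class VoiceCommand(val phrase: String) : InputEvent()
    data class Gesture(val name: String) : InputEvent()
}

class EventDispatcher {
    private val handlers = mutableMapOf<Class<out InputEvent>, (InputEvent) -> Unit>()

    // Register a handler for one event (or sub-event) type.
    fun <T : InputEvent> register(type: Class<T>, handler: (InputEvent) -> Unit) {
        handlers[type] = handler
    }

    // Identify the event type and forward it to the corresponding handler.
    fun dispatch(event: InputEvent) {
        handlers[event::class.java]?.invoke(event)
    }
}

fun main() {
    val dispatcher = EventDispatcher()
    dispatcher.register(InputEvent.RemoteKey::class.java) { e ->
        println("update interface layout for key ${(e as InputEvent.RemoteKey).key}")
    }
    dispatcher.dispatch(InputEvent.RemoteKey("ok"))
}
```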
Since the functions of the operating system 3911 of the a chip are similar to those of the operating system 2911 of the N chip, reference may be made to the operating system 2911 for relevant points, and details are not repeated here.
As shown in fig. 6b, the application layer of the display device contains various applications that can be executed at the display device 200.
The N-chip application layer 2912 may include, but is not limited to, one or more applications such as a video-on-demand application, an application center, and a game application. The application layer 3912 of the A-chip may include, but is not limited to, one or more applications such as a live television application and a media center application. It should be noted that which applications are contained on the A-chip and on the N-chip respectively is determined by the operating system and other design considerations; the present application does not need to specifically limit or divide the applications contained on the A-chip and the N-chip.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
A video-on-demand application may provide video from different storage sources. Unlike a live television application, video on demand provides video display from a storage source; for example, the video on demand may come from a cloud-storage server side or from a local hard disk storage containing stored video programs.
The media center application program can provide various applications for playing multimedia content. For example, a media center may provide services other than live television or video on demand, through which a user may access various images or audio via the media center application.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on a display device. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
A schematic diagram of a user interface in a display device 200 according to an exemplary embodiment is illustrated in fig. 7. As shown in fig. 7, the user interface includes a plurality of view display areas, illustratively, a first view display area 201 and a play screen 202, wherein the play screen includes a layout of one or more different items. And a selector in the user interface indicating that the item is selected, the position of the selector being movable by user input to change the selection of a different item.
It should be noted that the multiple view display areas may present display screens of different hierarchies. For example, a first view display area may present video chat project content and a second view display area may present application layer project content (e.g., web page video, VOD presentations, application screens, etc.).
Optionally, the different view display areas are presented with different priorities, and view display areas of different priorities differ in display priority. For example, if the priority of the system layer is higher than that of the application layer, then when the user operates the selector and switches pictures in the application layer, the picture display of the system-layer view display area is not blocked; and when the size and position of the application-layer view display area change according to the user's selection, the size and position of the system-layer view display area are not affected.
Display pictures of the same hierarchy can also be presented; in this case, the selector can switch between the first view display area and the second view display area, and when the size and position of the first view display area change, the size and position of the second view display area can change accordingly.
Since the a-chip and the N-chip may have independent operating systems installed therein, there are two independent but interrelated subsystems in the display device 200. For example, Android (Android) and various APPs can be independently installed on the chip a and the chip N, so that each chip can realize a certain function, and the chip a and the chip N cooperatively realize a certain function.
The display device 200 provided by the embodiments of the present application is mainly intended for televisions, particularly social televisions. Based on the display device 200 provided in the embodiments of the present application, and in order to allow a user to watch a video and conduct a video call synchronously in one scene, the embodiments of the present application provide an auditorium service management method.
In some embodiments, the method provided by the present application is not only applicable to the dual-chip display device provided by the above embodiments, but also applicable to other non-dual-chip display devices.
In some embodiments, the display device using the auditorium service management method may be a television provided with the cooperating dual chips of the above embodiments, or may be another smart television, as long as it supports the operation of the method.
FIG. 8 illustrates a user interface diagram of a first display device in some embodiments of the present application. When the user operates the remote control device to select the "new auditorium" control in fig. 8, and the first display device side meets the condition for creating an auditorium when it receives the new-auditorium instruction, the user enters the auditorium creation interface. Fig. 9 illustrates an auditorium creation interface of a first display device in some embodiments of the present application. As shown in fig. 9, the auditorium creation interface includes a private auditorium control for creating a private auditorium service and a public auditorium control for creating a public auditorium service.
In some embodiments of the present application, a selection is made of the type of auditorium to create, for example by selecting the "private auditorium" control indicated in fig. 9. When the user selects the private auditorium control, an editing interface for the private auditorium is displayed.
In some embodiments of the present application, fig. 10 illustrates an interface of the first display device for creating a private auditorium. When the "private auditorium" control is selected, the interface of fig. 10 is entered. As shown in fig. 10, the private auditorium editing interface includes an auditorium name edit box, an add-movie control, an invite-friends control, a confirm-launch control, and so on. Through the interface illustrated in fig. 10, the user can set or modify the name of the auditorium, add a movie to be played, invite friends, and perform other operations.
In some embodiments of the present application, the user selects the auditorium name edit box and makes an edit input, thereby setting the name of the auditorium, for example the "auditorium for afar tea" shown in fig. 10. When the user selects the add-movie control, the display device receives a selection instruction for the add-movie control and starts a search interface, records the identifier of the movie according to the user's selection of the movie, and replaces the add-movie control with a movie display control that displays the selection.
In some embodiments of the present application, fig. 11 illustrates a search interface entered by a first display device according to a selection instruction of an "add movie" control in some embodiments of the present application, and fig. 12 illustrates an interface recording an identification of a movie according to a user's selection of the movie in some embodiments of the present application.
In some embodiments of the present application, when the user selects the invite-friends control, the display device receives a selection instruction for the invite-friends control, retrieves the contact information and starts a contact selection interface, records the identifiers of the contacts according to the user's selection, and replaces the invite-friends control with contact display controls that display the selected contacts. The user can select multiple contact display controls as required. FIG. 13 illustrates a contact selection interface of the first display device in some embodiments of the present application.
In some embodiments of the present application, the user selects the confirm-launch control, the display device receives the input selection instruction for the confirm-launch control, and sends an auditorium creation request according to the name in the auditorium name edit box, the identifier of the movie, and the identifiers of the contacts. The auditorium creation request is used for enabling the server to create a private auditorium service in response to the request. The private auditorium service determines, according to the identifier of the movie, the video data that needs to be sent to the display device, and determines, according to the contact identifiers, the invitation requests to be sent, where an invitation request is used to invite the other display device corresponding to a contact identifier to access the auditorium service, and a display device that accepts the invitation becomes a second display device. After accessing the auditorium service, the hi application is called to start the camera of the opposite end, return the video data of the opposite end, receive the local video data of the first display device, and conduct a video chat. In this way the functions of watching a video and conducting a video call are completed in one scene, that is, genuinely chatting while watching.
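The information assembled when the confirm-launch control is selected can be pictured as a small request payload. The Kotlin sketch below is illustrative only; the data class and its field names are assumptions and are not defined by this application.

```kotlin
// A hedged sketch of the information an auditorium creation request could carry:
// the auditorium name, the selected first video identifier, and the invited
// contact identifiers. Field names are assumptions for illustration only.

data class CreateAuditoriumRequest(
    val auditoriumName: String,
    val firstVideoId: String,
    val contactIds: List<String>,
    val isPrivate: Boolean
)

fun main() {
    val request = CreateAuditoriumRequest(
        auditoriumName = "auditorium for afar tea",
        firstVideoId = "movie-001",
        contactIds = listOf("friend-01", "friend-02"),
        isPrivate = true
    )
    // In a real client this structure would be serialized and sent to the server;
    // here we only show the payload that the confirm-launch action assembles.
    println(request)
}
```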
In some embodiments of the present application, the application environment of the auditorium service management method includes a first display device, a plurality of second display devices, and a server, where: the first display device and the second display device are relative concepts and are both display devices; the server can be divided, according to its functions, into an auditorium server and a video call server, and can be divided in more detail, for example dividing the auditorium server into a room service, a message push service, a video play service, and the like.
In some embodiments of the present application, the auditorium service includes a video playing service unit, a video call service unit, and a message service unit, where the video playing service unit is configured to control different devices to play a first video synchronously, the video call service unit is configured to conduct video calls between the different devices, and the message service unit is configured to transmit text and/or emoticon messages between the different devices. After receiving the auditorium creation request, the server creates the auditorium service according to the creation request and sends invitations to the second display devices according to the contact identifiers carried by the auditorium creation request.
In some embodiments of the present application, the video call function is established and implemented through the cooperation of a common video call server and an IM (instant messaging) system interface. Some embodiments of the present application provide that the first display device creates an auditorium through the server, and the auditorium is used to realize synchronous viewing by the first display device and the second display devices and interaction through a mobile terminal.
In some embodiments of the present application, the first display device receives the user's input selection instruction for the confirm-launch control and sends a create-auditorium request to the server according to the name in the auditorium name edit box, the identifier of the movie (the first video identifier), and the identifiers of the contacts. The create-auditorium request includes the first video identifier. The server creates an auditorium service based on the received create-auditorium request. The auditorium service in the embodiments of the present application is used to enable the different display devices accessing the auditorium service to simultaneously play the first video corresponding to the first video identifier and to establish a chat room for video chat.
In some embodiments of the present application, the server receives the create-auditorium request sent by the first display device. When the server receives the create-auditorium request sent by the first display device, it verifies whether the first display device meets the conditions for creating an auditorium. If the first display device meets the conditions for creating an auditorium, the server creates the auditorium service, generates an auditorium service identifier for the successfully created auditorium service, and returns the auditorium service identifier to the first display device. If the first display device does not meet the conditions for creating an auditorium, a service error code is returned to the first display device to remind the user that creation has failed, or a prompt and guidance are given to the first display device.
In some embodiments of the present application, the first display device receives the auditorium service identifier sent by the server. In response to the received auditorium service identifier, the first display device accesses the auditorium service according to the received auditorium service identifier. Upon the first display device's access to the auditorium service, the server feeds back to the first display device, according to the first video identifier, the video data of the first video corresponding to the first video identifier. The first display device receives the video data of the first video fed back by the server and plays the first video according to the video data.
In some embodiments of the present application, in response to the received auditorium service identifier, the first display device activates its camera, obtains local video data through the camera, and displays the local video data in a local video window on the playing interface used for playing the first video. In addition, the first display device transmits the local video data to the server according to the auditorium service identifier. The server receives the local video data sent by the first display device, and if a second display device has accessed the auditorium service, the server sends the local video data of the first display device to the second display device.
In some embodiments of the present application, in response to the received auditorium service identifier, the first display device draws, on the display interface, a playing interface for playing the first video, and sets a local video window and an opposite-end video window in a floating layer above the playing interface, where the local video window is used to play the local video data and the opposite-end video window is used to play the received video data of the second display device.
In some embodiments of the present application, after receiving the local video data uploaded according to the auditorium service identifier, the server distributes the local video data, through the video call service unit, to the second display devices accessing the auditorium service. In some embodiments of the present application, when the server creates the auditorium service, the server sends the auditorium service identifier to the second display device corresponding to the contact identifier, according to the contact identifier in the auditorium creation request. If the second display device receives the auditorium service identifier and accesses the auditorium service, the server sends the video data of the first video to the second display device according to the playing progress of the first video on the first display device, and the second display device, after receiving the video data of the first video, plays the first video synchronously with the first display device.
Meanwhile, if the second display device receives the auditorium service identifier, it starts its camera to acquire its local video data and displays the local video data in a local video window on the playing interface used for playing the first video. In some embodiments of the present application, while the second display device displays its local video data, the local video data of the second display device is transmitted to the server. The server receives the local video data of the second display device and feeds the received local video data of the second display device back to the first display device. The first display device receives the uploaded local video data of the second display device sent by the server and displays it in an opposite-end video window on the playing interface for playing the first video.
Meanwhile, the second display device receives the local video data uploaded by the first display device and sent by the server, and if there is more than one second display device, it also receives the local video data uploaded by the other second display devices. Thus, the first display device accessing the auditorium service through the auditorium service identifier displays its own local video data in a local video window above the playing interface for playing the first video, while displaying the local video data uploaded by the second display devices in at least one opposite-end video window; a second display device accessing the auditorium service through the auditorium service identifier displays its own local video data on the playing interface for playing the first video, while displaying the local video data uploaded by the first display device and by the other second display devices. In this way, users can conduct a video chat while synchronously watching the first video.
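A minimal sketch of the fan-out a video call service unit could perform, assuming a simple room registry: each participant's local video data is forwarded to every other device in the same auditorium. The class, identifiers, and callback signature are illustrative assumptions, not part of the claimed method.

```kotlin
// Illustrative fan-out: one participant's local video data is forwarded to every
// other device registered in the same auditorium. Assumption-level sketch only.

class VideoCallRelay {
    // auditoriumId -> set of participating device ids
    private val rooms = mutableMapOf<String, MutableSet<String>>()

    fun join(auditoriumId: String, deviceId: String) {
        rooms.getOrPut(auditoriumId) { mutableSetOf() }.add(deviceId)
    }

    // Forward one chunk of video data from the sender to all other participants.
    fun relay(auditoriumId: String, senderId: String, videoChunk: ByteArray,
              send: (targetDeviceId: String, data: ByteArray) -> Unit) {
        rooms[auditoriumId].orEmpty()
            .filter { it != senderId }
            .forEach { peer -> send(peer, videoChunk) }
    }
}

fun main() {
    val relay = VideoCallRelay()
    relay.join("room-1", "first-display")
    relay.join("room-1", "second-display")
    relay.relay("room-1", "first-display", ByteArray(4)) { peer, data ->
        println("send ${data.size} bytes to $peer")   // only second-display receives
    }
}
```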
In some embodiments of the present application, a first display device displays, in an opposite-end video window on an upper layer of a playing interface for playing a first video, local video data uploaded by a second display device; the second display device displays the local video data of the second display device in a local video window on the upper layer of a playing interface for playing the first video, and displays the uploaded local video data of the first display device and the uploaded local video data of other second display devices in an opposite-end video window on the playing interface for playing the first video.
Optionally, in some embodiments of the present application, a video window on an upper layer of the playing interface for playing the first video is located on one side of the first video display window, for example, on the right side of the first video display window.
In some embodiments of the present application, after an operation instruction selecting the invitation control is received, a contact selection interface is displayed. The contact selection interface includes a plurality of contact controls representing different contacts, and the user can select contact controls as required. Typically, there is an upper limit on the number of contact controls a user can select, for example a maximum of 5 contact controls.
In some embodiments of the present application, when the second display device receives the invitation information and the invitation is accepted, the second display device accesses the auditorium service according to the auditorium service identifier. When the server receives the request of the second display device to access the auditorium service, it determines whether the second display device meets the conditions for joining the auditorium, such as the number of users currently participating in the auditorium and the user's rights to watch the video.
In some embodiments of the present application, a user may join only a limited number of auditoriums at a time, for example one or five auditoriums. Sometimes, after a user sends a create-auditorium service request, the server sends invitations to the contacts just selected, but after a period of time some of the second display devices corresponding to those contacts have joined while others have not joined or have quit the auditorium service. Because the number of people who can be in an auditorium at the same time is limited, when a second display device accepts the invitation the server side needs to verify the conditions for joining the auditorium. These conditions may include at least one of the number of auditoriums the device has currently joined and whether the number of online users in the auditorium to be joined is already full; if the number of auditoriums currently joined at the second display device end is within the allowed range and the auditorium to be joined is not full, the second display device meets the conditions for joining the auditorium. When the second display device meets the conditions for joining the auditorium, the server allows it to join; otherwise, a service error code is returned directly to the second display device, and a corresponding prompt and guidance are given to the second display device user.
When the second display device receives the invitation of the first display device and successfully accesses the auditorium service, the server sends the video data of the first video to the second display device according to the playing progress of the first video on the first display device. The second display device receives the video data of the first video, and its video playing window plays the first video, so that playback on the second display device is synchronized with the first display device end and the first display device user and the second display device user can watch the video synchronously.
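The join checks described above can be sketched as follows; the error codes and the limit on concurrently joined auditoriums are assumptions chosen only for illustration.

```kotlin
// A sketch, under assumed limits, of the server-side join checks: the auditorium
// must not be dismissed, must not be full, and the joining device must not exceed
// the allowed number of concurrently joined auditoriums. Thresholds are illustrative.

data class Auditorium(val id: String, val dismissed: Boolean, val onlineUsers: Int, val capacity: Int)

class JoinValidator(private val maxJoinedAuditoriumsPerDevice: Int = 1) {

    // Returns a service error code, or null when the device may join.
    fun joinError(auditorium: Auditorium, currentlyJoinedByDevice: Int): String? = when {
        auditorium.dismissed -> "ERR_AUDITORIUM_DISMISSED"
        auditorium.onlineUsers >= auditorium.capacity -> "ERR_AUDITORIUM_FULL"
        currentlyJoinedByDevice >= maxJoinedAuditoriumsPerDevice -> "ERR_TOO_MANY_AUDITORIUMS"
        else -> null
    }
}

fun main() {
    val validator = JoinValidator()
    val room = Auditorium("room-1", dismissed = false, onlineUsers = 4, capacity = 6)
    println(validator.joinError(room, currentlyJoinedByDevice = 0))  // null: allowed to join
    println(validator.joinError(room, currentlyJoinedByDevice = 1))  // ERR_TOO_MANY_AUDITORIUMS
}
```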
In some embodiments of the present application, the first display device sends the video playing progress to the server, for example periodically. The server then corrects the video playing progress of all the second display devices in the auditorium service, ensuring that all the display devices in the auditorium play the video synchronously and thus that the users in the auditorium watch the video synchronously.
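A hedged sketch of this progress correction, assuming a fixed drift tolerance: a second display device whose position drifts past the tolerance is instructed to seek to the reference position. The tolerance value and function names are assumptions.

```kotlin
// The first display device reports its playback position periodically; the server
// instructs any second display device whose position drifts past a tolerance to seek.
// Tolerance and names are illustrative assumptions.

class PlaybackSynchronizer(private val toleranceMs: Long = 2_000) {

    private var referencePositionMs: Long = 0

    // Called when the first display device reports its playing progress.
    fun updateReference(positionMs: Long) {
        referencePositionMs = positionMs
    }

    // Returns the position a second display device should seek to, or null if in sync.
    fun correctionFor(peerPositionMs: Long): Long? =
        if (kotlin.math.abs(peerPositionMs - referencePositionMs) > toleranceMs)
            referencePositionMs
        else null
}

fun main() {
    val sync = PlaybackSynchronizer()
    sync.updateReference(60_000)
    println(sync.correctionFor(61_000))  // null: within tolerance
    println(sync.correctionFor(55_000))  // 60000: seek instruction for the lagging device
}
```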
In some embodiments of the present application, the first display device sends a heartbeat request to the server to tell the server that it is currently online, and the server receives the heartbeat request. Optionally, the first display device sends heartbeat requests to the server periodically, the server receives them periodically, and when the server does not receive a heartbeat request from the first display device end within a predetermined time, the first display device is considered offline. Optionally, after receiving a heartbeat request from the first display device, the server returns the next heartbeat sending time to the first display device, requiring it to send the next heartbeat request at that time, and the first display device sends the heartbeat request according to that time. If the server does not receive a heartbeat request from the first display device end at the next heartbeat sending time, the first display device end is considered offline. In this way, the first display device and the server monitor the online state of the first display device through the interaction of heartbeat requests.
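A simplified model of this heartbeat exchange is sketched below: the server records each heartbeat, returns the next expected send time, and treats the device as offline once the deadline (plus a grace period assumed here) passes. Interval values are assumptions.

```kotlin
// Hedged sketch of the heartbeat exchange; intervals and names are illustrative.

class HeartbeatMonitor(private val intervalMs: Long = 30_000, private val graceMs: Long = 10_000) {

    private val nextDeadline = mutableMapOf<String, Long>()

    // Handle a heartbeat request and return when the next one is expected.
    fun onHeartbeat(deviceId: String, nowMs: Long): Long {
        val next = nowMs + intervalMs
        nextDeadline[deviceId] = next
        return next
    }

    // A device is considered offline if its deadline plus the grace period has elapsed.
    fun isOffline(deviceId: String, nowMs: Long): Boolean =
        nowMs > (nextDeadline[deviceId] ?: return true) + graceMs
}

fun main() {
    val monitor = HeartbeatMonitor()
    val next = monitor.onHeartbeat("first-display", nowMs = 0)
    println(next)                                        // 30000: next send time returned to the device
    println(monitor.isOffline("first-display", 35_000))  // false: still within the grace period
    println(monitor.isOffline("first-display", 50_000))  // true: considered offline
}
```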
In some embodiments of the present application, when video is being played in the auditorium service, the first display device and the second display device receive the user's selection of a preset message, generate a first interactive message from the selected preset message and a preset-message identification character, and send the first interactive message to the server so that the server forwards it to all the devices corresponding to the chat room service. When the first display device successfully creates the auditorium service, it receives the preset message set sent by the server; when a second display device joins the auditorium service, it receives the preset message set sent by the server. Optionally, the preset message set includes preset emoticon messages and preset text messages, both of which carry the preset-message identification character.
In some embodiments of the present application, the video playing interfaces of the first display device and the second display device include preset message controls. The first display device user and the second display device user can select a preset message control through the control device; the display device generates an interactive message from the selected preset message, sends the interactive message to the server, and the server then sends it to the other display devices. For example, an interactive message sent by the first display device is delivered by the server to the second display devices, which display it; an interactive message sent by a second display device is delivered by the server to the first display device and the other second display devices, which display it.
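The assembly and broadcast of a first interactive message can be pictured as below. The marker value, field names, and broadcast helper are assumptions; the application does not define a concrete wire format.

```kotlin
// Illustration of building a first interactive message from a preset message plus a
// preset-message identification character, then broadcasting it to the other devices
// in the auditorium. All names and the marker value are assumptions.

const val PRESET_MESSAGE_MARKER = "P"   // assumed identification character

data class InteractiveMessage(val auditoriumId: String, val senderId: String,
                              val identification: String, val body: String)

fun buildPresetMessage(auditoriumId: String, senderId: String, preset: String) =
    InteractiveMessage(auditoriumId, senderId, PRESET_MESSAGE_MARKER, preset)

// Server-side broadcast: forward the message to every device in the room except the sender.
fun broadcast(message: InteractiveMessage, members: List<String>,
              push: (deviceId: String, msg: InteractiveMessage) -> Unit) {
    members.filter { it != message.senderId }.forEach { push(it, message) }
}

fun main() {
    val msg = buildPresetMessage("room-1", "first-display", "Great scene!")
    broadcast(msg, listOf("first-display", "second-display")) { device, m ->
        println("push to $device: ${m.body}")
    }
}
```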
In some embodiments of the present application, when video is being played in the auditorium service, the first display device and the second display device receive an operation of the user and display a bullet screen control on the video playing interface, where the bullet screen control is used to receive the user's operation on it to turn on or off the display of interactive messages on the video playing interface of the display device.
In some embodiments of the present application, when the server determines that the first display device is offline, the server by default treats the auditorium as passively dismissed: the server sends a dismissal notification to all the second display devices in the auditorium and stops the interaction on the display device side in the auditorium, that is, the server no longer forwards the interactive content of the display devices. In addition, the video being played in the original auditorium service continues to play to its end, and this playback is not affected by the dismissal of the auditorium service.
Additionally, in some embodiments of the present application, the first display device may actively send an auditorium dismissal request to the server. When the server receives the auditorium dismissal request sent by the first display device, the server dismisses the auditorium and sends an auditorium dismissal notification to all the second display devices in the auditorium. The server stops the interaction on the display device side in the auditorium service, that is, the server no longer forwards the interactive content of the display devices.
Therefore, when receiving a request for joining the auditorium, which is sent by the second display device corresponding to the invited friend, the server also needs to verify whether the current auditorium is dismissed, determine that the auditorium is not dismissed and that the second display device meets the condition for joining the auditorium service, and allow the second display device to access the auditorium service.
In some embodiments of the present application, the terminal device uploads the message according to the identifier of the auditorium service, and the message service unit in the auditorium service processes and forwards the message.
In some embodiments of the present application, when the first display device receives the auditorium service identifier, the video playing interface of the first display device displays a coded graphic, and the coded graphic is used for enabling a mobile terminal to access the auditorium service by scanning the code, so that the user can send and receive interactive messages through the mobile terminal. Optionally, the coded graphic may be a two-dimensional code, but is not limited to a two-dimensional code. Optionally, the first display device generates the coded graphic based on the auditorium service identifier. When a second display device receives the auditorium service identifier and accesses the auditorium service, the video playing interface of the second display device also displays the coded graphic. After the mobile terminal is bound with the display device through an applet account, the mobile terminal can also perform other operations on the display device, such as remote control and picture transfer. The first display device user uses the mobile terminal to scan the coded graphic to obtain the coded graphic information, parses the obtained information, and loads the interactive message editing interface according to the auditorium service identifier in the code information. The second display device user can likewise scan the coded graphic with a mobile terminal, parse the obtained information, and load the interactive message editing interface according to the auditorium service identifier in the code information.
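The coded graphic only needs to carry enough information for the mobile terminal to locate the auditorium service. Below is a hedged Kotlin sketch in which the auditorium service identifier is packed into a simple payload string that a scanning app could parse; the real encoding scheme is not specified by this application, and the prefix used here is an assumption.

```kotlin
// Assumed payload format for the coded graphic; the real scheme is not defined here.
private const val SCHEME = "auditorium"   // assumed payload prefix

fun encodeAuditoriumPayload(auditoriumServiceId: String): String =
    "$SCHEME://join?id=$auditoriumServiceId"

fun parseAuditoriumPayload(payload: String): String? {
    val prefix = "$SCHEME://join?id="
    return if (payload.startsWith(prefix)) payload.removePrefix(prefix) else null
}

fun main() {
    val payload = encodeAuditoriumPayload("room-1")
    // The display device would render `payload` as a two-dimensional code; the mobile
    // terminal scans it, parses the identifier, and loads the interactive message
    // editing interface for that auditorium.
    println(payload)                         // auditorium://join?id=room-1
    println(parseAuditoriumPayload(payload)) // room-1
}
```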
The interactive message editing interface is used for the mobile terminal to send and receive interactive messages. To enable the mobile terminal to send interactive messages, the interactive message editing interface optionally includes a preset message control and a character editing control. By operating the mobile terminal, the user can select a preset message or input edited characters to send an interactive message. Optionally, in the interactive message editing interface, a first interactive message generated by the user selecting a preset message and a second interactive message generated by the user inputting edited characters are received and then sent to the server, which delivers them to the display devices and the other mobile terminals. The mobile terminal likewise receives, via the server, the first and second interactive messages sent by the display devices and the other mobile terminals.
In addition, the interactive message editing interface also comprises the name of the auditorium, the media resource introduction of the video played in the auditorium and the like. Optionally, the upper part of the interactive message editing interface displays the name of the auditorium and the media introduction of the video played in the auditorium, and the lower part displays the interactive content, the preset message control and the character editing control, such as an interactive content editing and selecting window, for viewing the interactive message and sending the interactive message.
In some embodiments of the present application, the display device displays, on its video display interface, the interactive message pushed by the server that it receives, and the mobile terminal receives the interactive message pushed by the server and displays it on its own display interface. Optionally, the display device receives an interactive message pushed by the service and generated by a mobile terminal; if the interactive message was generated by the mobile terminal from a preset message, it is displayed on the video playing interface of the display device, and if it was not generated by the mobile terminal from a preset message, it is not displayed on the video playing interface of the display device. That is, a first interactive message generated by the mobile terminal is displayed on the video playing interface of the display device, while a second interactive message generated by the mobile terminal is not displayed.
In some embodiments of the present application, when a user wants to send an interactive message through the display device, the user selects a preset message through the remote control device to generate the interactive message and gives an interactive-message sending instruction to the display device. When the display device receives the interactive-message sending instruction, it sends the interactive message to the server. The server receives the interactive message sent by the display device and broadcasts it, pushing it to the other display devices and mobile terminals in the auditorium.
When a user wants to send an interactive message through the mobile terminal, the user operates the mobile terminal to select a preset message or input edited characters to generate the interactive message and gives an interactive-message sending instruction to the mobile terminal. When the mobile terminal receives the interactive-message sending instruction, it sends the message corresponding to the instruction to the server. The server receives the interactive message sent by the mobile terminal and broadcasts it, for example pushing it to the display devices and the other mobile terminals in the auditorium.
When the user operates the mobile terminal to generate an interactive message by editing character input: a character input keyboard is loaded in response to the user's selection of the information input box; the character string input by the user is displayed in the information input box according to the user's selection of the virtual keys on the character input keyboard; and an interactive message is generated from the character string according to the user's selection of the send control, where the identification field of this interactive message does not carry the preset-message identification character.
Optionally, in some embodiments of the present application, the interactive message editing interface includes a first control, and the first control has a first state and a second state. When the first control is in the first state, the character input keyboard is loaded, the character string input by the user is presented in the information input box according to the user's selection of virtual keys on the character input keyboard, and an interactive message is generated from the character string according to the user's selection of the send control, with the identification field of this interactive message not carrying the preset-message identification character. When the first control is in the second state, the preset messages are loaded and displayed, the preset message is shown in the information input box according to the user's selection of a preset message, and an interactive message is generated from the preset message according to the user's selection of the send control, with the identification field of this interactive message carrying the preset-message identification character.
The preset messages include preset emoticon messages and preset text messages. An interactive message generated by editing character input refers to a message produced from characters edited according to the user's input (below, a self-edited message). In some embodiments of the present application, the server receives the interactive messages sent by the mobile terminal and can audit them; in particular, the server audits self-edited messages to avoid non-compliant or unsafe content, such as sensitive words, appearing in the messages.
Optionally, when the display device receives an interactive message sent by the server, it determines whether the received interactive message is a preset message or a self-edited message. If the received interactive message is a preset message, it is displayed on the video playing interface of the display device end, for example in the form of a bullet screen; if the received interactive message is a self-edited message, the display device receives it but does not display it on its display screen. Optionally, the display device deletes self-edited interactive messages. When the mobile terminal receives an interactive message pushed by the server, it displays the received interactive message on its display screen regardless of whether it is a preset message or a self-edited message. A self-edited message mainly refers to the text and other information input by the user through the mobile terminal. Dividing interactive messages into preset messages and self-edited messages both makes it convenient to send interactive messages and keeps the display of interactive messages on the device end controllable.
In some embodiments of the present application, a self-edited message is a message the user composes through human-machine interaction with the mobile terminal. When the mobile terminal receives a sending instruction for a self-edited message, it sends the corresponding self-edited message to the server. To promote a healthy and safe network culture, when the server receives a self-edited message from the mobile terminal, it audits the message. When a self-edited message contains non-compliant or unsafe content such as sensitive words, the server does not push it, and the message is displayed only on the mobile terminal that sent it. Further, when the server finds that an audited self-edited message contains non-compliant or unsafe content such as sensitive words, it may send a prompt or guidance to the mobile terminal.
To make it easy for the display device to quickly identify whether a received interactive message is a preset message, in some embodiments of the present application, optionally, a preset-message identification field is set in all preset messages, and no preset-message identification field is set in self-edited messages. When a display device end receives an interactive message pushed by the server, it obtains the identification field of the interactive message and determines whether it is the preset-message identification field. When the identification field of the interactive message is the preset-message identification field, the interactive message is regarded as a preset message, and the display device end displays the received interactive message on its display screen.
In some embodiments of the present application, a self-edit identifier is set in the identification field of a self-edited message, and the identification field of a preset message may be empty or carry a preset-message identifier. In other embodiments, the preset message carries a preset-message identifier while the self-edited message does not, and the identification field of the self-edited message may be empty or carry an identifier characterizing self-editing. In some embodiments of the present application, the server can screen messages through the identification field and audit the screened self-edited messages, and the display device can screen messages through the identification field so as to display only preset messages.
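The two filters implied above can be sketched as follows: the server audits only self-edited messages for sensitive words, and the display device shows only messages whose identification field marks them as preset. The marker value, field names, and the placeholder word list are illustrative assumptions.

```kotlin
// Sketch of identification-field screening on the server and display device sides.
// Marker value and the audit list are placeholders, not defined by this application.

data class PushedMessage(val identification: String?, val body: String)

const val PRESET_MESSAGE_MARKER = "P"
val sensitiveWords = listOf("forbidden-word")   // placeholder audit list

// Server side: pass preset messages through; audit self-edited ones for sensitive words.
fun serverShouldForward(msg: PushedMessage): Boolean =
    msg.identification == PRESET_MESSAGE_MARKER || sensitiveWords.none { msg.body.contains(it) }

// Display device side: show on screen only if the preset identification field is present.
fun displayShouldShow(msg: PushedMessage): Boolean = msg.identification == PRESET_MESSAGE_MARKER

fun main() {
    val preset = PushedMessage(PRESET_MESSAGE_MARKER, "Nice!")
    val selfEdited = PushedMessage(null, "hello everyone")
    println(serverShouldForward(preset) to displayShouldShow(preset))         // (true, true)
    println(serverShouldForward(selfEdited) to displayShouldShow(selfEdited)) // (true, false)
}
```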
Accordingly, in some embodiments of the present application, the user may only send preset messages via the display device. Optionally, when the display device receives an interactive-message sending instruction, the display device obtains the message corresponding to the instruction, adds the preset-message identification field, and sends the message with the added identification field to the server. Optionally, the preset messages in effect in the display device already carry the preset-message identification field.
In some embodiments of the present application, the server is configured to put the preset messages into effect, that is, after the auditorium is successfully created, the server is responsible for managing the preset messages; and after the mobile terminal joins the chat room, the server returns the preset message set to the mobile terminal. After the mobile terminal receives the preset message set sent by the server, the user can interact by selecting preset messages from the set.
Based on the display device provided by some embodiments of the present application, some embodiments of the present application further provide an auditorium service management method, where the auditorium service management method is used for the first display device.
Some embodiments of the present application provide a method for auditorium service management, including: sending a request for creating a auditorium to a server, wherein the request for creating the auditorium comprises a first video identifier, the request for creating the auditorium is used for enabling the server to create an auditorium service, and the auditorium service is used for enabling different display devices accessing the auditorium service to simultaneously play a first video corresponding to the first video identifier;
receiving an identifier of a auditorium service, wherein the identifier of the auditorium service is sent by the server after the auditorium service is successfully created;
in response to receiving the auditorium service identification, accessing the auditorium service according to the auditorium service identification to enable the first display device to receive video data of a first video fed back by a server according to the first video identification and play the first video according to the video data;
and in response to receiving the service identifier of the auditorium, starting a camera of the first display device to acquire local video data, displaying the local video data on a local video window on a playing interface for playing the first video, and sending the local video data to the server according to the service identifier of the auditorium.
In the auditorium service management method provided in some embodiments of the present application, the request for creating an auditorium further includes a contact identifier, and the request for creating an auditorium is further configured to enable a server to send, when creating an auditorium service, the identifier of the auditorium service to a second display device corresponding to the contact identifier according to the contact identifier, so that the second display device accesses the auditorium service and plays the first video, and simultaneously returns video data of an opposite end obtained by the second display device;
the method further comprises the following steps: and displaying the opposite-end video data on an opposite-end video window on a playing interface for playing the first video.
In some embodiments of the application, there is provided an auditorium service management method, wherein after playing the first video according to the video data, the method further includes:
after receiving an operation instruction selecting the invitation control, displaying a contact selection interface, wherein the contact selection interface comprises a plurality of contact controls representing different contacts;
sending a friend invitation request according to the contact identifier corresponding to the selected contact control and the auditorium service identifier, wherein the friend invitation request is used for inviting the second display device corresponding to the contact identifier to access the auditorium service, so that the second display device plays the first video and at the same time returns the opposite-end video data it acquires;
the method further comprises: displaying the opposite-end video data in an opposite-end video window on the playing interface for the first video.
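A friend invitation of this kind only needs to carry the selected contact identifier together with the auditorium service identifier. A minimal sketch of assembling such a request, with field names that are assumptions for illustration:

```python
# Hypothetical friend-invitation request built by the first display device (field names illustrative).
def build_friend_invitation(contact_id: str, auditorium_id: str) -> dict:
    """Combine the contact chosen on the contact selection interface with the
    auditorium service identifier so the server can invite the second display device."""
    return {
        "type": "friend_invitation",
        "contact_id": contact_id,        # identifies the invited second display device
        "auditorium_id": auditorium_id,  # tells the server which auditorium service to share
    }


if __name__ == "__main__":
    print(build_friend_invitation(contact_id="friend-B", auditorium_id="room-001"))
```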
In some embodiments of the application, there is provided an auditorium service management method, wherein after playing the first video according to the video data, the method further includes:
receiving a user selection of preset information;
and generating interactive content according to the preset information and sending it to the server, so that other terminals accessing the auditorium service display the interactive content.
In some embodiments of the application, in the auditorium service management method, receiving the video data of the first video fed back by the server according to the first video identifier includes:
when the first display device has permission to play the first video, receiving video data of the first video fed back by the server according to the first video identifier;
and when the first display device does not have permission to play the first video, receiving and displaying permission reminder information fed back by the server, wherein the permission reminder information is sent by the auditorium service when the server determines that the display device lacks that permission.
Based on the display device provided by some embodiments of the present application, some embodiments of the present application further provide an auditorium service management method, where the auditorium service management method is used for a server.
Some embodiments of the present application provide an auditorium service management method, including:
receiving a request for creating an auditorium sent by a first display device side, wherein the request for creating the auditorium comprises a first video identifier;
in response to the received request for creating an auditorium, creating an auditorium service, wherein the auditorium service is used for enabling different display devices accessing it to simultaneously play the first video corresponding to the first video identifier;
sending an identifier of the auditorium service to the first display device, wherein the identifier informs the first display device that the auditorium has been successfully created;
receiving the access of the first display device according to the auditorium service identifier, and feeding back video data of the first video to the first display device according to the first video identifier;
and receiving local video data acquired by the first display device through its camera.
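Mirroring the client steps, the server-side method amounts to creating a room record, returning its identifier, feeding back the first video on access, and accepting the uploaded local video data. The following Python sketch (a hypothetical structure, not the actual server) illustrates that life cycle.

```python
# Hypothetical server-side handling of the auditorium life cycle (illustrative only).
import itertools

_room_counter = itertools.count(1)


class AuditoriumServer:
    def __init__(self) -> None:
        # auditorium_id -> {"video_id": ..., "members": set(), "frames": []}
        self.rooms: dict[str, dict] = {}

    def create_auditorium(self, first_video_id: str) -> str:
        # Create the auditorium service and return its identifier to the first display device.
        auditorium_id = f"room-{next(_room_counter):03d}"
        self.rooms[auditorium_id] = {"video_id": first_video_id, "members": set(), "frames": []}
        return auditorium_id

    def join(self, auditorium_id: str, device_id: str) -> dict:
        # Accept access according to the auditorium service identifier and feed back video data.
        room = self.rooms[auditorium_id]
        room["members"].add(device_id)
        return {"video_id": room["video_id"], "chunk": "first-video-bytes"}

    def receive_local_video(self, auditorium_id: str, device_id: str, frame: bytes) -> None:
        # Store the local video data collected by the device's camera; a real server
        # would forward it to the opposite ends in the same auditorium.
        self.rooms[auditorium_id]["frames"].append((device_id, frame))


if __name__ == "__main__":
    server = AuditoriumServer()
    rid = server.create_auditorium("video-42")
    print(server.join(rid, "first-display-device"))
    server.receive_local_video(rid, "first-display-device", b"frame-0")
```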
In some embodiments of the application, in the auditorium service management method, the request for creating an auditorium further includes a contact identifier;
according to the request for creating the auditorium, the auditorium service identifier is sent to the second display device corresponding to the contact identifier, so that the second display device accesses the auditorium service, plays the first video, and at the same time returns the opposite-end video data it acquires;
the method further comprises:
receiving video data of the second display device;
and sending the received video data to the opposite end of each display device, so that the opposite-end video window on the playing interface for the first video displays the opposite-end video data.
In some embodiments of the application, after feeding back video data of a first video to the first display device according to the first video identifier, the method further includes:
receiving a friend invitation request sent by the first display device according to the contact identifier corresponding to the selected contact control and the auditorium service identifier, and sending an invitation to the second display device according to the friend invitation request, so that the second display device plays the first video and at the same time returns the opposite-end video data it acquires;
the method further comprises: receiving video data of the second display device;
and sending the received video data to the opposite end of each display device, so that the opposite-end video window on the playing interface for the first video displays the opposite-end video data.
In some embodiments of the application, after feeding back video data of a first video to the first display device according to the first video identifier, the method further includes:
receiving interactive content generated by the first display device according to preset information;
and sending the interactive content to the other terminals accessing the auditorium service so that they display the interactive content.
In some embodiments of the application, in the auditorium service management method, feeding back the video data of the first video to the first display device according to the first video identifier includes:
if the first display device has permission to play the first video, feeding back the video data of the first video according to the first video identifier;
and if the first display device does not have permission to play the first video, feeding back permission reminder information indicating that the first display device is not entitled to play the first video.
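The permission branch above reduces to a single entitlement check before streaming. A minimal sketch, assuming a hypothetical entitlement table:

```python
# Hypothetical permission check before feeding back the first video (illustrative only).
ENTITLEMENTS = {"first-display-device": {"video-42"}}   # device -> videos it may play


def respond_to_play_request(device_id: str, video_id: str) -> dict:
    if video_id in ENTITLEMENTS.get(device_id, set()):
        # The device has permission to play the first video: feed back its video data.
        return {"status": "ok", "chunk": "first-video-bytes"}
    # Otherwise feed back permission reminder information instead of the stream.
    return {"status": "no_permission", "message": "You are not entitled to play this video."}


if __name__ == "__main__":
    print(respond_to_play_request("first-display-device", "video-42"))
    print(respond_to_play_request("second-display-device", "video-42"))
```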
In some embodiments of the present application, the video call is implemented by a video call service, which is a service parallel to but interrelated with the auditorium service.
In some embodiments of the present application, when the first display device uploads local video data according to the auditorium service identifier, the corresponding associated video call service is determined according to the auditorium service identifier and the local video data is sent to that associated video call service, so that the video call service forwards the audio and video data to the opposite end; the second display device likewise sends its audio and video data to the opposite end.
In some embodiments of the present application, the video call service has its own video call service identifier, independent of the auditorium service identifier, and the first display device uploads its local video data to the video call service according to the video call service identifier, so that the video call service sends the audio and video data to the opposite end; the second display device likewise sends its audio and video data to the opposite end.
In some embodiments of the present application, the server receives a request for creating an auditorium sent by the first display device; if the request was sent after the user selected the private auditorium control, the server creates a video call service associated with the auditorium service when the auditorium service is successfully created, and sends both the auditorium service identifier and the video call service identifier to the first display device.
In some embodiments of the present application, the server packages the auditorium service identifier and the video call service identifier into the auditorium invitation and sends the invitation to the second display device. The second display device accesses the auditorium service according to the auditorium service identifier, starts the video call application according to the video call service identifier so as to start the camera and/or recording device, and sends the acquired audio and video data to the opposite end through the video call service.
The first display device likewise accesses the auditorium service according to the auditorium service identifier, starts the video call application according to the video call service identifier so as to start the camera and/or recording device, and sends the acquired audio and video data to the opposite end through the video call service.
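For the private-auditorium case just described, the server creates the auditorium service and its associated video call service together and delivers both identifiers in a single invitation. A rough sketch of that pairing, with identifiers and fields chosen only for illustration:

```python
# Hypothetical pairing of an auditorium service with an associated video call service.
def create_private_auditorium(first_video_id: str, contact_ids: list[str]) -> dict:
    auditorium_id = "room-001"            # identifier of the auditorium service
    call_id = "call-001"                  # identifier of the associated video call service
    invitation = {
        "auditorium_id": auditorium_id,   # lets the second display device join synchronous playback
        "call_id": call_id,               # lets it start the camera/recorder and join the call
        "video_id": first_video_id,
        "invitees": contact_ids,
    }
    return {"auditorium_id": auditorium_id, "call_id": call_id, "invitation": invitation}


if __name__ == "__main__":
    result = create_private_auditorium("video-42", ["friend-B", "friend-C"])
    print(result["invitation"])
```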
In some embodiments of the present application, the server receives another request for creating an auditorium from the first display device and does not create an associated video call service, because that request was sent after the user selected the public auditorium control.
In some embodiments of the present application, the first display device sends a request for creating a video call service to the server according to the first auditorium service identifier; the server creates an associated video call service according to that identifier and, after the creation succeeds, returns a video call service identifier to the first display device. The first display device then starts the video call application according to the video call service identifier so as to start the camera and/or recording device, and sends the acquired audio and video data to the opposite end through the video call service. Fig. 14 is a sequence diagram of a chat-while-watching implementation method according to an embodiment of the present application. As shown in Fig. 14, with the chat-while-watching implementation method provided in the embodiment of the present application, watching a video and holding a video call can both be completed in one scene, that is, chatting while watching is truly implemented. The application environment of the method comprises a first display device, a plurality of second display devices and a server side, wherein "first" and "second" display devices are relative concepts, and the server side is divided by function into an auditorium service side and a video call service side; it can also be divided in more detail, for example the auditorium service side may be split into a room service, a message push service, a video playing service and the like. In the embodiment of the application, the establishment of the video call function is completed by a common video call server cooperating with an IM (instant messaging) system interface.
As shown in Fig. 14, the chat-while-watching implementation method provided in the embodiment of the present application includes:
and the first display equipment terminal sends a request for creating the auditorium to the auditorium service terminal to create the auditorium. In this embodiment of the application, the request for creating a theater, which is sent by the first display device to the server, includes a user ID, theater information (such as a name of the theater, a house owner customer ID (client code), a media ID list (including a movie ID and a tv show ID), a contact identifier, and the like.
The auditorium service side receives the auditorium creation request sent by the first display device side and verifies whether the first display device side meets the conditions for creating an auditorium. If the first display device side meets the conditions, an auditorium ID is generated for the creation request, associated with the request, and returned to the first display device side. If the first display device side does not meet the conditions, a service error code is returned to the first display device side to inform the user that creation failed, or a prompt and guidance are given to the first display device side.
The first display device side receives the auditorium information returned by the auditorium service side, and the auditorium is successfully established. After completing the creation of the auditorium, the first display device side sends a video playing request to the auditorium service side. The auditorium service side receives the video playing request sent by the first display device side and returns the corresponding video stream to the first display device side; the first display device side receives the video stream and plays the video on the video layer of the displayed page.
Further, in this embodiment of the application, when the auditorium service side receives the video playing request sent by the first display device, it verifies the attributes of the video requested to be played, such as whether it is a pay video. If the requested video is a pay video, it verifies whether the first display device side meets the conditions for watching pay video. Only when the first display device side meets those conditions does the auditorium service side return the video stream corresponding to the video playing request; if the first display device side does not meet the conditions for watching the pay video, a corresponding indication or guidance is given.
In the embodiment of the application, the first display device side sends the video playing progress to the auditorium server side. Optionally, the first display device sends the video playing progress to the auditorium server periodically.
In some embodiments of the application, the video call service unit and the video playing service unit are both part of the auditorium service. When a user accesses the auditorium service, all of its sub-services are activated at the same time; the call service unit and the video playing service unit communicate through an internal interface to accomplish synchronous video playing and forwarding of call video data. The first display device starts its camera and recording device according to the received auditorium service identifier to collect audio and video, and transmits the audio and video data to the opposite end according to the auditorium service identifier, so as to chat while watching. After receiving the invitation message containing the auditorium service identifier, the second display device receives an input operation instruction accepting the invitation, accesses the auditorium service according to the auditorium service identifier, acquires the first video data after receiving the access-success message, starts its camera and recording device to collect audio and video, and transmits the audio and video data to the opposite end according to the auditorium service identifier, so as to chat while watching.
In some embodiments of the application, after playback of the first video begins, the first display device sends a request to the video call service side to invite a friend to establish a video call, and this request carries the auditorium information. The video call service side pushes a video-call establishment invitation message to the invited friend (the second display device). The second display device receives the invitation message pushed by the video call service side and, when it accepts the invitation, returns an invitation acceptance message to the video call service side according to the invitation message. In the embodiment of the application, the auditorium information is carried in the video-call establishment invitation message. When the second display device accepts the invitation, it also sends a request for joining the auditorium to the auditorium service side according to the invitation message, and joins the auditorium. The join request includes the auditorium ID.
The video call service side receives the invitation acceptance information returned by the second display device side and establishes a video call between the first display device side and the second display device side. When the video call is established, the local and opposite-end video call windows of the first display device are displayed on a floating layer of the first display device's display page, and the local and opposite-end video call windows of the second display device are displayed on a floating layer of the second display device's display page. After the first display device side and the second display device establish the video call, the call is carried out based on the IM system.
The auditorium service side receives the request for joining the auditorium sent by the second display device side and sends a video stream to the second display device side according to the video playing progress of the first display device side; the second display device side receives the returned video stream and plays the video on the video layer of its displayed page, so that it watches the video synchronously with the first display device side. Further, in this embodiment of the application, the auditorium service side corrects the video playing progress of all the second display device sides in the auditorium according to the playing progress periodically sent by the first display device side, so as to ensure that all display devices in the auditorium play the video synchronously and thus that all users in the auditorium watch it synchronously.
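The periodic correction described above can be viewed as clamping each follower's playback position to the owner's reported progress whenever the drift exceeds a tolerance. A simplified sketch, with the tolerance value chosen purely for illustration:

```python
# Hypothetical synchronisation correction of second display devices against the owner's progress.
OWNER_PROGRESS_S = 754.0   # latest progress (seconds) periodically reported by the first display device
SYNC_TOLERANCE_S = 2.0     # maximum drift tolerated before a correction is issued (illustrative value)


def corrections(follower_progress: dict[str, float]) -> dict[str, float]:
    """Return, per second display device, the position it should seek to (if any)."""
    fixes = {}
    for device_id, progress in follower_progress.items():
        if abs(progress - OWNER_PROGRESS_S) > SYNC_TOLERANCE_S:
            fixes[device_id] = OWNER_PROGRESS_S
    return fixes


if __name__ == "__main__":
    print(corrections({"B": 753.5, "C": 748.0, "D": 760.2}))   # only C and D get corrected
```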
In the embodiment of the present application, the number of auditoriums that a user may join at the same time is limited, for example to one. Therefore, when the auditorium service side receives the request for joining the auditorium from the second display device, it determines whether the second display device meets the conditions for joining, for example by verifying that it has not already joined an auditorium at the current moment. When the second display device side meets the conditions, the auditorium service side allows it to join the auditorium; otherwise, a service error code is returned directly to the second display device side, and a corresponding prompt and guidance are given to its user.
According to the chat-while-watching implementation method, the auditorium is established through interaction between the first display device side and the auditorium service side, and the video is played through the auditorium; the first display device side sends friend invitations to the second display device side through the video call service side to establish video calls, the second display device side establishes a real-time video call with the first display device side through the video call service side, and the second display device side joins the auditorium through the auditorium service side so as to watch synchronously with the first display device side. Therefore, with the chat-while-watching implementation method provided by the embodiment of the application, watching a video and holding a video call can both be completed in one scene without switching back and forth, and during playback other friends can be invited smoothly without interrupting the video. That is, the method truly realizes chatting while watching.
In some embodiments of the present application, the auditorium service and the video call service providing the video call are two different services. After receiving the auditorium service identifier fed back by the auditorium service side (that is, the auditorium service module), the first display device accesses the auditorium service to obtain the data of the first video, automatically determines from the auditorium service identifier the contact identifier of the contact selected when the auditorium was created, and then calls a video call application, different from the auditorium application, to automatically send a request for establishing a video call to the video call service side of the server according to that contact identifier. The video call service side creates a call room and, after a video call invitation is generated, the server sends an auditorium invitation request to the second display device through the auditorium service side, the video call service side, or another service side, where the invitation request includes the auditorium service identifier, the video call room address, and the like. After receiving the input instruction accepting the invitation, the second display device starts its local camera and/or recording device, creates a playing interface for playing the first video, and creates a local video window and an opposite-end video window on that interface. The second display device accesses the auditorium service side (that is, the auditorium service module) according to the auditorium service identifier and joins the call room according to the video call room address. When the second display device successfully joins the auditorium service, the first video data stream is synchronized to it; at the same time, the call room sends a message that the second display device has joined to the other display devices, so that they are ready to accept the audio and video data collected by the second display device, and the video call is thereby established. Although the video call service and the auditorium service are two different services, this is imperceptible to the first and second display devices: the first display device only performs the operation of creating the auditorium, and the second display device only performs the operation of accepting the auditorium invitation, and with that the establishment of both the auditorium service and the video call service is completed.
In the chat-while-watching implementation method provided by the embodiment of the application, the first display device side sends heartbeat requests to the auditorium service side to indicate that it is still online, and the auditorium service side receives them. Optionally, the first display device side sends heartbeat requests periodically; when the auditorium service side does not receive a heartbeat request within a predetermined time, the first display device side is considered offline. Optionally, after receiving a heartbeat request, the auditorium service side returns the next heartbeat sending time to the first display device side, which then sends its next heartbeat request at that time; if the auditorium service side does not receive a heartbeat request at the next heartbeat sending time, the first display device side is considered offline. In this way, the first display device side and the auditorium service side monitor the online state of the first display device side through the exchange of heartbeat requests.
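The heartbeat exchange amounts to the server remembering, per device, when the next heartbeat is due and declaring the device offline once that deadline has passed. A minimal sketch, assuming a fixed interval chosen by the server:

```python
# Hypothetical heartbeat tracking on the auditorium server side (illustrative only).
import time

HEARTBEAT_INTERVAL_S = 30          # interval the server asks devices to use (assumed value)
_next_deadline: dict[str, float] = {}


def on_heartbeat(device_id: str) -> float:
    """Record a heartbeat and return the next heartbeat sending time to the device."""
    next_due = time.time() + HEARTBEAT_INTERVAL_S
    _next_deadline[device_id] = next_due
    return next_due


def is_offline(device_id: str, grace_s: float = 5.0) -> bool:
    """The device is considered offline once its deadline (plus a small grace period) has passed."""
    deadline = _next_deadline.get(device_id)
    return deadline is None or time.time() > deadline + grace_s


if __name__ == "__main__":
    on_heartbeat("first-display-device")
    print("offline?", is_offline("first-display-device"))   # False right after a heartbeat
```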
When the auditorium service side judges that the first display device side is offline, it treats this by default as a passive dismissal of the auditorium: it sends an auditorium dismissal notification to all second display devices in the auditorium and stops correcting and aligning the video playing progress within the auditorium. The video that was being played in the original auditorium continues to play to its end, and this playback is not affected by the dismissal. Optionally, the auditorium service side sends a dismissal notification to the video call service side, which sends a video call dismissal to all second display devices in the original auditorium, and the video call is cut off.
In addition, in the chat-while-watching implementation method provided by the application, the first display device side may actively send an auditorium dismissal request to the auditorium service side. When the auditorium service side receives such a request, it dismisses the auditorium, sends an auditorium dismissal notification to all second display devices in the auditorium, and stops correcting and aligning the video playing progress within the auditorium.
Therefore, when receiving a request for joining the auditorium sent by a second display device corresponding to an invited friend, the auditorium service side needs to verify whether the auditorium has been dismissed; only after determining that the auditorium has not been dismissed and that the second display device meets the conditions for joining does it allow the second display device to join the auditorium.
The chat-while-watching implementation method provided by the embodiment of the application is described below with reference to a specific example.
Assume the first display device side is A. A sends a request for creating an auditorium to the auditorium service side, which receives it, verifies that A meets the conditions for creating an auditorium, creates the auditorium, and returns the created auditorium information to A; A has thus created the auditorium successfully. Through human-machine interaction, A obtains the information of the video the user has selected to play and sends a video playing request to the auditorium service side accordingly; the auditorium service side returns a video stream to A according to the received request, and A receives the video stream and plays the video.
A then sends a request to the video call service side to invite friends to establish a video call; the request indicates that A has selected N friends (whose display device sides are B, C, D, and so on, referred to as second display device sides). The video call service side receives A's friend-invitation request, generates a video-call establishment invitation message from it, and sends the invitation message to B, C, D, and so on.
When B receives the video-call establishment invitation message and accepts the invitation, it returns invitation acceptance information to the video call service side according to the invitation message and sends a request for joining the auditorium to the auditorium service side according to the same message. The video call service side receives the acceptance information returned by B and establishes a video call between A and B; the auditorium service side receives B's request for joining the auditorium and verifies whether B meets the conditions for joining. When B meets the conditions, the auditorium service side allows B to join, and B joins the auditorium successfully. B then sends a video playing request to the auditorium service side, which returns a video stream to B according to the playing progress obtained from A; B receives the stream and plays the video. B and A can thus hold a real-time video call while watching the video synchronously.
When C receives the video-call establishment invitation message and accepts the invitation, it likewise returns invitation acceptance information to the video call service side and sends a request for joining the auditorium to the auditorium service side. The video call service side receives the acceptance information returned by C and establishes a video call among A, B, and C; the auditorium service side receives C's request for joining the auditorium and verifies whether C meets the conditions for joining. When C meets the conditions, the auditorium service side allows C to join, and C joins the auditorium successfully. C then sends a video playing request to the auditorium service side, which returns a video stream to C according to the playing progress obtained from A; C receives the stream and plays the video. C can thus hold real-time video calls with A and B while watching the video synchronously.
If A sends an auditorium dismissal request to the auditorium service side, then on receiving it the auditorium service side dismisses the auditorium and sends a dismissal notification to the online users such as B and C. The auditorium service side stops correcting and aligning the video playing progress within the auditorium, so online users such as B and C may no longer be watching synchronously, but they may continue to watch the video to its end. In addition, if the auditorium service side judges that A is offline, it treats this by default as a passive dismissal of the auditorium, dismisses the auditorium, and sends an auditorium dismissal notification to the online users, such as B, in the auditorium.
When D receives the invitation message and accepts the invitation, D returns invitation acceptance information to the video call service side according to the video-call establishment invitation message and sends a request for joining the auditorium to the auditorium service side according to the same message. The auditorium service side receives D's request and verifies whether D meets the conditions for joining. If the auditorium service side finds that the auditorium has already been dismissed when it receives D's request, D does not meet the conditions for joining, and the auditorium service side returns a service error code to D to remind D's user that the auditorium has been dismissed and that joining failed. At the same time, the video call service side returns a service error code to remind D's user that the video call has been dissolved and establishing the video call failed.
To illustrate the auditorium management method provided by the embodiment of the application, the application is further described below in conjunction with a specific usage scenario.
The display page of the first display device side is shown in Fig. 8, on which the first display device side receives user control instructions. When the first display device side receives a command to create a new auditorium and it meets the conditions for creating one, it enters the auditorium creation page shown in Fig. 9, where the auditorium type is selected as indicated. When the auditorium type (a private auditorium) has been selected, the page of Fig. 10 is entered, on which the user can set or modify the auditorium name, add a video to play, invite friends, and so on. If "add movie" is clicked, the movie selection page shown in Fig. 11 is entered, and the movie to be played is determined according to the user's selection. If a movie is selected in Fig. 11 and confirmed, the page shown in Fig. 12 is entered and displays the selected movie; if the selected movie is a pay movie, the user is also reminded that payment is required to watch it. If the user clicks "invite relatives and friends", the friend invitation page shown in Fig. 13 is entered; the user selects the friends to be invited, usually at most 5, and after completing the selection clicks confirm to enter the page shown in Fig. 15. After clicking "confirm and initiate", the display enters the page shown in Fig. 16, plays the selected movie, and displays a video call window at one or more positions, for example on the right side of the display, so that a video call is carried out while watching synchronously.
Based on the chat-while-watching implementation method provided by the embodiment of the application, the application further provides a chat-while-watching system comprising a display device, an auditorium service side, and a video call service side, which are configured to cooperatively execute the chat-while-watching implementation method of the embodiments above.
The auditorium service side is divided into a room service, a message push service, a video playing service, and the like: the room service handles service-level control such as auditorium creation, joining, and dismissal; the message push service completes message pushing within the auditorium; and the video playing service returns video streams according to video playing requests.
All other embodiments that a person skilled in the art can derive from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of one or more exemplary embodiments, it is to be understood that each aspect of the disclosure can be utilized independently and separately from other aspects of the disclosure to provide a complete disclosure.
It should be understood that the terms "first," "second," "third," and the like in the description and in the claims of the present application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application described herein can, for example, be implemented in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. An auditorium service management method applied to a first display device, the method comprising:
sending a request for creating an auditorium to a server, wherein the request for creating the auditorium comprises a first video identifier, the request for creating the auditorium is used for enabling the server to create an auditorium service, and the auditorium service is used for enabling different display devices accessing the auditorium service to simultaneously play a first video corresponding to the first video identifier;
receiving an identifier of an auditorium service, wherein the identifier of the auditorium service is sent by the server after the auditorium service is successfully created;
in response to receiving the auditorium service identification, accessing the auditorium service according to the auditorium service identification to enable the first display device to receive video data of a first video fed back by a server according to the first video identification and play the first video according to the video data;
and in response to receiving the service identifier of the auditorium, starting a camera of the first display device to acquire local video data, displaying the local video data on a local video window on a playing interface for playing the first video, and sending the local video data to the server.
2. The auditorium service management method according to claim 1, wherein the request for creating an auditorium further includes a contact identifier, and the request for creating an auditorium is further configured to enable the server, when creating an auditorium service, to send the identifier of the auditorium service to the second display device corresponding to the contact identifier according to the contact identifier, so that the second display device accesses the auditorium service and plays the first video, and simultaneously returns the opposite-end video data acquired by the second display device;
the method further comprises the following steps: and displaying the opposite-end video data on an opposite-end video window on a playing interface for playing the first video.
3. The auditorium service management method according to claim 1, wherein after playing said first video according to said video data, said method further comprises:
after receiving an operation instruction of selecting the invitation control, displaying a contact selection interface, wherein the contact selection interface comprises a plurality of contact controls used for representing different contacts;
sending a friend invitation request according to the contact identifier corresponding to the selected contact control and the auditorium service identifier, wherein the friend invitation request is used for inviting the second display device corresponding to the contact identifier to access the auditorium service so as to enable the second display device to play the first video, and simultaneously returning opposite-end video data acquired by the second display device;
the method further comprises the following steps: and displaying the opposite-end video data on an opposite-end video window on a playing interface for playing the first video.
4. The auditorium service management method according to claim 1, wherein after playing said first video according to said video data, said method further comprises:
receiving a user selection of preset information;
and generating interactive content according to the preset information and sending the interactive content to the server, so that other terminals accessing the auditorium service display the interactive content.
5. The auditorium service management method according to claim 1, wherein said receiving video data of the first video fed back by the server according to the first video identifier comprises:
when the first display device has permission to play the first video, receiving video data of the first video fed back by the server according to the first video identifier;
and when the first display device does not have permission to play the first video, receiving and displaying permission reminder information fed back by the server, wherein the permission reminder information is sent by the auditorium service when the server determines that the display device does not have permission to play the first video.
6. An auditorium service management method applied to a server, the method comprising:
receiving a request for creating an auditorium sent by a first display device side, wherein the request for creating the auditorium comprises a first video identifier;
in response to the received request for creating an auditorium, creating an auditorium service, wherein the auditorium service is used for enabling different display devices accessing the auditorium service to simultaneously play a first video corresponding to the first video identifier;
sending an identifier of the auditorium service to the first display device, wherein the identifier of the auditorium service is used for informing the first display device that the auditorium is successfully created;
receiving access of the first display device according to the identifier of the auditorium service, and feeding back video data of a first video to the first display device according to the first video identifier;
and receiving local video data acquired by the first display device through a camera of the first display device.
7. The auditorium service management method according to claim 6, wherein the request for creating the auditorium further comprises a contact identifier;
according to the request for creating the auditorium, the identifier of the auditorium service is sent to a second display device corresponding to the contact identifier, so that the second display device accesses the auditorium service and plays the first video, and at the same time returns the opposite-end video data acquired by the second display device;
the method further comprises the following steps:
receiving video data of the second display device;
and sending the received video data to the opposite end of each display device so as to enable an opposite end video window on a playing interface for playing the first video to display the opposite end video data.
8. The auditorium service management method according to claim 6, wherein after feeding back video data of the first video to said first display device according to said first video identifier, said method further comprises:
receiving a friend invitation request sent by the first display device according to the contact identifier corresponding to the selected contact control and the auditorium service identifier, and sending an invitation to the second display device according to the friend invitation request, so that the second display device plays the first video, and simultaneously returns opposite-end video data acquired by the second display device;
the method further comprises the following steps: receiving video data of the second display device;
and sending the received video data to the opposite end of each display device so as to enable an opposite end video window on a playing interface for playing the first video to display the opposite end video data.
9. The auditorium service management method according to claim 6, wherein the auditorium service comprises a video playing service unit, a video call service unit and a message service unit, wherein the video playing service unit is used for controlling the synchronous playing of the first video among different devices, and the video call service unit is used for performing a video call among different devices.
10. A display device, comprising:
a display configured to display a user interface, a video playback interface, and display device local video data;
a controller for communicative connection with the display, the controller configured to:
sending a request for creating an auditorium to a server, wherein the request for creating the auditorium comprises a first video identifier, the request for creating the auditorium is used for enabling the server to create an auditorium service, and the auditorium service is used for enabling different display devices accessing the auditorium service to simultaneously play a first video corresponding to the first video identifier;
receiving an identifier of an auditorium service, wherein the identifier of the auditorium service is sent by the server after the auditorium service is successfully created;
in response to receiving the auditorium service identification, accessing the auditorium service according to the auditorium service identification to enable the first display device to receive video data of a first video fed back by a server according to the first video identification and play the first video according to the video data;
and in response to receiving the service identifier of the auditorium, starting a camera of the first display device to acquire local video data, displaying the local video data on a local video window on a playing interface for playing the first video, and sending the local video data to the server.
CN202010380112.XA 2019-08-18 2020-05-08 Projection hall service management method and application Active CN112399264B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/108503 WO2021031940A1 (en) 2019-08-18 2020-08-11 Screening room service management method, interaction method, display device, and mobile terminal
CN202080024297.9A CN113661715B (en) 2019-08-18 2020-08-11 Service management method, interaction method, display equipment and mobile terminal for projection hall

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2019107614577 2019-08-18
CN201910761457 2019-08-18

Publications (2)

Publication Number Publication Date
CN112399264A true CN112399264A (en) 2021-02-23
CN112399264B CN112399264B (en) 2022-06-14

Family

ID=74603785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010380112.XA Active CN112399264B (en) 2019-08-18 2020-05-08 Projection hall service management method and application

Country Status (1)

Country Link
CN (1) CN112399264B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190176035A1 (en) * 2011-02-01 2019-06-13 Timeplay Inc. Systems and methods for interactive experiences and controllers therefor
CN105898509A (en) * 2015-11-26 2016-08-24 乐视网信息技术(北京)股份有限公司 Video playing interaction method and system
CN105872835A (en) * 2015-12-18 2016-08-17 乐视致新电子科技(天津)有限公司 Method and device for achieving synchronous film watching at different places, and intelligent device
CN108667798A (en) * 2018-03-27 2018-10-16 上海临奇智能科技有限公司 A kind of method and system of virtual viewing
CN109309849A (en) * 2018-08-31 2019-02-05 北京优酷科技有限公司 The interactive approach and device of multimedia content

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023130988A1 (en) * 2022-01-04 2023-07-13 聚好看科技股份有限公司 Display device and channel recommendation method
CN114615518A (en) * 2022-05-11 2022-06-10 飞狐信息技术(天津)有限公司 Video playing method and device, electronic equipment and storage medium
CN117112251A (en) * 2022-05-27 2023-11-24 荣耀终端有限公司 Communication method and related equipment

Also Published As

Publication number Publication date
CN112399264B (en) 2022-06-14

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant