WO2021189358A1 - Display device and volume adjustment method


Info

Publication number
WO2021189358A1
Authority
WO
WIPO (PCT)
Prior art keywords
volume
audio
display device
output
user
Application number
PCT/CN2020/081417
Other languages
English (en)
Chinese (zh)
Inventor
王之奎
孙永瑞
Original Assignee
海信视像科技股份有限公司
Application filed by 海信视像科技股份有限公司
Priority to PCT/CN2020/081417
Publication of WO2021189358A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 - Sound input; Sound output

Definitions

  • This application relates to the technical field of display devices, and in particular to a display device and a volume adjustment method.
  • the display device can provide users with playback screens such as audio, video, and pictures.
  • display devices can not only provide users with live TV program content received through data broadcasting, but also provide users with various applications and service content such as online videos and online games.
  • video chat has also become a basic function of the display device, and furthermore, a scene where there are two audio outputs at the same time also arises. For example, a scene where an audio and video chat is performed while playing a video program, or a scene where an audio and video chat is performed while playing music.
  • the display device can be controlled to perform the above functions based on the user's operation of physical hard keys or virtual keys on a control device such as a remote control or mobile terminal, or based on the voice input by the user received through its own microphone or the microphone on the control device. For example, when the display device is playing a program, the user adjusts the volume through the volume key on the remote control.
  • This application provides a display device and a volume adjustment method to solve the problem of how to better handle the two channels of sound.
  • the present application provides a display device, which is characterized in that it includes:
  • the controller is used for:
  • a volume setting interface is presented on the display, the volume setting interface including a volume setting item for associating the output volume of the voice call with the output volume of the audio and video program;
  • the output volume of the audio and video program and the output volume of the voice call are adjusted in association, so that the adjusted output volume of the voice call is different from the adjusted output volume of the audio and video program.
  • this application also provides a volume adjustment method, which includes:
  • a volume setting interface is presented on the display, the volume setting interface including a volume setting item for associating the output volume of the voice call with the output volume of the audio and video program;
  • the output volume of the audio and video program and the output volume of the voice call are adjusted in association, so that the adjusted output volume of the voice call is different from the adjusted output volume of the audio and video program.
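As a minimal sketch (not the implementation of this application) of the associated adjustment described above, the two output volumes can be linked so that changing the program volume also moves the voice-call volume while keeping the two adjusted values different. The class name, the fixed offset, and the 0-100 volume scale are assumptions made for illustration.

```java
/**
 * Minimal sketch of "associated" volume adjustment: changing the program volume
 * also moves the voice-call volume, keeping the two adjusted values different.
 * The offset and the 0-100 scale are illustrative assumptions, not parameters
 * taken from this application.
 */
public class LinkedVolume {
    private static final int MIN = 0, MAX = 100;
    private int programVolume = 50;   // output volume of the audio/video program
    private int callVolume = 70;      // output volume of the voice call
    private int callOffset = 20;      // how far the call volume is kept from the program volume

    /** Adjust the program volume; the call volume follows in association. */
    public void setProgramVolume(int volume) {
        programVolume = clamp(volume);
        callVolume = clamp(programVolume + callOffset);
        if (callVolume == programVolume) {
            // Guarantee that the two adjusted volumes differ, as described in the claims.
            callVolume = (programVolume < MAX) ? programVolume + 1 : programVolume - 1;
        }
    }

    /** A volume setting item on the volume setting interface could change this link. */
    public void setCallOffset(int offset) {
        callOffset = offset;
        setProgramVolume(programVolume); // re-apply the association
    }

    public int getProgramVolume() { return programVolume; }
    public int getCallVolume()    { return callVolume; }

    private static int clamp(int v) { return Math.max(MIN, Math.min(MAX, v)); }
}
```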
  • Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment
  • Fig. 2 exemplarily shows a block diagram of the hardware configuration of the control device 100 according to the embodiment
  • FIG. 3 exemplarily shows a block diagram of the hardware configuration of the display device 200 according to the embodiment
  • FIG. 4 exemplarily shows a block diagram of the hardware architecture of the display device 200 according to FIG. 3;
  • FIG. 5 exemplarily shows a schematic diagram of the functional configuration of the display device 200 according to the embodiment
  • Fig. 6a exemplarily shows a schematic diagram of the software configuration in the display device 200 according to the embodiment
  • FIG. 6b exemplarily shows a schematic diagram of the configuration of the application program in the display device 200 according to the embodiment
  • FIG. 7 exemplarily shows a schematic diagram of a user interface in the display device 200 according to the embodiment.
  • Fig. 8 exemplarily shows a user interface in a watching and chatting scenario
  • Fig. 9 exemplarily shows the processing procedure for multiple channels of audio data
  • Fig. 10 exemplarily shows another user interface in a watching and chatting scenario
  • Fig. 11 exemplarily shows another user interface in a watching and chatting scenario
  • Fig. 12 exemplarily shows another user interface in a watching and chatting scenario
  • Fig. 13 exemplarily shows another user interface in a watching and chatting scenario
  • Fig. 14 exemplarily shows a flow chart of a volume control method.
  • the embodiment of the present application provides a display device and a volume adjustment method.
  • the display device provided in this application may be a display device with a multi-controller architecture, such as the display device with a dual-controller (dual hardware system) architecture shown in Figures 3-6 of this application, or a display device with a non-dual-controller architecture; the structure of the display device is not limited in this application.
  • the volume adjustment method provided in this application can be applied to display devices such as smart TVs, and can of course also be applied to other handheld devices that provide voice and data connectivity and have wireless connection functions, or to other processing equipment connected to a wireless modem, such as mobile phones (or "cellular" phones) and computers with mobile terminals; it can also be applied to portable, pocket-sized, handheld, computer-built-in or vehicle-mounted mobile devices that exchange data with a wireless access network.
  • various external device interfaces are usually provided on the display device to facilitate the connection of different peripheral devices or cables to achieve corresponding functions.
  • when a high-resolution camera is connected to the interface of the display device, if the hardware system of the display device does not have a hardware interface capable of receiving the source data of the high-pixel camera, the data received by the camera cannot be presented on the screen of the display device.
  • the hardware system of traditional display devices only supports one hard decoding resource, and usually only supports video decoding up to 4K resolution. Therefore, to realize video chat while watching Internet TV without reducing the definition of the network video picture, the hard decoding resource (usually the GPU in the hardware system) is used to decode the network video, while the video chat picture is processed by soft decoding on a general-purpose processor such as the CPU.
  • this application discloses a dual hardware system architecture to realize multiple channels of video chat data (at least one local video).
  • The term "module" used in the various embodiments of this application may refer to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that can execute the function associated with that component.
  • The term "remote control" used in the various embodiments of this application refers to a component of an electronic device (such as the display device disclosed in this application) that can generally control the electronic device wirelessly within a short distance.
  • the component can generally use infrared and/or radio frequency (RF) signals and/or Bluetooth to connect with electronic devices, and can also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
  • a handheld touch remote control uses a user interface in a touch screen to replace most of the physical built-in hard keys in general remote control devices.
  • The term "gesture" used in the embodiments of the present application refers to a user behavior that expresses an expected idea, action, goal, and/or result through a change of hand shape or a hand movement.
  • The term "hardware system" used in the various embodiments of this application may refer to a physical component with computing, control, storage, input, and output functions, composed of mechanical, optical, electrical, and magnetic devices such as an integrated circuit (IC) and a printed circuit board (PCB).
  • the hardware system is usually also referred to as a motherboard or a chip.
  • Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment. As shown in FIG. 1, the user can operate the display device 200 by controlling the device 100.
  • the control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods, and is used to control the display device 200 wirelessly or in another short-distance manner. The display device 200 may also be controlled in a wired manner.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
  • the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power on/off keys on the remote control to control the functions of the display device 200.
  • the control device 100 can also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, or a notebook computer, which can communicate with the display device 200 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or other networks, and control the display device 200 through an application program corresponding to the display device 200.
  • the application can provide users with various controls through an intuitive user interface (UI, User Interface) on the screen associated with the smart device.
  • User interface is a medium interface for interaction and information exchange between applications or operating systems and users. It realizes the conversion between the internal form of information and the form acceptable to users.
  • the commonly used form of user interface is graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It can be an icon, window, control and other interface elements displayed on the display screen of an electronic device.
  • the controls can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • both the mobile terminal 100B and the display device 200 can install software applications, so that the connection and communication between the two can be realized through a network communication protocol, and the purpose of one-to-one control operation and data communication can be realized.
  • For example, the mobile terminal 100B can establish a control command protocol with the display device 200, the remote control keyboard can be synchronized to the mobile terminal 100B, and the function of controlling the display device 200 can be realized by controlling the user interface on the mobile terminal 100B; alternatively, the audio and video content displayed on the mobile terminal 100B can be transmitted to the display device 200 to realize a synchronous display function.
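The control command protocol between the mobile terminal 100B and the display device 200 is not specified above; as a purely hypothetical sketch, a key command could be carried over the local network as a small JSON message. The port number and the message fields below are invented for illustration only.

```java
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

/** Hypothetical sketch: the mobile terminal 100B sends a key command to the
 *  display device 200 over the LAN. The port (56789) and the JSON field names
 *  are invented for illustration; the real protocol is not described here. */
public class RemoteCommandSender {
    public static void sendKey(String displayDeviceIp, String keyName) throws Exception {
        try (Socket socket = new Socket(displayDeviceIp, 56789);
             Writer out = new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8)) {
            // A minimal command message, e.g. {"type":"key","key":"VOLUME_UP"}
            out.write("{\"type\":\"key\",\"key\":\"" + keyName + "\"}\n");
            out.flush();
        }
    }

    public static void main(String[] args) throws Exception {
        sendKey("192.168.1.50", "VOLUME_UP"); // example usage on the local network
    }
}
```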
  • the display device 200 can also communicate with the server 300 through multiple communication methods.
  • the display device 200 may be allowed to communicate with the server 300 via a local area network, a wireless local area network, or other networks.
  • the server 300 may provide various contents and interactions to the display device 200.
  • For example, the display device 200 transmits and receives information, interacts with an Electronic Program Guide (EPG), receives software program updates, or accesses a remotely stored digital media library.
  • the server 300 may be a group or multiple groups, and may be one or more types of servers.
  • the server 300 provides other network service content such as video-on-demand and advertising services.
  • the display device 200 may be a liquid crystal display, an OLED (Organic Light Emitting Diode) display, a projection display device, or a smart TV.
  • the specific display device type, size, resolution, etc. are not limited, and those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as required.
  • the display device 200 may additionally provide a smart network TV function with a computer support function. Examples include Internet TV, Smart TV, Internet Protocol TV (IPTV), and so on.
  • the display device may not have a broadcast receiving function.
  • a camera may be connected or provided on the display device 200 for presenting a picture captured by the camera on the display interface of the display device or other display devices, so as to realize interactive chats between users.
  • the picture captured by the camera can be displayed on the display device in full screen, half screen, or in any selectable area.
  • the camera is connected to the rear shell of the display through a connecting plate and is fixedly installed in the upper middle of the rear shell of the display; as an installable method, it can be fixedly installed at any position on the rear shell of the display, it being only necessary to ensure that its image capture area is not blocked by the rear shell.
  • the image capture area and the display device have the same display orientation.
  • the camera can be connected to the display rear shell through a connecting plate or other conceivable connectors.
  • the connector is equipped with a lifting motor; when the user wants to use the camera or an application wants to use the camera, the camera can be raised out of the display, and when it is not needed, it can be embedded behind the rear shell to protect the camera from damage.
  • the camera used in this application may have 16 million pixels to achieve the purpose of ultra-high-definition display. In actual use, a camera with higher or lower than 16 million pixels can also be used.
  • the content displayed in different application scenarios of the display device can be merged in a variety of different ways, so as to achieve functions that cannot be achieved by traditional display devices.
  • the user can have a voice call with at least one other user (that is, at least one other terminal) while enjoying an audio and video program.
  • the presentation of audio and video programs can be used as the background picture, the sound of the audio and video programs can be used as the background sound, the voice call window is displayed on the background picture, and the voice call sound can be played simultaneously with the background sound through the display device.
  • the function of the display device to play the above two channels of sounds at the same time can be called “watching and chatting at the same time", and the scene where the above two channels of sounds exist at the same time is called the “watching and chatting" scene.
  • the chat window may not be displayed, and only the chat voice may be output. That is, the monitor plays audio and video programs, and the program sound and chat sound are output at the same time.
  • when the user triggers an instruction to mute the chat voice, only the audio and video program sound is output, and the chat voice of other users is converted into text or a barrage and presented on the display.
  • At least one video chat is conducted with other terminals.
  • the user can video chat with at least one other user while entering the education application for learning.
  • students can realize remote interaction with teachers while learning content in educational applications. Figuratively, this function can be called "learning while chatting".
  • a video chat is conducted with a player entering the game.
  • when a player enters a game application to participate in a game, remote interaction with other players can be realized. Figuratively, this function can be called "watch and play".
  • the game scene is integrated with the video picture, and the portrait in the video picture is cut out and displayed on the game picture to improve the user experience.
  • somatosensory games such as ball games, boxing games, running games, dancing games, etc.
  • human body postures and movements are acquired through the camera, with limb detection and tracking and detection of key points of human skeleton data, and are then integrated with the animations in the game to realize games in scenes such as sports and dance.
  • the user can interact with at least one other user in video and voice in the K song application.
  • when at least one user enters the application in a chat scene, multiple users can jointly complete the recording of a song.
  • the user can turn on the camera locally to obtain pictures and video; figuratively, this function can be called "looking in the mirror".
  • Fig. 2 exemplarily shows a configuration block diagram of the control device 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
  • the control device 100 is configured to control the display device 200, and can receive user input operation instructions, and convert the operation instructions into instructions that can be recognized and responded to by the display device 200, acting as an intermediary for the interaction between the user and the display device 200 effect.
  • the user operates the channel addition and subtraction keys on the control device 100, and the display device 200 responds to the channel addition and subtraction operations.
  • control device 100 may be a smart device.
  • control device 100 can install various applications for controlling the display device 200 according to user requirements.
  • the mobile terminal 100B or other smart electronic devices can perform similar functions to the control device 100 after installing an application for controlling the display device 200.
  • by installing applications, the user can use the various function keys or virtual buttons of the graphical user interface that can be provided on the mobile terminal 100B or other smart electronic devices, so as to realize the functions of the physical keys of the control device 100.
  • the controller 110 includes a processor 112, RAM 113 and ROM 114, a communication interface, and a communication bus.
  • the controller 110 is used to control the running and operation of the control device 100, the communication and cooperation between its internal components, and the external and internal data processing functions.
  • the communicator 130 realizes the communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the received user input signal is sent to the display device 200.
  • the communicator 130 may include at least one of communication modules such as a WIFI module 131, a Bluetooth module 132, and an NFC module 133.
  • in the user input/output interface 140, the input interface includes at least one of input interfaces such as a microphone 141, a touch panel 142, a sensor 143, and a button 144.
  • the user can implement the user instruction input function through voice, touch, gesture, pressing and other actions.
  • the input interface converts the received analog signal into a digital signal and the digital signal into a corresponding instruction signal, and sends it to the display device 200.
  • the output interface includes an interface for sending the received user instruction to the display device 200.
  • it may be an infrared interface or a radio frequency interface.
  • in the case of an infrared signal interface, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol, and is then sent to the display device 200 via the infrared sending module.
  • in the case of a radio frequency signal interface, the user input instruction needs to be converted into a digital signal, which is then modulated according to the radio frequency control signal modulation protocol and sent to the display device 200 by the radio frequency sending terminal.
  • control device 100 includes at least one of a communicator 130 and an output interface.
  • the control device 100 is equipped with a communicator 130, such as WIFI, Bluetooth, NFC and other modules, which can encode the user input command through the WIFI protocol, or the Bluetooth protocol, or the NFC protocol, and send it to the display device 200.
  • the memory 190 is used to store various operating programs, data, and applications for driving and controlling the control device 100 under the control of the controller 110.
  • the memory 190 can store various control signal instructions input by the user.
  • the power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller 110, and may be a battery and related control circuit.
  • FIG. 3 exemplarily shows a hardware configuration block diagram of a hardware system in the display device 200 according to an exemplary embodiment.
  • the structural relationship of the hardware system can be shown in Figure 3.
  • one hardware system in the dual hardware system architecture is referred to as the first hardware system or the first controller, and the other hardware system is referred to as the second hardware system or the second controller.
  • the first controller includes various processors and various interfaces of the first controller
  • the second controller includes various processors and various interfaces of the second controller.
  • a relatively independent operating system may be installed in each of the first controller and the second controller, and the operating system of the first controller and the operating system of the second controller can communicate with each other through a communication protocol. For example, the framework layer of the operating system of the first controller and the framework layer of the operating system of the second controller can communicate for command and data transmission, so that there are two independent but interrelated subsystems in the display device 200.
  • the first controller and the second controller can realize connection, communication and power supply through a plurality of different types of interfaces.
  • the interface type of the interface between the first controller and the second controller may include a general-purpose input/output (GPIO), a USB interface, an HDMI interface, a UART interface, and the like.
  • One or more of these interfaces can be used for communication or power transmission between the first controller and the second controller.
  • the second controller can be powered by an external power source, and the first controller can be powered by the second controller instead of the external power source.
  • the first controller may also include an interface for connecting other devices or components, such as the MIPI interface for connecting a camera (Camera) shown in FIG. 3, Bluetooth interface, etc.
  • the second controller may also include a VBY interface for connecting to the TCON (timing controller) of the display screen, an I2S interface for connecting to a power amplifier (AMP) and speaker, as well as an IR/Key interface, a USB interface, a WiFi interface, a Bluetooth interface, an HDMI interface, a Tuner interface, and so on.
  • FIG. 4 is only an exemplary description of the dual hardware system architecture of the present application, and does not represent a limitation to the present application. In practical applications, both hardware systems can contain more or less hardware or interfaces as required.
  • FIG. 4 exemplarily shows a block diagram of the hardware architecture of the display device 200 according to FIG. 3.
  • the hardware system of the display device 200 may include a first controller and a second controller, and modules connected to the first controller or the second controller through various interfaces.
  • the second controller may include a tuner and demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 270, and a power supply.
  • the second controller may also include more or fewer modules.
  • the tuner and demodulator 220 is used to perform modulation and demodulation processing such as amplification, mixing, and resonance on broadcast and television signals received through wired or wireless methods, so as to demodulate, from among multiple wireless or cable broadcast and television signals, the audio and video signals carried in the frequency of the TV channel selected by the user, as well as additional information (such as EPG data signals).
  • the signal paths of the tuner and demodulator 220 can be of many kinds, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or Internet broadcasting; according to different modulation types, the modulation method may be digital modulation or analog modulation; and according to different types of received television signals, the tuner and demodulator 220 may demodulate analog signals and/or digital signals.
  • the tuner and demodulator 220 is also used to respond to the TV channel frequency selected by the user and to the TV signal carried by that frequency, according to the user's selection and under the control of the controller 210.
  • the tuner and demodulator 220 may also be in an external device, such as an external set-top box.
  • the set-top box outputs TV audio and video signals through modulation and demodulation, and inputs them to the display device 200 through the external device interface 250.
  • the communicator 230 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator 230 may include a WIFI module 231, a Bluetooth communication protocol module 232, a wired Ethernet communication protocol module 233, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules.
  • the display device 200 may establish a control signal and a data signal connection with an external control device or content providing device through the communicator 230.
  • the communicator may receive the control signal of the remote controller 100A according to the control of the controller.
  • the external device interface 250 is a component that provides data transmission between the second controller 210 and the first controller and other external devices.
  • the external device interface can be connected to external devices such as set-top boxes, game devices, and notebook computers in a wired/wireless manner, and can receive data from the external devices, such as video signals (e.g. moving images), audio signals (e.g. music), and additional information (e.g. EPG data).
  • the external device interface 250 may include any one or more of a high-definition multimedia interface (HDMI) terminal 251, a composite video blanking synchronization (CVBS) terminal 252, an analog or digital component terminal 253, a universal serial bus (USB) terminal 254, and a red, green, and blue (RGB) terminal (not shown in the figure).
  • the controller 210 controls the work of the display device 200 and responds to user operations by running various software control programs (such as an operating system and/or various application programs) stored on the memory 290.
  • the controller 210 includes a random access memory RAM 214, a read-only memory ROM 213, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus.
  • the RAM 214 and the ROM 213, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected by a bus.
  • the graphics processor 216 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which generates the various objects produced by the arithmetic unit and displays the rendering result on the display 280.
  • the CPU processor 212 is configured to execute the operating system and application program instructions stored in the memory 290, and, according to the various interactive instructions received from the outside, to execute various applications, data, and content, so as to finally display and play various audio and video content.
  • the CPU processor 212 may include multiple processors.
  • the multiple processors may include one main processor and multiple or one sub-processors.
  • the main processor is used to perform some operations of the display device 200 in the pre-power-on mode, and/or to display images in the normal mode.
  • the communication interface may include a first interface 218-1 to an nth interface 218-n. These interfaces may be network interfaces connected to external devices via a network.
  • the controller 210 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
  • the object may be any one of the selectable objects, such as a hyperlink or an icon.
  • operations related to the selected object are, for example, the operation of displaying the page, document, or image connected to a hyperlink, or the operation corresponding to the icon.
  • the user command for selecting the UI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to the voice spoken by the user.
  • the memory 290 stores various software modules used to drive and control the display device 200.
  • various software modules stored in the memory 290 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
  • the basic module is a bottom-level software module used for signal communication between various hardware in the display device 200 and sending processing and control signals to the upper-level module.
  • the detection module is a management module used to collect various information from various sensors or user input interfaces, and perform digital-to-analog conversion, analysis and management.
  • the voice recognition module includes a voice parsing module and a voice command database module.
  • the display control module is a module for controlling the display 280 to display image content, and can be used to play information such as multimedia image content and UI interfaces.
  • the communication module is a module used for control and data communication with external devices.
  • the browser module is a module used to perform data communication between browsing servers.
  • the service module is a module used to provide various services and various applications.
  • the memory 290 is also used to store and receive external data and user data, images of various items in various user interfaces, and visual effect diagrams of focus objects.
  • the user input interface is used to send a user's input signal to the controller 210, or to transmit a signal output from the controller to the user.
  • the control device (such as a mobile terminal or a remote control) can send input signals input by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user input interface, and then forward the user input interface to the controller;
  • the control device may receive output signals such as audio, video, or data output from the user input interface processed by the controller, and display the received output signal or output the received output signal in the form of audio or vibration.
  • the user may input a user command on a graphical user interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the graphical user interface (GUI).
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
  • the video processor 260-1 is used to receive video signals and, according to the standard codec protocol of the input signal, perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a video signal that can be displayed or played directly on the display 280.
  • the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • the demultiplexing module is used to demultiplex the input audio and video data stream. For example, if MPEG-2 is input, the demultiplexing module will demultiplex into a video signal and an audio signal.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • An image synthesis module such as an image synthesizer, is used to superimpose and mix the GUI signal generated by the graphics generator with the zoomed video image according to user input or by itself, to generate a displayable image signal.
  • the frame rate conversion module is used to convert the frame rate of the input video, for example converting input video at 24 Hz, 25 Hz, 30 Hz, or 60 Hz to a frame rate of 60 Hz, 120 Hz, or 240 Hz, where the input frame rate can be related to the source video stream and the output frame rate can be related to the refresh rate of the display; the conversion is usually realized by frame insertion or a similar method.
  • the display formatting module is used to change the signal output by the frame rate conversion module into a signal conforming to the display format of a device such as a display, for example formatting the signal output by the frame rate conversion module to output an RGB data signal.
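To make the frame rate conversion step concrete, the sketch below uses the simplest possible strategy, nearest-frame repetition, to map a 24 fps input onto a 60 Hz output. Real display devices typically use motion-compensated frame insertion; this naive version only illustrates the timing relationship between input and output frames and is not the method of this application.

```java
/** Naive frame-rate conversion by nearest-frame repetition: for each output
 *  frame time, the closest input frame is reused. Display devices usually use
 *  motion-compensated frame insertion instead; this is only a timing sketch. */
public class FrameRateConverter {
    /** Returns, for each output frame, the index of the input frame to show. */
    public static int[] mapFrames(int inputFps, int outputFps, int inputFrameCount) {
        double durationSeconds = (double) inputFrameCount / inputFps;
        int outputFrameCount = (int) Math.round(durationSeconds * outputFps);
        int[] mapping = new int[outputFrameCount];
        for (int i = 0; i < outputFrameCount; i++) {
            double t = (double) i / outputFps;            // presentation time of output frame i
            int src = (int) Math.round(t * inputFps);     // nearest input frame
            mapping[i] = Math.min(src, inputFrameCount - 1);
        }
        return mapping;
    }

    public static void main(String[] args) {
        // 24 fps -> 60 Hz: each input frame is shown 2 or 3 times (a 3:2-like cadence).
        int[] firstTen = java.util.Arrays.copyOf(mapFrames(24, 60, 24), 10);
        System.out.println(java.util.Arrays.toString(firstTen)); // [0, 0, 1, 1, 2, 2, 2, 3, 3, 4]
    }
}
```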
  • the display 280 is used to receive the image signal input from the video processor 260-1, to display video content and images and a menu control interface.
  • the display 280 includes a display component for presenting a picture and a driving component for driving image display.
  • the displayed video content can be from the video in the broadcast signal received by the tuner and demodulator 220, or from the video content input by the communicator or the interface of an external device.
  • the display 280 simultaneously displays a user manipulation interface UI that is generated in the display device 200 and used for controlling the display device 200.
  • depending on the type of the display 280, a driving component for driving the display is also included.
  • the display 280 is a projection display, it may also include a projection device and a projection screen.
  • the audio processor 260-2 is used to receive audio signals and, according to the standard codec protocol of the input signal, perform decompression and decoding, as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played in the speaker 272.
  • the audio output interface 270 is used to receive the audio signal output by the audio processor 260-2 under the control of the controller 210.
  • the audio output interface may include a speaker 272, or may output to an external sound-generating device through an external audio output terminal 274, such as an external audio terminal or a headphone output terminal.
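If the program audio and the call audio are rendered by two separate playback objects, the two output volumes can be applied independently before the mixed signal reaches the audio output interface 270 and the speaker 272. The sketch below assumes an Android MediaPlayer for the program and an AudioTrack for the decoded call audio; whether the display device actually uses these classes is an assumption made only for illustration.

```java
import android.media.AudioTrack;
import android.media.MediaPlayer;

/** Sketch only: applies the two associated volumes (0-100) to two assumed
 *  playback paths, a MediaPlayer for the audio/video program and an AudioTrack
 *  for the decoded voice-call audio. The use of these Android classes is an
 *  assumption for illustration, not the device's actual audio path. */
public class DualPathVolume {
    private final MediaPlayer programPlayer; // plays the audio/video program
    private final AudioTrack callTrack;      // plays the decoded voice-call PCM

    public DualPathVolume(MediaPlayer programPlayer, AudioTrack callTrack) {
        this.programPlayer = programPlayer;
        this.callTrack = callTrack;
    }

    /** Apply the associated volumes, e.g. values produced by a helper such as
     *  the LinkedVolume sketch shown earlier. */
    public void apply(int programVolume, int callVolume) {
        float programGain = programVolume / 100f;          // linear gain, 0.0 to 1.0
        float callGain = callVolume / 100f;
        programPlayer.setVolume(programGain, programGain); // left/right channels
        callTrack.setVolume(callGain);                     // AudioTrack gain (API 21+)
    }
}
```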
  • the video processor 260-1 may include one or more chips.
  • the audio processor 260-2 may also include one or more chips.
  • the video processor 260-1 and the audio processor 260-2 may be separate chips, or may be integrated with the controller 210 in one or more chips.
  • the power supply is used to provide power supply support for the display device 200 with power input from an external power supply under the control of the controller 210.
  • the power supply may include a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200, such as a power interface that provides an external power supply in the display device 200.
  • the first controller may include a controller 310, a communicator 330, a detector 340, and a memory 390. In some embodiments, it may also include a user input interface, a video processor, an audio processor, a display, and an audio output interface. In some embodiments, there may also be a power supply that independently supplies power to the first controller.
  • the communicator 330 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator 330 may include a WIFI module 331, a Bluetooth communication protocol module 332, a wired Ethernet communication protocol module 333, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules.
  • the communicator 330 of the first controller and the communicator 230 of the second controller also interact with each other.
  • the WiFi module 231 of the second controller is used to connect to an external network and generate network communication with an external server or the like.
  • the WiFi module 331 of the first controller is used to connect to the WiFi module 231 of the second controller without directly connecting to an external network or the like. Therefore, for the user, a display device as in the above embodiment presents only one WiFi account to the outside.
  • the detector 340 is a component used by the first controller of the display device to collect signals from the external environment or interact with the outside.
  • the detector 340 may include a light receiver 342, a sensor used to collect the intensity of ambient light, so that display parameters can be changed adaptively according to the collected ambient light; it may also include an image collector 341, such as a camera or webcam, which can be used to collect external environment scenes and to collect user attributes or gestures for interacting with the user, so that display parameters can be changed adaptively and user gestures can be recognized to realize interaction with the user.
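As a small illustration of adaptively changing a display parameter from the collected ambient light, the mapping below converts a lux reading from the light receiver 342 into a backlight level. The breakpoints and the 0-100 backlight scale are assumed values, not taken from this application.

```java
/** Illustrative only: maps an ambient-light reading (lux) from the light
 *  receiver 342 to a backlight level so the picture adapts to the room.
 *  The breakpoints and the 0-100 backlight scale are assumptions. */
public class AdaptiveBacklight {
    public static int backlightFor(float ambientLux) {
        if (ambientLux < 10f)  return 20;   // dark room: dim the backlight
        if (ambientLux < 100f) return 45;   // normal indoor lighting
        if (ambientLux < 500f) return 70;   // bright room
        return 100;                         // daylight: maximum backlight
    }
}
```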
  • the external device interface 350 provides components for data transmission between the controller 310 and the second controller or other external devices.
  • the external device interface can be connected to external devices such as set-top boxes, game devices, notebook computers, etc., in a wired/wireless manner.
  • the controller 310 controls the work of the display device 200 and responds to user operations by running various software control programs (such as installed third-party applications, etc.) stored on the memory 390 and interacting with the second controller.
  • the controller 310 includes a read-only memory ROM313, a random access memory RAM314, a graphics processor 316, a CPU processor 312, a communication interface 318, and a communication bus.
  • the ROM 313 and the RAM 314, the graphics processor 316, the CPU processor 312, and the communication interface 318 are connected by a bus.
  • the CPU processor 312 runs the system startup instruction in the ROM 313 and copies the operating system stored in the memory 390 to the RAM 314 in order to start running the operating system. After the operating system is started, the CPU processor 312 copies the various application programs in the memory 390 to the RAM 314 and then starts running and launching the various application programs.
  • the CPU processor 312 is used to execute the operating system and application instructions stored in the memory 390, to communicate with the second controller and exchange signals, data, and instructions with it, and to receive various external interactive instructions so as to execute various applications, data, and content, and finally display and play various audio and video content.
  • the communication interface may include the first interface 318-1 to the nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or network interfaces connected to the second controller via a network.
  • the controller 310 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 310 may perform an operation related to the object selected by the user command.
  • the graphics processor 316 is used to generate various graphics objects, such as icons, operation menus, and user input instructions to display graphics. Including an arithmetic unit, which performs operations by receiving various interactive instructions input by the user, and displays various objects according to the display attributes. As well as including a renderer, various objects obtained based on the arithmetic unit are generated, and the rendering result is displayed on the display 280.
  • Both the graphics processor 316 of the first controller and the graphics processor 216 of the second controller can generate various graphics objects. The difference is that, if application 1 is installed in the first controller and application 2 is installed in the second controller, then when the user is in the interface of application 1 and performs user input instructions within application 1, the graphics processor 316 of the first controller generates the graphic objects; and when the user is on the interface of application 2 and performs user input instructions within application 2, the graphics processor 216 of the second controller generates the graphic objects.
  • Fig. 5 exemplarily shows a schematic diagram of a functional configuration of a display device according to an exemplary embodiment.
  • the memory 390 of the first controller and the memory 290 of the second controller are respectively used to store the operating system, application programs, content, user data, and so on.
  • the system operation of driving the display device 200 and responding to various user operations is executed under the control of the controller 310 of the first controller and the controller 210 of the second controller.
  • the memory 390 of the first controller and the memory 290 of the second controller may include volatile and/or nonvolatile memory.
  • the memory 290 is specifically used to store the operating program for driving the controller 210 in the display device 200, and to store the various application programs built into the display device 200, the various application programs downloaded by the user from external devices, the various graphical user interfaces related to the applications, the various objects related to the graphical user interfaces, user data information, and the various internal data supporting the applications.
  • the memory 290 is used to store system software such as an operating system (OS) kernel, middleware, and applications, as well as to store input video data and audio data, and other user data.
  • the memory 290 is specifically used to store driver programs and related data such as the video processor 260-1 and the audio processor 260-2, the display 280, the communication interface 230, the tuner and demodulator 220, and the input/output interface.
  • the memory 290 may store software and/or programs.
  • the software programs used to represent an operating system (OS) include, for example, kernels, middleware, application programming interfaces (APIs), and/or application programs.
  • the kernel may control or manage system resources or functions implemented by other programs (such as the middleware, APIs, or application programs), and may also provide interfaces to allow the middleware, APIs, or applications to access the controller in order to control or manage system resources.
  • the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external command recognition module 2907, a communication control module 2908, a power control module 2910, an operating system 2911, other application programs 2912, a browser module, and so on.
  • the controller 210 executes various software programs in the memory 290 to perform functions such as broadcast and television signal reception and demodulation, TV channel selection control, volume selection control, image control, display control, audio control, external command recognition, communication control, optical signal reception, and power control, as well as a software control platform supporting the various functions and browser functions.
  • the memory 390 stores various software modules used to drive and control the display device 200.
  • various software modules stored in the memory 390 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules. Since the functions of the memory 390 and the memory 290 are relatively similar, please refer to the memory 290 for related parts, which will not be repeated here.
  • the memory 390 includes an image control module 3904, an audio control module 3906, an external command recognition module 3907, a communication control module 3908, a light receiving module 3909, an operating system 3911, other application programs 3912, a browser module, and so on.
  • the controller 310 executes various software programs in the memory 390 to perform functions such as image control, display control, audio control, external command recognition, communication control, optical signal reception, and power control, as well as a software control platform supporting the various functions and browser functions.
  • the external command recognition module 2907 of the second controller and the external command recognition module 3907 of the first controller can recognize different commands.
  • the external command recognition module 3907 of the first controller may include a graphic recognition module 3907-1; the graphic recognition module 3907-1 stores a graphic database, and when the camera receives a graphic instruction from the outside, the instruction is matched against the instructions in the graphic database so as to control the display device.
  • the external command recognition module 2907 of the second controller may include a voice recognition module 2907-2; the voice recognition module 2907-2 stores a voice database, and when the voice receiving device receives an external voice instruction, the instruction is matched against the instructions in the voice database so as to control the display device.
  • similarly, a control device 100 such as a remote controller is connected to the second controller, and a key command recognition module performs command interaction with the control device 100.
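A minimal sketch of the database-matching idea used by the recognition modules above: a recognized instruction (for example, a voice phrase) is looked up in a command table and mapped to an action. The phrases, the actions, and the Runnable-based dispatch are assumptions for illustration, not the modules' real interfaces.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

/** Sketch of matching a recognized external instruction (e.g. a voice phrase)
 *  against a command database, as the recognition modules above do.
 *  The phrases and actions are illustrative. */
public class CommandDatabase {
    private final Map<String, Runnable> commands = new HashMap<>();

    public CommandDatabase() {
        commands.put("volume up",   () -> System.out.println("increase the program volume"));
        commands.put("volume down", () -> System.out.println("decrease the program volume"));
        commands.put("mute chat",   () -> System.out.println("mute the voice-call channel"));
    }

    /** Returns true if the recognized phrase matched an entry and was executed. */
    public boolean dispatch(String recognizedPhrase) {
        return Optional.ofNullable(commands.get(recognizedPhrase.trim().toLowerCase()))
                       .map(action -> { action.run(); return true; })
                       .orElse(false);
    }
}
```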
  • Fig. 6a exemplarily shows a configuration block diagram of the software system in the display device 200 according to an exemplary embodiment.
  • the operating system 2911 includes operating software for processing various basic system services and for implementing hardware-related tasks, and acts as a medium for data processing between application programs and hardware components.
  • part of the operating system kernel may include a series of software to manage the hardware resources of the display device and provide services for other programs or software codes.
  • part of the operating system kernel may include one or more device drivers, and the device drivers may be a set of software codes in the operating system to help operate or control devices or hardware associated with the display device.
  • the driver may contain code to operate video, audio, and/or other multimedia components. Examples include displays, cameras, Flash, WiFi, and audio drivers.
  • the accessibility module 2911-1 is used to modify or access the application program to realize the accessibility of the application program and the operability of its display content.
  • the communication module 2911-2 is used to connect to other peripherals via related communication interfaces and communication networks.
  • the user interface module 2911-3 is used to provide objects that display the user interface for access by various applications, and can realize user operability.
  • the control application 2911-4 is used to control process management, including runtime applications, etc.
  • the event transmission system 2914 can be implemented in the operating system 2911 or in the application 2912. In some embodiments, it is implemented both in the operating system 2911 and in the application program 2912, to monitor various user input events and, according to the recognition results of the various events or sub-events, to carry out one or more sets of pre-defined operation procedures in response.
  • the event monitoring module 2914-1 is used to monitor input events or sub-events of the user input interface.
  • the event recognition module 2914-2 is used to define the various events for the various user input interfaces, recognize the various events or sub-events, and transmit them for processing so as to execute the corresponding one or more groups of handlers.
  • the event or sub-event refers to the input detected by one or more sensors in the display device 200 and the input of an external control device (such as the control device 100, etc.).
  • sub-events of remote control button command input of control devices such as: various sub-events of voice input, gesture input sub-events of gesture recognition, and sub-events of remote control button command input of control devices.
  • the one or more sub-events from the remote control come in various forms, including but not limited to one or a combination of pressing the up/down/left/right keys, pressing the OK key, and pressing and holding a key, as well as operations of non-physical buttons, such as moving, pressing, and releasing.
  • the interface layout management module 2913, which directly or indirectly receives the various user input events or sub-events monitored by the event transmission system 2914, is used to update the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, as well as the size, position, level, and other execution operations related to the layout of the container in which the interface is presented.
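The monitoring, recognition, and dispatch chain described above can be pictured with a very small listener registry. The event names and the listener interface below are hypothetical and only illustrate the idea of mapping recognized events to pre-defined operation procedures.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Sketch of the event transmission idea: input events or sub-events are
 *  recognized by name and dispatched to the handlers registered for them.
 *  The event names and the listener interface are hypothetical. */
public class EventBus {
    public interface Listener { void onEvent(String event, Object payload); }

    private final Map<String, List<Listener>> listeners = new HashMap<>();

    /** Loosely corresponds to registering a pre-defined operation procedure. */
    public void register(String event, Listener listener) {
        listeners.computeIfAbsent(event, k -> new ArrayList<>()).add(listener);
    }

    /** Loosely corresponds to the recognition module handing an event on. */
    public void dispatch(String event, Object payload) {
        for (Listener listener : listeners.getOrDefault(event, List.of())) {
            listener.onEvent(event, payload);
        }
    }

    public static void main(String[] args) {
        EventBus bus = new EventBus();
        bus.register("remote.key.VOLUME_UP",
                (event, payload) -> System.out.println("interface layout: show the volume bar"));
        bus.dispatch("remote.key.VOLUME_UP", null);
    }
}
```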
  • the application layer of the display device includes various application programs that can be executed on the display device 200.
  • the application layer 2912 of the second controller may include, but is not limited to, one or more applications, such as video-on-demand applications, application centers, game applications, and so on.
  • the application layer 3912 of the first controller may include, but is not limited to, one or more applications, such as a live TV application, a media center application, and so on. It should be noted that the application programs contained in the first controller and the second controller are determined according to the operating system and other designs; this application does not specifically define or divide the application programs contained in the first controller and the second controller.
  • Live TV applications can provide live TV through different sources.
  • a live TV application can provide TV signals using input from cable TV, over-the-air broadcasting, satellite services, or other types of live TV services.
  • the live TV application can display the video of the live TV signal on the display device 200.
  • Video-on-demand applications can provide videos from different storage sources. Unlike live TV applications, VOD provides video display from certain storage sources. For example, video on demand can come from the server side of cloud storage, and from the local hard disk storage that contains stored video programs.
  • Media center applications can provide various multimedia content playback applications.
  • the media center can provide services that are different from live TV or video on demand, and users can access various images or audio through the media center application.
  • Application center can provide storage of various applications.
  • the application program may be a game, an application program, or some other application program that is related to a computer system or other device but can be run on a display device.
  • the application center can obtain these applications from different sources, store them in the local storage, and then run on the display device 200.
  • Since independent operating systems may be installed in the first controller and the second controller, there are two independent but interrelated subsystems in the display device 200.
  • For example, both the first controller and the second controller can be independently installed with Android and various apps, so that each chip can realize a certain function on its own, and the first controller and the second controller can also cooperate to realize a certain function.
  • FIG. 7 exemplarily shows a schematic diagram of a user interface in the display device 200 according to an exemplary embodiment.
  • the user interface includes multiple view display windows, for example, a first view display window 201 and a play screen 202, where the play screen includes one or more different items laid out.
  • the user interface also includes a selector indicating that the item is selected, and the position of the selector can be moved through user input to change the selection of different items.
  • multiple view display windows can present display screens of different levels.
  • the first view display window can present the content of the video chat item
  • the second view display window can present the content of the application layer item (eg, webpage video, VOD display, application screen, etc.).
  • The view display windows are presented with different priorities, and view display windows at different levels therefore differ in display priority.
  • the priority of the system layer is higher than the priority of the application layer.
  • The screen display of the view display window of the system layer is not blocked, and when the size and position of the view display window of the application layer change according to the user's choice, the size and position of the view display window of the system layer are not affected.
  • Display screens of the same level can also be presented. In this case, the selector can switch between the first view display window and the second view display window, and when the size and position of the first view display window change, the size and position of the second view display window may change accordingly.
  • the display device plays at least two channels of sound at the same time.
  • FIG. 8 exemplarily shows a user interface in a watching and chatting scenario.
  • the display device simultaneously plays a video program and makes a voice call with three terminal users.
  • the display device plays the video program in full screen, and the voice call window is suspended on the video playback screen in the form of a small window.
  • the watching and chatting scene is not limited to the exemplarily shown scene of watching a video program and making a voice call, but also includes a scene of listening to an audio program and making a video call.
  • If a video program or audio program is paused during a voice call, or the display presents a static user interface instead of a dynamic video picture, the audio output channel of the program remains open, so this type of scene is also regarded as the aforementioned scene with at least two channels of sound, that is, a watching and chatting scene.
  • the audio and video programs played by the display device may be live TV programs or network programs.
  • the display device can conduct multi-channel video chats with multiple other terminal devices while playing audio and video programs.
  • the display device can play more than 2 sound signals at the same time.
  • The controller can receive at least two kinds of audio data: one is the audio data of audio and video programs (which further include live TV programs and network programs), and the other is the audio data of the voice call.
  • The controller uses the audio processor 260 to decompress and decode the aforementioned at least two channels of audio data according to the standard codec protocol of the input signal, and to perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification. The processed sound signals are then superimposed and sent to the audio output interface 270 (such as the speaker 272), so that the audio and video program sound and the voice call sound are finally output through the audio output interface 270, for example played through the speaker 272.
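  • For illustration only, the superposition step can be sketched as follows; the function and parameter names (mixChannels, programGain, callGain) are assumptions made for this sketch and do not correspond to identifiers in the embodiments.

```kotlin
// A minimal sketch (not the patent's implementation) of how two decoded and processed
// PCM channels can be scaled by independent gains and superimposed before being sent to
// the audio output interface.
fun mixChannels(
    programSamples: FloatArray,
    callSamples: FloatArray,
    programGain: Float,
    callGain: Float
): FloatArray {
    val length = maxOf(programSamples.size, callSamples.size)
    return FloatArray(length) { i ->
        val p = if (i < programSamples.size) programSamples[i] * programGain else 0f
        val c = if (i < callSamples.size) callSamples[i] * callGain else 0f
        // Superimpose the two channels and clamp to the valid sample range [-1, 1].
        (p + c).coerceIn(-1f, 1f)
    }
}
```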
  • The user can operate the display device normally by operating the control device.
  • the user can adjust the output volume value of the sound signal by operating the control device, and the power amplifier manager controls the gain of the sound signal according to the output volume value set by the user.
  • the output volume of the sound signal can be adjusted by operating a physical volume button (volume +, volume -) or virtual volume button on a remote control or a mobile terminal, or voice input.
  • When the display shows the user interface as shown in FIG. 8, if the controller receives an instruction to adjust the volume input by the user through the control device 100, the controller responds to the instruction and displays, on the upper layer of the user interface, an interface element indicating the current output volume.
  • The interface element may be a volume adjustment bar as shown in FIG. 10. The user can learn the current output volume value of the sound signal from the volume value shown in the volume adjustment bar, and the power amplifier manager controls the gain of the sound signal according to that output volume value.
  • Because the power amplifier manager in the audio processor adjusts the gains of the TV program sound signal and the voice call sound signal according to the same output volume value and then superimposes them, the output volume of the TV program is the same as the output volume of the voice call. From the user's point of view, in the "watching and chatting" scene the two sounds are therefore mixed together and interfere with each other, so the user cannot tell them apart.
  • FIG. 8 and FIGS. 11-13 exemplarily show schematic diagrams of the volume adjustment interaction process in the watching and chatting scene.
  • When an audio and video program is played and a voice call is in progress at the same time, as shown in FIG. 8, the user can input an instruction to adjust the volume by operating the control device, and the controller responds to the instruction by presenting a volume setting interface on the display.
  • The volume setting interface displays an interface element representing the output volume of the audio and video program, and also displays volume setting items for associating the output volume of the voice call with the output volume of the audio and video program.
  • FIG. 11 exemplarily shows a volume setting interface.
  • The volume setting interface is displayed in the form of a view window on the upper layer of the audio and video program playback screen and the voice call window.
  • The volume setting interface includes a volume adjustment bar 111 and volume setting items 112-114. The volume adjustment bar indicates the output volume value of the audio and video program, and the volume setting items 112-114 are the "standard mode" 112, the "loud mode" 113, and the "AI mute mode" 114, respectively.
  • the volume setting item is used to associate the output volume of the voice call with the output volume of the audio and video program.
  • In some embodiments, the user directly adjusts the output volume of the audio and video program by operating the control device, and the controller then adjusts the output volume of the voice call according to the output volume value of the audio and video program, so that the two output volumes are different.
  • Specifically, the controller adjusts the output volume value of the voice call to a second volume according to a first volume, the first volume being the output volume value of the audio and video program. The first volume and the second volume are related, and the first volume ≠ the second volume.
  • a volume setting item is preset with one adjustment coefficient, and different volume setting items correspond to different adjustment coefficients.
  • the adjustment coefficient is used to multiply the output volume value of the audio and video program to obtain the output volume value of the voice call, so that the output volume of the audio and video program is different from the output volume of the voice call.
  • If the adjustment coefficient corresponding to a certain volume setting item is 1, the output volume value of the voice call does not change relative to the output volume value of the audio and video program; that is, the output volume value of the audio and video program and the output volume value of the voice call are the same.
  • The three volume setting items 112-114 shown in FIG. 11 each correspond to an adjustment coefficient.
  • For example, the adjustment coefficient corresponding to the "standard mode" 112 may be 1.1, meaning that during a voice call the human voice volume of the peer user is increased by 10% relative to the playback volume of the audio and video program; the adjustment coefficient corresponding to the "loud mode" 113 may be 1.2, meaning that during a video chat the human voice volume of the peer user is increased by 20% relative to the playback volume of the audio and video program; and the adjustment coefficient corresponding to the "AI mute mode" 114 may be 0, meaning that the voice of the peer user is muted during the voice call, as sketched below.
  • In some embodiments, the user can continue to operate the volume keys (volume + or volume -) on the control device to input an instruction to adjust the volume. The controller responds to the instruction by obtaining the pre-saved default item among the volume setting items, and then adjusts the output volume of the audio and video program and the output volume of the voice call according to the default item.
  • For example, if the default item is the "standard mode" 112, the controller obtains its adjustment coefficient 1.1 and then, according to the output volume value of the audio and video program (the first volume), adjusts the output volume of the voice call to the second volume, where the second volume = the first volume × 1.1.
  • When the display shows the volume setting interface as shown in FIG. 11, the user can select a certain volume setting item by operating the control device, and the controller adjusts the output volume of the audio and video program and the output volume of the voice call in association according to the selected volume setting item.
  • the "standard” mode and “sounding” mode can achieve the effect of human voice enhancement, so that in the watching and chatting scene, the sound of the audio and video program is played as the background sound, and the sound of the voice call is the foreground The sound is played, so that the user can easily distinguish the source of the sound and avoid confusion.
  • the adjustment coefficients of the "standard mode” 112 and the “boom mode” 113 are not limited to 1.1 or 1.2. In other embodiments, the "standard mode” 112 and the “boom mode” 113 may also be preset Any value of.
  • the volume setting items displayed in the volume setting interface are not limited to the above three types. In other embodiments, other volume setting items may be provided for the user to choose from according to user needs.
  • A sound reduction effect can also be achieved through the adjustment coefficient corresponding to a volume setting item. For example, when the adjustment coefficient corresponding to a certain volume setting item is 0.5, the human voice volume of the peer user is reduced by 50% relative to the playback volume of the audio and video program, thereby achieving the effect of weakening the human voice.
  • For the at least two channels of sound in the watching and chatting scene, by adjusting the volumes of the sounds played simultaneously by the display device, the channels can be played at different output volumes, so that the user can easily distinguish the sources of the sounds.
  • the adjustment coefficient corresponding to the "AI silent mode” shown in FIG. 11 can be 0, which means that the voice volume of the video chatting person is reduced to 0.
  • the "AI Mute Mode” when selected (ie turned on), the text corresponding to the call audio data can be posted on the playback screen in the form of a barrage The upper level.
  • In this case, the controller stops playing the sound signal of the voice call and presents the text corresponding to the call data of the peer device on the top layer of the user interface in the form of a barrage, as shown in FIG. 12.
  • The following takes the display device 200B as an example to describe the implementation of the above-mentioned "AI mute mode" 114, where the display device 200B conducts a voice call with one or more other terminal devices while playing an audio and video program.
  • An exemplary description will be given below in conjunction with the voice call communication process between the display device 200B and the display device 200A.
  • In some embodiments, the text corresponding to the call data is converted by the data sending end from the call data it collects and is then sent to the data receiving end for display.
  • In this case, the display device 200A collects the user's call data A through its microphone. When the display device 200B has not turned on the "AI mute mode" 114, the display device 200A sends the collected call data A to the display device 200B. When the display device 200B has turned on the "AI mute mode" 114, the display device 200A synchronizes the collected call data A to the voice server, which recognizes and converts the call data A to obtain the text A corresponding to the call data A; the display device 200A then sends the text A to the display device 200B.
  • For the display device 200B: when the "AI mute mode" 114 is not turned on, it receives the call data A sent by the display device 200A, extracts the voice call sound signal from the call data A, processes the sound signal through the power amplifier manager, and plays the processed signal through the speaker; when it receives the user input selecting the "AI mute mode" 114, it notifies the display device 200A; after the "AI mute mode" 114 is turned on, the display device 200B receives the text A sent by the display device 200A and displays the text A on the top layer of the user interface in real time, thereby achieving the display effect shown in FIG. 12.
  • That is, the display device 200B notifies the display device 200A when it receives, on the volume setting interface shown in FIG. 11, the user's operation of selecting the "AI mute mode" 114, so that the display device 200A converts the collected call data A into the text A and sends the text A to the display device 200B.
  • In other embodiments, the data receiving end recognizes and converts the received call data to obtain and display the corresponding text.
  • In this case, the display device 200A collects the call data A of the local user through its microphone and sends the collected call data A to the display device 200B. For the display device 200B: when the "AI mute mode" 114 is not turned on, it receives the call data A sent by the display device 200A, extracts the voice call sound signal from the call data A, processes the sound signal through the power amplifier manager, and plays it through the speaker; when the "AI mute mode" 114 is turned on, the display device 200B receives the call data A sent by the display device 200A, synchronizes the received call data A to the voice server, which recognizes and converts the call data A to obtain the text A corresponding to the call data A, and then displays the text A on the top layer of the user interface.
  • The display of the text A has a certain time delay Ta, which is at least the length of time required for the display device 200A or the display device 200B to obtain the text A through voice server recognition.
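  • The receiver-side behavior of the "AI mute mode" described above can be sketched roughly as follows. The injected lambdas (playThroughAmplifier, recognizeViaVoiceServer, showAsBarrage) are hypothetical stand-ins for the power amplifier manager, the voice server, and the barrage display; they are not identifiers from this application.

```kotlin
// Rough sketch of receiver-side handling of call data with "AI mute mode" on or off.
class CallData(val pcm: FloatArray, val senderId: String)

class CallAudioHandler(
    private val playThroughAmplifier: (FloatArray) -> Unit,
    private val recognizeViaVoiceServer: (CallData) -> String,
    private val showAsBarrage: (String) -> Unit
) {
    var aiMuteEnabled = false

    fun onCallDataReceived(data: CallData) {
        if (!aiMuteEnabled) {
            // Normal path: extract and play the voice call sound signal.
            playThroughAmplifier(data.pcm)
        } else {
            // AI mute path: the voice is not played; it is recognized as text (which
            // introduces the delay Ta) and shown as a barrage on the top UI layer.
            showAsBarrage(recognizeViaVoiceServer(data))
        }
    }
}
```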
  • Figure 13 exemplarily shows another volume setting interface.
  • The interface shown in FIG. 13 also includes a "volume association adjustment" switch 115, which is used to turn the above-mentioned volume association adjustment function on or off. The user can turn this function on or off by operating the item. Specifically, when the volume association adjustment function is turned on, each volume setting item in the interface is an operable item that can be selected by the user; when the function is turned off, the volume setting items are inoperable items and cannot be selected by user operations.
  • The user can cancel the display of the volume setting interface by operating the control device, for example by pressing the "return" or "exit" button, to return to the interface shown in FIG. 8.
  • FIG. 14 is a flowchart of a volume control method exemplarily shown in some embodiments of the application. As shown in FIG. 14, the method may include:
  • Step 01 When the audio and video program is played and the voice call is in progress at the same time, an instruction to adjust the volume is received from the user.
  • the embodiment of the present application proposes the concept of a sound playing scene, and the sound playing scene of a display device includes a watching and chatting scene and a normal scene.
  • The watching and chatting scene refers to a scene that includes two or more sound signal outputs, for example, a scene where audio and video program playback and a voice call are performed at the same time.
  • In some embodiments, when a user input instruction to adjust the volume is received, it is first determined whether the current sound playback scene is a watching and chatting scene. If it is a watching and chatting scene, step 02 is executed; if it is not a watching and chatting scene, that is, a normal scene, an interface as shown in FIG. 10 is presented and the volume of the audio and video program played by the display device is adjusted according to the user's input.
  • the user can input an instruction to adjust the volume by operating the control device. For example, when the user presses a physical volume button for increasing or decreasing the volume on the remote control 100A, the controller receives an instruction to increase or decrease the volume sent by the remote control. For another example, when the user clicks a virtual volume button used to increase or decrease the volume on the mobile terminal 100B, the controller receives an instruction to increase or decrease the volume sent by the mobile terminal.
  • the user can also input voice instructions for increasing or decreasing the volume through the remote control, the display device, or the microphone on the mobile terminal. The user can also input an instruction to increase or decrease the volume by pressing the local volume button on the housing of the display device for increasing or decreasing the volume.
  • Step 02 In response to the instruction to adjust the volume, a volume setting interface is presented on the display.
  • The volume setting interface includes an interface element for indicating the output volume of the audio and video program and a volume setting item for associating the output volume of the voice call with the output volume of the audio and video program.
  • The volume setting interface involved in step 02 may be the volume setting interface shown in FIG. 11, the interface element used to indicate the output volume of the audio and video program may be the volume adjustment bar 111 in FIG. 11, and the volume setting items may be the items 112-114 in FIG. 11.
  • one volume setting item is preset with one adjustment coefficient, and different volume setting items correspond to different adjustment coefficients.
  • the adjustment coefficient is used to multiply the output volume value of the audio and video program to obtain the output volume value of the voice call, so that the output volume of the audio and video program is different from the output volume of the voice call.
  • If the adjustment coefficient corresponding to a certain volume setting item is 1, the output volume value of the voice call does not change relative to the output volume value of the audio and video program; that is, the output volume value of the audio and video program and the output volume value of the voice call are the same.
  • Step 03 In response to the user's selection operation on a volume setting item, the output volume of the audio and video program and the output volume of the voice call are adjusted in association according to the selected volume setting item, so that the adjusted output volume of the voice call is different from the adjusted output volume of the audio and video program.
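  • As an illustration, the flow of steps 01-03 above can be condensed into the following sketch. The injected lambdas are placeholders for the scene detection and interface presentation modules, which are described but not named in this application.

```kotlin
// Condensed sketch of steps 01-03 in FIG. 14.
class VolumeFlow(
    private val isWatchingAndChatting: () -> Boolean,
    private val showPlainVolumeBar: () -> Unit,         // normal scene (FIG. 10)
    private val showVolumeSettingInterface: () -> Unit  // watching-and-chatting scene (FIG. 11)
) {
    // Steps 01-02: a volume adjustment instruction is received and the proper UI is shown.
    fun onAdjustVolumeInstruction() {
        if (isWatchingAndChatting()) showVolumeSettingInterface() else showPlainVolumeBar()
    }

    // Step 03: a volume setting item is selected; both output volumes are adjusted in
    // association so that they end up different.
    fun onVolumeItemSelected(programVolume: Int, coefficient: Double): Pair<Int, Int> =
        programVolume to (programVolume * coefficient).toInt()
}
```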
  • The audio stream sources of the sound signals include TV program audio based on physical-type channels (ATV, DTV, HDMI, etc.) and other audio; the other audio mainly includes network program audio based on network-type channels (STREAM_MUSIC) and call audio (STREAM_VOICE_CALL).
  • Among them, the TV program audio based on physical-type channels (ATV, DTV, HDMI, etc.) and the network program audio based on network-type channels (STREAM_MUSIC) are the audio of audio and video programs, while the call audio (STREAM_VOICE_CALL) is the audio of the voice call.
  • For each of these audio streams, the output volume value can be adjusted independently, so that its gain is controlled independently according to the corresponding output volume value.
  • The output volume control mainly includes two types of branches: the main volume MainVoice and the sub-volume SubVoice. The sub-volume SubVoice further includes the first sub-volume MusicVoice and the second sub-volume CallVoice, where the main volume MainVoice is the volume corresponding to the TV program, the first sub-volume MusicVoice is the volume corresponding to the network program, and the second sub-volume CallVoice is the volume corresponding to the voice call.
  • First, the audio stream source of the audio and video program is obtained; the audio stream source is either a TV program based on a physical-type channel or a network program based on a network-type channel. If the audio stream source of the audio and video program is a TV program, the main volume value (MainVoice) is adjusted to the first volume, the first volume being the output volume value of the TV program; if the audio stream source of the audio and video program is a network program, the first sub-volume value (MusicVoice) is adjusted to the first volume, the first sub-volume value being the volume value of the network program.
  • Then the first volume is multiplied by the adjustment coefficient corresponding to the selected volume setting item to obtain the second volume, which is the target volume of the voice call, and the second sub-volume value (CallVoice) is adjusted to the second volume, the second sub-volume value being the volume value of the voice call. The routing of the two volumes is sketched below.
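  • A minimal sketch of this volume routing follows. The AudioBranches holder, the function name, and the 0-100 volume range are assumptions; MainVoice, MusicVoice, and CallVoice mirror the branch names used in this embodiment.

```kotlin
// Sketch of routing the first volume to the proper branch and deriving the second volume.
enum class StreamSource { TV_PROGRAM, NETWORK_PROGRAM }

class AudioBranches {
    var mainVoice = 0  // TV program volume (physical-type channels: ATV, DTV, HDMI, ...)
    var musicVoice = 0 // network program volume (STREAM_MUSIC)
    var callVoice = 0  // voice call volume (STREAM_VOICE_CALL)
}

fun applyAssociatedVolumes(
    branches: AudioBranches,
    source: StreamSource,
    firstVolume: Int,
    coefficient: Double
) {
    // Route the first volume to the branch matching the program's audio stream source.
    when (source) {
        StreamSource.TV_PROGRAM -> branches.mainVoice = firstVolume
        StreamSource.NETWORK_PROGRAM -> branches.musicVoice = firstVolume
    }
    // Second volume = first volume x adjustment coefficient of the selected setting item.
    branches.callVoice = (firstVolume * coefficient).toInt().coerceIn(0, 100)
}
```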
  • In this way, the at least two channels of sound can be played at different output volumes, so that the user can easily distinguish the source of each sound.
  • In some embodiments, when the display shows the volume setting interface as shown in FIG. 11 and an instruction for adjusting the volume input by the user is received, the default item pre-saved among the volume setting items is obtained, and then the output volume of the audio and video program and the output volume of the voice call are adjusted in association according to the default item.
  • the default item may be the volume setting item selected by the user in the last operation, or the system default volume setting item.
  • Each volume setting item can carry a state mark; for example, the volume setting item selected by the user, or the system default volume setting item, is marked as selected, and the remaining volume setting items are marked as unselected.
  • the default item can be obtained by traversing the mark state of each volume setting item.
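  • Obtaining the default item by traversing the marked states can be sketched as follows; the data class is an assumption standing in for the actual interface items.

```kotlin
// Minimal sketch of obtaining the default item from the marked state of each item.
data class VolumeSettingItem(val name: String, val coefficient: Double, var selected: Boolean)

fun defaultItem(items: List<VolumeSettingItem>): VolumeSettingItem =
    // The item last chosen by the user (or the system default) carries the "selected"
    // mark; if none is marked, fall back to the first item.
    items.firstOrNull { it.selected } ?: items.first()
```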
  • the process ends.
  • the output volume of the audio and video program and the output volume of the voice call are adjusted to 0 at the same time.
  • In some embodiments, when an AI mute instruction input by the user is received in the watching and chatting scene, playback of the sound signal of the voice call is stopped in response to the AI mute instruction, and the text corresponding to the call data is displayed on the upper layer of the playback screen in the form of a barrage. For example, when the display shows the volume setting interface as shown in FIG. 11, the user can select the "AI mute mode" 114 by operating the control device to input the AI mute instruction.
  • the text corresponding to the call data and the user information corresponding to the call data are obtained respectively, and the user information is used to characterize the user account that sends the call data, such as user nickname, user ID, user avatar, etc.
  • the barrage text is generated according to the text corresponding to the call data and the user information; and then the barrage text carrying the user information mark is displayed on the upper layer of the playback screen presented on the display.
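  • Composing the barrage text from the recognized call text and the user information can be sketched as follows; the field names are assumptions, not identifiers from this application.

```kotlin
// Illustrative sketch of generating the barrage text carrying a user-information mark.
data class UserInfo(val nickname: String, val userId: String, val avatarUrl: String)

fun buildBarrageText(callText: String, user: UserInfo): String =
    // The barrage carries a user-information mark so the viewer knows who is speaking.
    "${user.nickname}: $callText"
```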
  • The present invention also provides a computer storage medium. The computer storage medium can store a computer program, and when at least one controller/processor of the display device executes the computer program, the controller/processor performs the volume adjustment method described in the foregoing embodiments.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (English: read-only memory, abbreviated as: ROM) or a random access memory (English: random access memory, abbreviated as: RAM), etc.
  • the technology in the embodiments of the present invention can be implemented by means of software plus a necessary general hardware platform.
  • The technical solutions in the embodiments of the present invention can be embodied in the form of a software product, which can be stored in a storage medium such as a ROM/RAM, a magnetic disk, or an optical disc, and which includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the various embodiments of the present invention or in some parts of the embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A display device and a volume adjustment method are described. The volume adjustment method comprises: when playback of an audio and video program and a voice call are carried out simultaneously, receiving, by a display device, an instruction indicating a volume adjustment, and presenting a volume setting interface on a display, the volume setting interface comprising a volume setting item for associating the output volume of the voice call with the output volume of the audio and video program; and, in response to an operation by which a user selects the volume setting item, adjusting, in an associated manner, the output volume of the audio and video program and the output volume of the voice call.
PCT/CN2020/081417 2020-03-26 2020-03-26 Dispositif d'affichage et procédé de réglage de volume WO2021189358A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/081417 WO2021189358A1 (fr) 2020-03-26 2020-03-26 Dispositif d'affichage et procédé de réglage de volume

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/081417 WO2021189358A1 (fr) 2020-03-26 2020-03-26 Dispositif d'affichage et procédé de réglage de volume

Publications (1)

Publication Number Publication Date
WO2021189358A1 true WO2021189358A1 (fr) 2021-09-30

Family

ID=77890138

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/081417 WO2021189358A1 (fr) 2020-03-26 2020-03-26 Dispositif d'affichage et procédé de réglage de volume

Country Status (1)

Country Link
WO (1) WO2021189358A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103677703A (zh) * 2012-09-18 2014-03-26 联想(北京)有限公司 电子设备及其音量调节方法
CN103686015A (zh) * 2013-12-20 2014-03-26 乐视致新电子科技(天津)有限公司 音量调节方法及系统
CN106803918A (zh) * 2017-03-02 2017-06-06 无锡纽微特科技有限公司 一种视频通话系统及实现方法
CN109683847A (zh) * 2018-12-20 2019-04-26 维沃移动通信有限公司 一种音量调节方法和终端

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114598967A (zh) * 2022-03-03 2022-06-07 合众新能源汽车有限公司 音频策略管理系统、方法、装置及计算机可读介质
CN115002553A (zh) * 2022-04-29 2022-09-02 当趣网络科技(杭州)有限公司 基于同一影视视频边看边聊的方法和系统
WO2023245976A1 (fr) * 2022-06-20 2023-12-28 由我(万安)科技有限公司 Procédé de réglage audio, émetteur bluetooth et support de stockage lisible
CN116737104A (zh) * 2022-09-16 2023-09-12 荣耀终端有限公司 音量调节方法和相关装置
CN116743905A (zh) * 2022-09-30 2023-09-12 荣耀终端有限公司 通话音量控制方法及电子设备
CN116743905B (zh) * 2022-09-30 2024-04-26 荣耀终端有限公司 通话音量控制方法及电子设备

Similar Documents

Publication Publication Date Title
WO2021189358A1 (fr) Dispositif d'affichage et procédé de réglage de volume
CN112073797B (zh) 一种音量调节方法及显示设备
WO2021031629A1 (fr) Appareil d'affichage et procédé d'application d'un bouton multifonction pour dispositif de commande
CN110708581B (zh) 显示设备及呈现多媒体屏保信息的方法
WO2020248680A1 (fr) Procédé et appareil de traitement de données vidéo et dispositif d'affichage
CN111464840B (zh) 显示设备及显示设备屏幕亮度的调节方法
WO2020248681A1 (fr) Dispositif d'affichage et procédé d'affichage des états de commutation bluetooth
WO2021031620A1 (fr) Dispositif d'affichage et procédé de réglage de luminosité de rétroéclairage
WO2021031589A1 (fr) Dispositif d'affichage et procédé de réglage d'espace de gamme dynamique de couleurs
WO2021031598A1 (fr) Procédé d'ajustement auto-adaptatif pour la position d'une fenêtre de dialogue en ligne vidéo, et dispositif d'affichage
WO2020248697A1 (fr) Dispositif d'affichage et procédé de traitement des données de communication vidéo
CN112463267B (zh) 在显示设备屏幕上呈现屏保信息的方法及显示设备
CN113448529B (zh) 显示设备和音量调节方法
WO2020248699A1 (fr) Procédé de traitement du son et appareil d'affichage
WO2020248654A1 (fr) Appareil d'affichage et procéder pour afficher des applications de façon conjointe
WO2020248790A1 (fr) Procédé de commande vocale et dispositif d'affichage
WO2021169125A1 (fr) Dispositif d'affichage et procédé de commande
CN112073777B (zh) 一种语音交互方法及显示设备
CN112073666B (zh) 一种显示设备的电源控制方法及显示设备
CN112073812B (zh) 一种智能电视上的应用管理方法及显示设备
CN112073808B (zh) 一种色彩空间切换方法及显示装置
CN112073803B (zh) 一种声音再现方法及显示设备
CN112073759B (zh) 双系统之间通信方式的选取及调度方法、装置及显示设备
CN113448530A (zh) 显示设备和音量控制方法
CN112073773A (zh) 一种屏幕互动方法、装置及显示设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20926959

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20926959

Country of ref document: EP

Kind code of ref document: A1