WO2022078065A1 - Method for playing resources of a display device, and display device


Info

Publication number
WO2022078065A1
Authority
WO
WIPO (PCT)
Prior art keywords
display device
mode
audio
earphone
user
Prior art date
Application number
PCT/CN2021/113742
Other languages
English (en)
Chinese (zh)
Inventor
王丽娟
魏强
Original Assignee
青岛海信传媒网络技术有限公司
Priority date
Filing date
Publication date
Application filed by 青岛海信传媒网络技术有限公司
Publication of WO2022078065A1


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 — Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/16 — Sound input; Sound output
    • G06F 3/162 — Interface to dedicated audio devices, e.g. audio drivers, interface to CODECs

Definitions

  • the present application relates to the field of display technology, and in particular, to a method for playing resources of a display device and a display device.
  • the resources played by the display device include video resources and audio resources.
  • the video resources and the audio resources are played synchronously, so that the user can hear the sound of things in the video while watching the video resources.
  • the resource can also be expressed as a program of the display device.
  • Each program has corresponding PSI (Program Specific Information), which includes a PMT (Program Map Table); the PMT carries the audio information, video information, and other program-specific details describing the program.
  • the PMT also contains narration information, which conveys the visual image in concise and vivid language, and can help blind or visually impaired users perceive the imagery presented by the video resource.
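As an illustration of how a narration (audio-description) stream might be located in a program's PMT, the sketch below checks each elementary stream for an ISO 639 language descriptor whose `audio_type` marks "visual impaired commentary". The dict layout and field names here are assumptions for this example, not structures defined by the application.

```python
# Illustrative sketch: find narration audio streams in a parsed PMT.
# In MPEG-2/DVB streams, an ISO 639 language descriptor (tag 0x0A) with
# audio_type == 3 conventionally signals visual-impaired commentary.

AUDIO_TYPE_VISUAL_IMPAIRED = 3  # per the ISO 639 language descriptor

def find_narration_streams(pmt_streams):
    """Return the PIDs of elementary streams flagged as narration audio.

    `pmt_streams` is assumed to be a list of dicts such as:
      {"pid": 0x102, "stream_type": 0x03,
       "descriptors": [{"tag": 0x0A, "language": "eng", "audio_type": 3}]}
    """
    narration_pids = []
    for stream in pmt_streams:
        for desc in stream.get("descriptors", []):
            if desc.get("tag") == 0x0A and desc.get("audio_type") == AUDIO_TYPE_VISUAL_IMPAIRED:
                narration_pids.append(stream["pid"])
    return narration_pids
```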
  • the present application provides a resource playback method for a display device, and a display device.
  • the present application provides a display device, including: a display, configured to display the video resource in a resource played by the display device; and a controller, configured to: detect whether the display device is connected to an earphone; if the earphone is connected to the display device, detect the current earphone mode of the display device, where the earphone mode indicates which audio output device the display device uses to play the audio resource in the resource; if the earphone mode is earphone-and-speaker, detect the audio mode of the display device, where the audio mode indicates whether the resource played by the display device is in the visually impaired mode, and the visually impaired mode indicates that the resource played by the display device contains narration information; if the audio mode is the visually impaired mode, detect the narration output mode of the display device, where the narration output mode indicates which audio output device the display device uses to play the narration information in the resource; and select, according to the earphone mode and the narration output mode, the corresponding audio output device to play the audio resource and the narration information in the resource.
  • the present application also provides a method for playing a resource of a display device, including: detecting whether the display device is connected to an earphone; if the display device is connected to the earphone, detecting the current earphone mode of the display device, where the earphone mode indicates which audio output device the display device uses to play the audio resource in the resource; if the earphone mode is earphone-and-speaker, detecting the audio mode of the display device, where the audio mode indicates whether the resource played by the display device is in the visually impaired mode, and the visually impaired mode indicates that the resource played by the display device contains narration information; if the audio mode is the visually impaired mode, detecting the narration output mode of the display device, where the narration output mode indicates which audio output device the display device uses to play the narration information in the resource; and selecting, according to the earphone mode and the narration output mode, the corresponding audio output device to play the audio resource and the narration information in the resource.
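The decision flow described above can be sketched as follows. The mode strings and device names (`"earphone_and_speaker"`, `"speaker"`, and so on) and the function signature are assumptions made for this illustration, not identifiers from the application.

```python
# Sketch of the selection flow: earphone connection -> earphone mode ->
# audio mode -> narration output mode -> chosen output devices.

def select_outputs(headset_connected, earphone_mode, audio_mode, narration_mode):
    """Return (main_audio_device, narration_device_or_None).

    earphone_mode:  "earphone_only" or "earphone_and_speaker"
    audio_mode:     "normal" or "visually_impaired"
    narration_mode: "earphone", "speaker", or "both"
    """
    if not headset_connected:
        # No earphone: everything plays on the speaker.
        narration = "speaker" if audio_mode == "visually_impaired" else None
        return ("speaker", narration)
    if earphone_mode != "earphone_and_speaker":
        # Earphone-only mode: audio and any narration go to the earphone.
        narration = "earphone" if audio_mode == "visually_impaired" else None
        return ("earphone", narration)
    if audio_mode != "visually_impaired":
        # No narration stream to route: play audio on both devices.
        return ("earphone_and_speaker", None)
    # Earphone + speaker with narration: route per the narration output mode.
    return ("earphone_and_speaker", narration_mode)
```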
  • FIG. 1 exemplarily shows a schematic diagram of an operation scene between a display device and a control apparatus according to some embodiments.
  • FIG. 2 exemplarily shows a hardware configuration block diagram of a display device 200 according to some embodiments.
  • FIG. 3 exemplarily shows a hardware configuration block diagram of the control apparatus 100 according to some embodiments.
  • FIG. 4 exemplarily shows a schematic diagram of software configuration in the display device 200 according to some embodiments.
  • FIG. 5 exemplarily shows a schematic diagram of displaying an icon control interface of an application in the display device 200 according to some embodiments.
  • FIG. 6 is a schematic diagram of a setting interface shown in an embodiment of the application.
  • FIG. 7 is a schematic diagram of an audio mode interface shown in an embodiment of the application.
  • FIG. 8 is a schematic diagram of a second audio mode interface shown in an embodiment of the application.
  • FIG. 9 is a schematic diagram of a third audio mode interface shown in an embodiment of the application.
  • FIG. 10 is a schematic diagram of a fourth audio mode interface shown in an embodiment of the application.
  • FIG. 11 is a schematic diagram of a fifth audio mode interface shown in an embodiment of the application.
  • FIG. 12 is a schematic diagram of a second setting interface shown in an embodiment of the application.
  • FIG. 13 is a control flow chart of a controller 250 shown in an embodiment of the application.
  • FIG. 14 is a flowchart of a method for playing resources on a display device according to an embodiment of the application.
  • remote control refers to a component of an electronic device, such as the display device disclosed in this application, that can wirelessly control the electronic device, usually over a short distance.
  • the remote control typically connects to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
  • a hand-held touch remote control replaces most of the physical built-in hard keys in a general remote control device with a user interface in a touch screen.
  • gesture, as used in this application, refers to a user behavior that expresses an intended idea, action, purpose, and/or result through an action such as a change of hand shape or a hand movement.
  • FIG. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment.
  • a user may operate the display apparatus 200 through the mobile terminal 300 and the control apparatus 100 .
  • the control device 100 may be a remote controller; communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication methods, and the remote controller controls the display device 200 wirelessly or by other wired methods.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
  • the user can realize control functions of the display device 200 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power on/off key on the remote control.
  • mobile terminals, tablet computers, computers, notebook computers, and other smart devices may also be used to control the display device 200 .
  • the display device 200 is controlled using an application running on the smart device.
  • the app can be configured to provide users with various controls in an intuitive user interface (UI) on the screen associated with the smart device.
  • the display device may not use the above-mentioned smart device or control device to receive instructions, but receive user control through touch or gesture.
  • the mobile terminal 300 and the display device 200 may each install a software application, implementing connection and communication through a network communication protocol, so as to achieve one-to-one control operation and data communication.
  • a control command protocol can be established between the mobile terminal 300 and the display device 200
  • the remote control keyboard can be synchronized to the mobile terminal 300
  • the function of controlling the display device 200 can be realized by controlling the user interface on the mobile terminal 300.
  • the audio and video content displayed on the mobile terminal 300 may also be transmitted to the display device 200 to implement a synchronous display function.
  • the display device 200 also performs data communication with the server 400 through various communication methods.
  • the display device 200 may communicate via a local area network (LAN), a wireless local area network (WLAN), or other networks.
  • the server 400 may provide various contents and interactions to the display device 200 .
  • the display device 200 interacts with the server by sending and receiving information, for example interacting with an electronic program guide (EPG), receiving software program updates, or accessing a remotely stored digital media library.
  • the server 400 may be a cluster or multiple clusters, and other network service contents such as video-on-demand and advertising services are provided through the server 400 .
  • the display device 200 may be a liquid crystal display, an OLED display, or a projection display device.
  • the specific display device type, size and resolution are not limited. Those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as required.
  • in addition to the broadcast-receiving TV function, the display device 200 may additionally provide a smart network TV function with computer support, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
  • FIG. 2 exemplarily shows a block diagram of the hardware configuration of the display device 200 according to the exemplary embodiment.
  • the display device 200 includes a controller 250, a tuner 210, a communicator 220, a detector 230, an input/output interface 255, a display 275, an audio output interface 285, a memory 260, a power supply 290, At least one of the user interface 265 and the external device interface 240 .
  • the display 275 is used for receiving the image signal output by the first processor, and for displaying video content, images, and a menu manipulation interface.
  • the display 275 includes a display screen component for presenting pictures, and a driving component for driving image display.
  • display 275 is used to present a user-manipulated UI interface generated in display device 200 and used to control display device 200 .
  • a driving component for driving the display is also included.
  • display 275 is a projection display, and may also include a projection device and projection screen.
  • the communicator 220 is a component used to communicate with external devices or external servers according to various communication protocol types.
  • the communicator 220 may include at least one of a WiFi module 221, a Bluetooth module 222, a wired Ethernet module 223, other network communication protocol modules or near-field communication protocol modules, and an infrared receiver.
  • the display apparatus 200 may establish transmission and reception of control signals and data signals with the external control apparatus 100 or a content providing apparatus through the communicator 220.
  • the user interface 265 can be used to receive infrared control signals from the control device 100 (eg, an infrared remote control, etc.).
  • the detector 230 may be used for the display device 200 to collect external environment or external interaction signals.
  • the detector 230 includes a light receiver, a sensor for collecting the intensity of ambient light, so that display parameters and the like can be adaptively adjusted according to the collected ambient light.
  • the detector 230 may also include an image collector 232, such as a camera, which can be used to collect external environment scenes and to collect user attributes or gestures for interaction with the user, so that display parameters can be adaptively changed and user gestures recognized, realizing interaction with the user.
  • the detector 230 may further include a sound collector 231 or the like, such as a microphone, which may be used to receive the user's voice.
  • for example, to receive a voice signal containing a user control instruction for the display device 200, or to collect ambient sound for identifying the type of environmental scene, so that the display device 200 can adapt to the ambient noise.
  • the input/output interface 255 is configured to enable data transfer between the controller 250 and other external devices or other controllers 250, such as receiving video signal data, audio signal data, or command instruction data from external equipment.
  • the external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like. A composite input/output interface may also be formed from a plurality of the above interfaces.
  • the tuner and demodulator 210 is configured to receive broadcast television signals by wired or wireless reception, perform processing such as amplification, frequency mixing, and resonance, and demodulate the audio and video signals from among the multiple broadcast television signals received.
  • the audio and video signal may include the TV audio and video signal carried in the frequency of the TV channel selected by the user, and the EPG data signal.
  • the controller 250 and the tuner 210 may be located in different separate devices; that is, the tuner 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box, etc.
  • the set-top box outputs the modulated and demodulated television audio and video signals of the received broadcast television signals to the main device, and the main device receives the audio and video signals through the first input/output interface.
  • the controller 250 controls the operation of the display device and responds to user operations.
  • the controller 250 may control the overall operation of the display apparatus 200 .
  • the controller 250 may perform an operation related to the object selected by the user command.
  • the controller 250 includes a random access memory 251 (Random Access Memory, RAM), a read-only memory 252 (Read-Only Memory, ROM), a graphics processor 253 (Graphics Processing Unit, GPU), a central processing unit At least one of a processor 254 (Central Processing Unit, CPU), an input/output interface 255 and a communication bus 256 (Bus).
  • the communication bus connects the various components.
  • RAM 251 is used to store temporary data for the operating system or other running programs.
  • ROM 252 is used to store various system startup instructions.
  • ROM 252 is used to store a basic input output system, called a Basic Input Output System (BIOS). It is used to complete the power-on self-check of the system, the initialization of each functional module in the system, the driver program of the basic input/output of the system, and the boot operating system.
  • when the display device 200 is powered on, the CPU executes the system startup instructions in the ROM 252 and copies the temporary data of the operating system stored in the memory into the RAM 251, so as to start or run the operating system.
  • after the operating system has started, the CPU copies the temporary data of the various application programs in the memory into the RAM 251, so as to start or run the various application programs.
  • the processor 254 executes operating system and application program instructions stored in memory, and executes the various application programs, data, and content according to the interactive instructions received from external input, so as to finally display and play various audio and video content.
  • processor 254 may include multiple processors.
  • the plurality of processors may include a main processor and one or more sub-processors.
  • the main processor is used to perform some operations of the display device 200 in the pre-power-on mode, and/or an operation of displaying a picture in the normal mode.
  • the one or more sub-processors are used for operations in states such as standby mode.
  • the graphics processor 253 is used to generate various graphic objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an operator, which performs operations by receiving the various interactive instructions input by the user and displays the various objects according to their display properties, and a renderer, which renders the objects produced by the operator for display on the display.
  • the video processor 270 is configured to receive an external video signal, perform video processing according to the standard codec protocol of the input signal, and obtain a signal directly displayed or played on the display device 200 .
  • the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • the graphics processor 253 may be integrated with the video processor, or may be configured separately.
  • when integrated, the processing of the graphics signal output to the display may be performed jointly.
  • when configured separately, different functions may be performed respectively, for example in a GPU + FRC (Frame Rate Conversion) architecture.
  • the audio processor 280 is configured to receive an external audio signal and, according to the standard codec protocol of the input signal, perform decompression and decoding as well as noise reduction, digital-to-analog conversion, and amplification processing, to obtain a sound signal that can be played in the speaker.
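The processing chain just described (decompression/decoding, noise reduction, amplification) can be illustrated with stub stages. Each function below is a toy stand-in for the real DSP step, not an actual codec or noise-reduction implementation.

```python
# Toy illustration of the audio-processing chain of the audio processor 280.

def decode(compressed):
    # Stand-in for standard-codec decompression: 16-bit ints -> float PCM.
    return [s / 32768.0 for s in compressed]

def denoise(samples, floor=0.001):
    # Crude noise gate standing in for real noise reduction.
    return [s if abs(s) >= floor else 0.0 for s in samples]

def amplify(samples, gain):
    # Amplification stage before the signal reaches the speaker.
    return [s * gain for s in samples]

def audio_pipeline(compressed, gain=2.0):
    """Chain the stages in the order described: decode -> denoise -> amplify."""
    return amplify(denoise(decode(compressed)), gain)
```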
  • the video processor 270 may comprise one or more chips.
  • the audio processor may also include one or more chips.
  • the video processor 270 and the audio processor 280 may be separate chips, or may be integrated into one or more chips together with the controller.
  • the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280, for example via the speaker 286; in addition to the speaker carried by the display device 200 itself, the sound can be output to an external audio output terminal of an external device, such as an external audio interface or an earphone interface, and the communication interface may also include a short-range communication module, such as a Bluetooth module for outputting sound to a Bluetooth speaker.
  • the power supply 290, under the control of the controller 250, provides power to the display device 200 from an external power input.
  • the power supply 290 may include a built-in power supply circuit installed inside the display device 200, or may be an external power supply, with an external power supply interface provided on the display device 200.
  • the user interface 265 is used for receiving user input signals, and then sending the received user input signals to the controller 250 .
  • the user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
  • the user inputs user commands through the control device 100 or the mobile terminal 300; the user input interface receives the user's input, and the display device 200 responds to it through the controller 250.
  • the user may input user commands on a graphical user interface (GUI) displayed on the display 275, and the user input interface receives the user input commands through the graphical user interface (GUI).
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
  • the memory 260 stores various software modules for driving the display device 200.
  • various software modules stored in the first memory include at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
  • the basic module is used for signal communication between various hardwares in the display device 200, and is a low-level software module that sends processing and control signals to the upper-layer module.
  • FIG. 3 exemplarily shows a configuration block diagram of the control apparatus 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110 , a communication interface 130 , a user input/output interface 140 , a memory 190 , and a power supply 180 .
  • the control apparatus 100 is configured to control the display device 200 , and can receive the user's input operation instructions, and convert the operation instructions into instructions that the display device 200 can recognize and respond to, so as to play an interactive intermediary role between the user and the display device 200 .
  • the user operates the channel addition and subtraction keys on the control device 100, and the display device 200 responds to the channel addition and subtraction operation.
  • control apparatus 100 may be a smart device.
  • control apparatus 100 may install various applications for controlling the display device 200 according to user requirements.
  • the mobile terminal 300 or other intelligent electronic device can perform a similar function of the control apparatus 100 after installing the application for operating the display device 200 .
  • by installing the application, the user can use the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or other intelligent electronic devices to realize the functions of the physical keys of the control apparatus 100.
  • the controller 110 includes a processor 112, a RAM 113, and a ROM 114.
  • the controller is used to control the running and operation of the control device 100, the communication and cooperation among its internal components, and external and internal data processing functions.
  • the communication interface 130 realizes the communication of control signals and data signals with the display device 200 under the control of the controller 110 .
  • the received user input signal is sent to the display device 200 .
  • the communication interface 130 may include at least one of the WiFi chip 131, the Bluetooth module 132, the NFC module 133 and other near field communication modules.
  • in the user input/output interface 140, the input interface includes at least one of a microphone 141, a touch panel 142, a sensor 143, a key 144, and other input interfaces.
  • the user can implement the user command input function through actions such as voice, touch, gesture, pressing, etc.
  • the input interface converts the received analog signal into a digital signal, and converts the digital signal into a corresponding command signal, and sends it to the display device 200.
  • the output interface includes an interface for transmitting received user instructions to the display device 200 .
  • it can be an infrared interface or a radio frequency interface.
  • when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol, and sent to the display device 200 through the infrared sending module.
  • when a radio frequency signal interface is used, the user input command needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 by the radio frequency transmitting terminal.
  • control device 100 includes at least one of a communication interface 130 and an input-output interface 140 .
  • the control device 100 is configured with a communication interface 130, such as modules such as WiFi, Bluetooth, NFC, etc., which can send user input instructions to the display device 200 through WiFi protocol, Bluetooth protocol, or NFC protocol encoding.
  • the memory 190 is used to store various operating programs, data, and applications for driving and controlling the control device 100, under the control of the controller.
  • the memory 190 can store various control signal instructions input by the user.
  • the power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller; it may be a battery and its related control circuit.
  • the system is divided into four layers; from top to bottom, they are the application layer (referred to as the "application layer"), the application framework layer (referred to as the "framework layer"), the Android runtime and system library layer (referred to as the "system runtime layer"), and the kernel layer.
  • at least one application program runs in the application layer; these application programs may be a window program, a system setting program, a clock program, a camera application, or the like built into the operating system, or they may be applications developed by third-party developers.
  • the framework layer provides an application programming interface (API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer is equivalent to a processing center, which decides the actions of the applications in the application layer.
  • the application program can access the resources in the system and obtain the services of the system during execution through the API interface.
  • the application framework layer in the embodiments of the present application includes managers (Managers), content providers (Content Providers), and the like, where the managers include at least one of the following modules: an Activity Manager, used to interact with all activities running in the system; a Location Manager, used to provide system services or applications with access to the system location service; a Package Manager, used to retrieve various information related to the application packages currently installed on the device; a Notification Manager, used to control the display and clearing of notification messages; and a Window Manager, used to manage the icons, windows, toolbars, wallpapers, and desktop widgets on the user interface.
  • the activity manager is used to manage the life cycle of each application and the usual navigation and back functions, such as controlling the exit of an application (including switching the user interface currently displayed in the display window to the system desktop), opening an application, and going back (including switching the user interface currently displayed in the display window to the upper-level user interface of the current one), and the like.
  • the window manager is used to manage all window programs, such as obtaining the size of the display screen, judging whether there is a status bar, locking the screen, taking screenshots, and controlling changes to the display window (for example, shrinking the display window, shaking the display, distorting the display, etc.), and so on.
  • the system runtime layer provides support for the upper layer, that is, the framework layer.
  • the Android operating system will run the C/C++ library included in the system runtime layer to implement the functions to be implemented by the framework layer.
  • the kernel layer is the layer between hardware and software. As shown in Figure 4, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and so on.
  • the display device receives an input operation (such as a split-screen operation) performed by the user on the display screen, and the kernel layer can generate a corresponding input event according to the input operation and report the event to the application framework layer.
  • the window mode (such as multi-window mode) and window position and size corresponding to the input operation are set by the activity manager of the application framework layer.
  • the window manager of the application framework layer draws the window according to the settings of the activity manager and then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the corresponding application interfaces in different display areas of the display screen.
  • the application layer contains at least one application that can display corresponding icon controls on the display, such as: a live TV application icon control, a video-on-demand application icon control, a media center application icon control, an application center icon control, a game application icon control, etc.
  • the live TV application may provide live TV from different sources.
  • a live TV application may provide a TV signal using input from cable, over-the-air, satellite services, or other types of live TV services.
  • the live TV application may display the video of the live TV signal on the display device 200 .
  • a video-on-demand application may provide video from various storage sources. Unlike live TV applications, video-on-demand provides display of video from certain storage sources. For example, video-on-demand can come from the server side of cloud storage or from local hard disk storage containing existing video programs.
  • the media center application may provide various multimedia content playback applications.
  • a media center may provide services other than live TV or video-on-demand, where users can access various images or audio through a media center application.
  • the application center may provide storage of various applications.
  • An application may be a game, a utility, or some other application that is related to a computer system or other device but can be run on a smart TV.
  • the application center can obtain these applications from various sources, store them in local storage, and then run them on the display device 200 .
  • the resources played by the display device 200 include video resources and audio resources.
  • the video resources and the audio resources are played synchronously, so that the user can hear the sounds of things in the video while watching the video resources.
  • the resource can also be expressed as a program of the display device.
  • Each program has its corresponding PSI (Program Specific Information), which includes a PMT (Program Map Table); the PSI describes the program's audio information, video information, and other program-specific information.
  • the PMT also contains narration information, which conveys the visual imagery of the video in concise and vivid language and can help blind or visually impaired users perceive the images presented by the video resource.
  • for users with normal vision, narration information is usually not needed to assist in perceiving visual images; outputting narration information can interfere with their experience of the background sounds in the audio resource (such as bird calls or rain) and affect their authentic experience of the pictures of the video resource.
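The program-information structure described above can be sketched in a minimal model. This is an illustrative Python sketch only, not actual MPEG-TS parsing: the class and field names (`ElementaryStream`, `kind`, the `"narration"` tag) are assumptions introduced for illustration, whereas real narration (audio description) streams are signalled by PMT descriptors.

```python
from dataclasses import dataclass, field

# Illustrative model only: real PSI/PMT data is carried in MPEG-TS tables,
# and narration streams are signalled by descriptors, not a "kind" string.

@dataclass
class ElementaryStream:
    pid: int     # packet identifier of the stream
    kind: str    # "video", "audio", or "narration" (illustrative tags)

@dataclass
class ProgramMapTable:
    program_number: int
    streams: list = field(default_factory=list)

    def narration_streams(self):
        """Return the PIDs of streams carrying narration information."""
        return [s.pid for s in self.streams if s.kind == "narration"]

pmt = ProgramMapTable(1, [
    ElementaryStream(0x100, "video"),
    ElementaryStream(0x101, "audio"),
    ElementaryStream(0x102, "narration"),
])
print(pmt.narration_streams())  # [258]
```

A player could use such a lookup to decide whether a narration track exists before offering the visually impaired mode at all.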
  • the embodiments of the present application provide a display device resource playback method and a display device, which can select a corresponding audio output device to play audio resources and narration information according to the headphone mode, the audio mode when the display device 200 is connected to headphones, and the narration output mode in the visually impaired mode. This can satisfy the resource playback requirements of visually impaired users and sighted users at the same time: when visually impaired users and sighted users watch resources together, the narration information provided for visually impaired users will not affect the audio content heard by sighted users, so the resource viewing experience of different users is guaranteed.
  • the display device 200 provided in this embodiment of the present application may include at least a display 275 and a controller 250 .
  • the display 275 can display the video resources in the resources played by the display device 200, and the controller 250 can obtain the current peripheral connection states and mode settings of the display device 200, control the display 275 to perform the corresponding mode display, and control the output of audio resources and narration information.
  • the peripheral connection state of the display device 200 includes the connection state of earphones, and the controller 250 needs to determine whether the display device 200 is connected to an earphone device in addition to its own speaker. Whether to connect headphones is for users to choose according to their own needs.
  • when a headset is connected, the audio resources can be played in two ways: through the headset alone, or through the headset and the speaker at the same time; when no headset is connected, the audio resources can only be played through the speaker of the display device 200.
  • FIG. 6 is a schematic diagram of a setting interface according to an embodiment of the present application.
  • when the controller 250 detects that the display device 200 is connected to a headset, the headset mode on the setting interface shown in FIG. 6 needs to be set to a highlighted, selectable state, which then allows the user to choose how the audio resource is played.
  • the earphone mode options specifically include "earphone" and "earphone and speaker". When the earphone mode is "earphone", the user has chosen to play audio resources through the earphone only; when the earphone mode is "earphone and speaker", the user has chosen to play audio resources through the earphone and the speaker at the same time.
  • after the user selects a specific headphone mode, the controller 250 detects the current headphone mode and then determines in what manner to play the audio resource.
  • in this case, the "earphone and speaker" mode is usually selected, so that different users can listen to audio resources through different devices without interfering with each other.
  • the audio mode of the display device 200 needs to be selected as the visually impaired mode, so that the visually impaired user can listen to the narration information in the resource.
  • FIG. 7 is a schematic diagram of an audio mode interface according to an embodiment of the present application.
  • the controller 250 needs to adjust the audio mode on the audio mode interface to the visually impaired mode.
  • the narration output mode will become a highlighted, selectable mode, and the user can then select a specific narration output mode.
  • otherwise, the narration output mode on the audio mode interface is grayed out and not selectable, as shown in FIG. , and the speaker of the display device 200 outputs the narration information.
  • the display device 200 can currently play audio through earphones and speakers.
  • the user needs to choose which device to use to play the narration information, that is, select the narration output mode.
  • the narration output mode can be selected as "earphone", so that visually impaired users can listen to audio resources and narration information at the same time through headphones, while sighted users listen only to the audio resources through the speaker of the display device 200.
  • after the controller 250 detects the user's selection, it needs to adjust the narration output mode on the audio mode interface to "earphone", as shown in FIG. 7; at this time, the earphone plays the audio resource and the narration information at the same time, and the speaker plays only the audio resource.
  • the narration output mode can be selected as "speaker", so that visually impaired users can listen to audio resources and narration information at the same time through the speaker of the display device 200, while sighted users listen only to the audio resources through the headphones.
  • after the controller 250 detects the user's selection, it needs to adjust the narration output mode on the audio mode interface to "speaker", as shown in FIG. 9; at this time, the speaker plays the audio resource and the narration information at the same time, and the earphone plays only the audio resource.
  • if the user selects the narration output mode as "all", both the visually impaired user and the sighted user can listen to the audio resources and narration information through either headphones or speakers. In this case, after the controller 250 detects the user's selection, it needs to adjust the narration output mode on the audio mode interface to "all", as shown in FIG.
  • the solutions of the embodiments of the present application can meet the resource playback requirements of both visually impaired users and normal vision users.
  • visually impaired users can use headphones alone to listen to audio resources and narration information, while sighted users can use the speaker of the display device 200 to listen to the audio resources; or visually impaired users can use the speaker of the display device 200 to listen to audio resources and narration information, while sighted users can use headphones alone to listen to the audio resources.
  • the narration information provided for visually impaired users will not affect the audio content heard by sighted users, and the resource appreciation experience of different users can be guaranteed at the same time.
  • if the controller 250 detects that the display device 200 is connected to an earphone, the earphone mode is "earphone and speaker", and the audio mode is the normal mode, the users viewing the display device 200 do not need to listen to narration information. The controller 250 then needs to adjust the audio mode on the audio mode interface to the normal mode, as shown in FIG. 11, and selects the earphone and the speaker to play the audio resources at the same time without playing the narration information.
  • if the controller 250 detects that the display device 200 is connected to an earphone and the earphone mode is "earphone", the user has chosen to play audio resources through the earphone only, and the speaker of the display device 200 has no sound output. At this time, the controller 250 needs to adjust the headset mode on the setting interface shown in FIG. 6 to "earphone"; the adjusted setting interface is shown in FIG. 12. In this case, the narration output mode on the audio mode interface becomes a grayed-out, non-selectable state, and regardless of whether the controller 250 continues to detect that the audio mode is the normal mode or the visually impaired mode, audio will only be played through the headphones. When the audio mode is the normal mode, the controller 250 selects the earphone to play only the audio resource; when the audio mode is the visually impaired mode, the controller 250 selects the earphone to play the audio resource and the narration information at the same time.
  • if the controller 250 detects that the display device 200 is not connected to an earphone, the user has chosen to play the audio resource only through the speaker of the display device 200. At this time, the controller 250 needs to gray out the headphone mode on the setting interface shown in FIG. 6 and adjust it to an unavailable state to indicate that no headphones are connected. In this case, the narration output mode on the audio mode interface will also become a grayed-out, non-selectable state, and regardless of whether the controller 250 continues to detect that the audio mode is the normal mode or the visually impaired mode, audio will only be played through the speaker. When the audio mode is the normal mode, the controller 250 selects the speaker of the display device 200 to play only the audio resource; when the audio mode is the visually impaired mode, the controller 250 selects the speaker of the display device 200 to play the audio resource and the narration information at the same time.
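The graying-out rules described in the embodiments above (the headphone mode is selectable only when a headset is connected; the narration output mode is selectable only in the "earphone and speaker" mode with the visually impaired audio mode active) can be condensed into two predicates. This is a hedged sketch; the function names and mode strings are hypothetical, not taken from the patent's implementation.

```python
def headphone_mode_selectable(headphones_connected: bool) -> bool:
    # Setting interface (FIG. 6): the headphone-mode option is grayed out
    # and unavailable when no headset is attached.
    return headphones_connected

def narration_output_mode_selectable(headphones_connected: bool,
                                     earphone_mode: str,
                                     audio_mode: str) -> bool:
    # Audio mode interface (FIG. 7): the narration output mode is selectable
    # only when audio can reach both the headset and the speaker and the
    # visually impaired mode is active; otherwise it is grayed out.
    return (headphones_connected
            and earphone_mode == "earphone_and_speaker"
            and audio_mode == "visually_impaired")
```

A UI layer could re-evaluate these predicates whenever the headset connection state or a mode setting changes.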
  • the solutions in the above embodiments of the present application can be summarized as shown in FIG. 13.
  • the controller 250 performs different processing procedures depending on whether the display device 200 is connected to an earphone; for different earphone modes, the controller 250 also performs different processing procedures; and, within a given headphone mode or when no headphone is connected, the controller 250 performs different processing procedures for different audio modes. It can be seen that when a visually impaired user and a sighted user use the display device 200 at the same time, the solution in the embodiment of the present application can provide both users with resource playback services in various ways according to their needs and selections.
  • the user's selection of the various modes is completed by inputting instructions to the display device 200, and the input methods include but are not limited to remote control input, voice input, etc. Remote control input means that the user indirectly inputs instructions to the display device 200 through the remote controller, and voice input means that the user directly inputs instructions to the display device 200 through voice.
  • FIG. 14 is a flowchart of a method for playing resources on a display device according to an embodiment of the application. As shown in FIG. 14 , the method may be implemented by control components such as the controller 250, and may specifically include the following steps:
  • Step S101 detecting whether the display device 200 is connected to an earphone.
  • Step S102 in the case that the display device 200 is connected to an earphone, the current earphone mode of the display device 200 is detected.
  • Step S103 in the case that the earphone mode is the earphone and the speaker, the audio mode of the display device 200 is detected.
  • Step S104 when the audio mode is the visually impaired mode, the narration output mode of the display device 200 is detected.
  • Step S105: according to the headphone mode and the narration output mode, select the corresponding audio output device to play the audio resources and narration information in the playback resources.
  • in the case that the earphone mode is earphone only, the audio mode of the display device 200 is detected; in the case that the audio mode is the visually impaired mode, the headset is selected to play the audio resources and the narration information.
  • in the case that the earphone mode is earphone only, the audio mode of the display device 200 is detected; when the audio mode is the normal mode, the headset is selected to play only the audio resources.
  • when the display device 200 is not connected to headphones, the audio mode of the display device 200 is detected; when the audio mode is the visually impaired mode, the speaker of the display device 200 is selected to play the audio resources and the narration information.
  • when the display device 200 is not connected to headphones, the audio mode of the display device 200 is detected; when the audio mode is the normal mode, the speaker of the display device 200 is selected to play only the audio resources.
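Steps S101–S105 and the branches above amount to a small decision procedure: first check the headset connection, then the earphone mode, then the audio mode, and finally the narration output mode. The following is a minimal Python sketch under assumed device and mode names (all identifiers here are illustrative, not from the patent's implementation):

```python
def route_playback(headphones_connected, earphone_mode=None,
                   audio_mode="normal", narration_output=None):
    """Return (audio_devices, narration_devices) following the flow of
    FIG. 13 / FIG. 14. narration_output is "earphone", "speaker", or "all"
    and only matters in the "earphone and speaker" visually impaired case."""
    if not headphones_connected:                   # S101: no headset
        audio = {"speaker"}
        narration = {"speaker"} if audio_mode == "visually_impaired" else set()
    elif earphone_mode == "earphone":              # S102: headset only
        audio = {"earphone"}
        narration = {"earphone"} if audio_mode == "visually_impaired" else set()
    else:                                          # "earphone and speaker"
        audio = {"earphone", "speaker"}
        if audio_mode != "visually_impaired":      # S103: normal mode
            narration = set()
        elif narration_output == "all":            # S104/S105
            narration = {"earphone", "speaker"}
        else:
            narration = {narration_output}
    return audio, narration
```

For example, with a headset connected in "earphone and speaker" mode, the visually impaired audio mode, and the narration output set to "earphone", this sketch routes audio to both devices and narration to the headset only, matching the scenario where a visually impaired and a sighted user watch together.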


Abstract

The present invention relates to a display device resource playback method and a display device. According to the present invention, a corresponding audio output device can be selected to play audio resources and narration information according to the earphone mode and the audio mode when a display device is connected to an earphone, and the narration output mode in a visually impaired mode. According to embodiments of the present invention, the resource playback requirements of a visually impaired user and a user with normal vision can be satisfied at the same time; when the visually impaired user and the user with normal vision watch resources at the same time, the visually impaired user can listen to the audio resources and narration information using the earphone alone while the user with normal vision listens to the audio resources using a speaker of the display device; or the visually impaired user can listen to the audio resources and narration information using the speaker of the display device while the user with normal vision listens to the audio resources using the earphone alone.
PCT/CN2021/113742 2020-10-12 2021-08-20 Procédé de lecture de ressources de dispositif d'affichage et dispositif d'affichage WO2022078065A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011084672.7A CN112214190A (zh) 2020-10-12 2020-10-12 显示设备资源播放方法及显示设备
CN202011084672.7 2020-10-12

Publications (1)

Publication Number Publication Date
WO2022078065A1 true WO2022078065A1 (fr) 2022-04-21

Family

ID=74054517

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/113742 WO2022078065A1 (fr) 2020-10-12 2021-08-20 Procédé de lecture de ressources de dispositif d'affichage et dispositif d'affichage

Country Status (2)

Country Link
CN (1) CN112214190A (fr)
WO (1) WO2022078065A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110390927B (zh) * 2019-06-28 2021-11-23 北京奇艺世纪科技有限公司 音频处理方法、装置、电子设备及计算机可读存储介质
CN112214190A (zh) * 2020-10-12 2021-01-12 青岛海信传媒网络技术有限公司 显示设备资源播放方法及显示设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5896129A (en) * 1996-09-13 1999-04-20 Sony Corporation User friendly passenger interface including audio menuing for the visually impaired and closed captioning for the hearing impaired for an interactive flight entertainment system
CN110390927A (zh) * 2019-06-28 2019-10-29 北京奇艺世纪科技有限公司 音频处理方法、装置、电子设备及计算机可读存储介质
CN110798774A (zh) * 2019-12-10 2020-02-14 深圳市思考力科技有限公司 一种带有旁白讲解功能的盲人观影专用耳机
CN112214190A (zh) * 2020-10-12 2021-01-12 青岛海信传媒网络技术有限公司 显示设备资源播放方法及显示设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5896129A (en) * 1996-09-13 1999-04-20 Sony Corporation User friendly passenger interface including audio menuing for the visually impaired and closed captioning for the hearing impaired for an interactive flight entertainment system
CN110390927A (zh) * 2019-06-28 2019-10-29 北京奇艺世纪科技有限公司 音频处理方法、装置、电子设备及计算机可读存储介质
CN110798774A (zh) * 2019-12-10 2020-02-14 深圳市思考力科技有限公司 一种带有旁白讲解功能的盲人观影专用耳机
CN112214190A (zh) * 2020-10-12 2021-01-12 青岛海信传媒网络技术有限公司 显示设备资源播放方法及显示设备

Also Published As

Publication number Publication date
CN112214190A (zh) 2021-01-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21879112

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18/07/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21879112

Country of ref document: EP

Kind code of ref document: A1