WO2020187050A1 - Display device - Google Patents

Display device

Info

Publication number
WO2020187050A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
display device
environmental sound
control
user
Prior art date
Application number
PCT/CN2020/078061
Other languages
English (en)
Chinese (zh)
Inventor
王之奎
贾其燕
李本友
Original Assignee
海信视像科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 海信视像科技股份有限公司
Publication of WO2020187050A1


Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N 21/41 Structure of client; Structure of client peripherals
                • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
                  • H04N 21/42203 Sound input device, e.g. microphone
                  • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
                    • H04N 21/42206 characterized by hardware details
                      • H04N 21/42221 Transmission circuitry, e.g. infrared [IR] or radio frequency [RF]
    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
            • G06F 1/16 Constructional details or arrangements
              • G06F 1/1601 Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
            • G06F 1/26 Power supply means, e.g. regulation thereof
              • G06F 1/32 Means for saving power
          • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/16 Sound input; Sound output
          • G06F 9/00 Arrangements for program control, e.g. control units
            • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
              • G06F 9/44 Arrangements for executing specific programs
                • G06F 9/4401 Bootstrapping
                  • G06F 9/4418 Suspend and resume; Hibernate and awake

Definitions

  • This application relates to the field of display technology, and in particular to a display device.
  • the display device can perform various interactive functions related to the voice control instructions input by the user.
  • A voice control command from the user can be input through the built-in microphone on the display device.
  • the current display device has two states: standby mode and running mode.
  • In the operating mode, all or most of the services of the display device are turned on, for example the service through which the built-in microphone on the display device collects the user's voice and the service that processes the user's voice, so that voice control commands input by the user can be recognized and responded to.
  • When the display device is in standby mode, all or most services are turned off. At the same time, some services still need to remain powered on in order to recognize and respond to voice control commands input by the user, such as the service that collects the user's voice through the microphone and the service that processes that voice; for example, the user inputs a voice wake-up command to wake the display device from the standby mode into the operating mode. This causes a display device in standby mode to have relatively high standby power consumption.
  • Embodiments of the present application provide a display device capable of controlling its standby power consumption in standby mode.
  • The display device includes an environmental sound detection module, a microphone module, and a chip processing module.
  • The environmental sound detection module is used to detect the level of the environmental sound of the environment in which the display device is located while in standby mode, and to control the microphone module and the chip processing module to turn on or off according to that level.
  • The microphone module is used to collect voice data input by the user.
  • The chip processing module is used to identify, from the voice data collected by the microphone module, the voice wake-up instruction input by the user, and to respond to it by waking the display device into the operating mode.
  • Specifically, the environmental sound detection module controls the microphone module and the chip processing module to turn off when it determines that the level of the environmental sound is not within a preset range, and controls the microphone module and the chip processing module to turn on when it determines that the level is within the preset range.
  • In some implementations, when the microphone module and the chip processing module are turned on, the environmental sound detection module is turned off.
  • The chip processing module is further configured to time how long it has been turned on, and to control the environmental sound detection module to turn on or off according to that duration.
  • Specifically, the chip processing module controls the environmental sound detection module to turn on when it determines that the duration for which it has been on exceeds a set duration, and keeps the environmental sound detection module off when that duration does not exceed the set duration.
  • The display device may further include a SOC module, which is used to notify the chip processing module that the display device has entered the standby mode or the operating mode.
  • The chip processing module is further configured to control the environmental sound detection module to turn on when it receives from the SOC module the notification that the display device has entered the standby mode, and to control the environmental sound detection module to turn off when it receives the notification that the display device has entered the operating mode.
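  • As an illustration of the summary above, the following is a minimal C sketch of which modules stay powered in each mode; the mode enum, the power_state struct, and on_mode_change() are illustrative names and are not defined by the application.

```c
#include <stdbool.h>
#include <stdio.h>

typedef enum { MODE_STANDBY, MODE_RUNNING } device_mode;

typedef struct {
    bool ambient_detector;  /* environmental sound detection module */
    bool microphone;        /* microphone module */
    bool mcu;               /* chip processing module */
    bool soc;               /* SOC module */
} power_state;

/* The SOC notifies the chip processing module of mode changes; in standby the
 * ambient sound detector is turned on and later decides whether the microphone
 * and chip processing module are powered, while in running mode the detector
 * stays off and the other modules stay on. */
static power_state on_mode_change(device_mode mode)
{
    power_state p = {0};
    if (mode == MODE_STANDBY) {
        p.ambient_detector = true;   /* detector gates mic + MCU from here on */
    } else {
        p.microphone = true;
        p.mcu = true;
        p.soc = true;                /* everything on, detector stays off */
    }
    return p;
}

int main(void)
{
    power_state standby = on_mode_change(MODE_STANDBY);
    printf("standby: detector=%d mic=%d mcu=%d soc=%d\n",
           standby.ambient_detector, standby.microphone,
           standby.mcu, standby.soc);
    return 0;
}
```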
  • Fig. 1 exemplarily shows a schematic diagram of an operation scene between a display device and a control device;
  • Fig. 2 exemplarily shows a configuration block diagram of the control device in Fig. 1;
  • Fig. 3 exemplarily shows a configuration block diagram of the display device in Fig. 1;
  • Fig. 4 exemplarily shows another configuration block diagram of the display device in Fig. 1;
  • Fig. 5 exemplarily shows another configuration block diagram of the display device in Fig. 1;
  • Fig. 6 exemplarily shows a block diagram of the circuit configuration of each module in Fig. 5.
  • Although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another. Unless the context clearly indicates otherwise, terms such as "first" and "second" do not imply a sequence or order. Therefore, without departing from the teachings of the exemplary embodiments, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section.
  • Spatially relative terms such as "internal", "external", "below", "beneath", "lower", "above", "upper", etc. may be used herein to describe the relationship between one element or feature shown in the figures and one or more other elements or features.
  • Spatially relative terms are also intended to cover different orientations of the device in use or operation. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the exemplary term "below" can encompass both an orientation of above and below. The device may be otherwise oriented (rotated by 90 degrees or in other directions), and the spatially relative descriptors used herein are interpreted accordingly.
  • Fig. 1 exemplarily shows a schematic diagram of an operation scene between the display device and the control device.
  • the control device 100 and the display device 20 can communicate in a wired or wireless manner.
  • The control device 100 is configured to control the display device 20; it can receive operation instructions input by the user and convert the operation instructions into instructions that the display device 20 can recognize and respond to, acting as an intermediary for the interaction between the user and the display device 20.
  • the user operates the channel control key on the control device 100, and the display device 20 responds to the channel control operation.
  • The control device 100 may be a remote controller 100A, which controls the display device 20 wirelessly, for example through infrared protocol communication, Bluetooth protocol communication, or other short-range communication methods, or through other wired methods.
  • the user can control the display device 20 by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
  • For example, the user can control the functions of the display device 20 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, power on/off keys, etc. on the remote control.
  • the control device 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, a notebook computer, etc.
  • an application program running on a smart device is used to control the display device 20.
  • the application can be configured to provide users with various controls through an intuitive user interface (UI) on the screen associated with the smart device.
  • For example, the mobile terminal 100B and the display device 20 may each install a software application, so that connection and communication can be realized between them through a network communication protocol, achieving one-to-one control operation and data communication.
  • the mobile terminal 100B can establish a control instruction protocol with the display device 20, and by operating various function keys or virtual buttons of the user interface provided on the mobile terminal 100B, the functions of the physical keys arranged in the remote control 100A can be realized.
  • the audio and video content displayed on the mobile terminal 100B can also be transmitted to the display device 20 to realize the synchronous display function.
  • The display device 20 may provide a broadcast receiving function and a network TV function supported by computing capability.
  • Exemplary display devices include digital TV, Internet TV, Internet Protocol TV (IPTV), and so on.
  • the display device 20 may be a liquid crystal display, an organic light emitting display, or a projection device.
  • the specific display device type, size and resolution are not limited.
  • the display device 20 also performs data communication with the server 300 through multiple communication methods.
  • the display device 20 may be allowed to communicate through a local area network (LAN), a wireless local area network (WLAN), and other networks.
  • the server 300 can provide various contents and interactions to the display device 20.
  • the display device 20 can send and receive information, such as receiving electronic program guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library.
  • the server 300 can be one group or multiple groups, and can be one type or multiple types of servers.
  • the server 300 provides other network service content such as video on demand and advertising services.
  • control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
  • the controller 110 includes a random access memory (RAM) 111, a read only memory (ROM) 112, a processor 113, a communication interface, and a communication bus.
  • The controller 110 is used to control the running and operation of the control device 100, the communication and cooperation between internal components, and external and internal data processing functions.
  • For example, when a user interaction is detected, the controller 110 may generate a signal corresponding to the detected interaction and send this signal to the display device 20.
  • the memory 120 is used to store various operating programs, data and applications for driving and controlling the control device 100 under the control of the controller 110.
  • the memory 120 can store various control signal instructions input by the user.
  • the communicator 130 realizes communication of control signals and data signals with the display device 20 under the control of the controller 110.
  • the control device 100 sends a control signal (such as a touch signal or a button signal) to the display device 20 via the communicator 130, and the control device 100 can receive the signal sent by the display device 20 via the communicator 130.
  • the communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132.
  • When the infrared signal interface 131 is used, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and then sent to the display device 20 via the infrared sending module.
  • When the radio frequency signal interface 132 is used, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 20 by the radio frequency sending terminal.
  • The user input interface 140 may include at least one of a microphone 141, a touch panel 142, a sensor 143, a button 144, and the like, so that the user can input user instructions for controlling the display device 20 to the control device 100 through voice, touch, gesture, pressing, and so on.
  • the output interface 150 outputs a user instruction received by the user input interface 140 to the display device 20, or outputs an image or voice signal received by the display device 20.
  • the output interface 150 may include an LED interface 151, a vibration interface 152 that generates vibration, a sound output interface 153 that outputs a sound, a display 154 that outputs an image, and the like.
  • For example, the remote controller 100A can receive output signals such as audio, video, or data from the output interface 150, and present the received output signals as an image on the display 154, as audio at the sound output interface 153, or as vibration at the vibration interface 152.
  • the power supply 160 is used to provide operating power support for each element of the control device 100 under the control of the controller 110.
  • The power supply may take the form of a battery and an associated control circuit.
  • Fig. 3 exemplarily shows a configuration block diagram of the display device.
  • the display device 200 may include a tuner and demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, Audio processor 280, audio input interface 285, and power supply 290.
  • The tuner and demodulator 210 receives broadcast television signals in a wired or wireless manner, can perform modulation and demodulation processing such as amplification, mixing, and resonance, and is used to demodulate, from multiple wireless or cable broadcast television signals, the audio and video signals carried in the frequency of the television channel selected by the user, as well as additional information such as EPG data.
  • The tuner and demodulator 210 can respond, according to the user's selection and under the control of the controller 250, to the frequency of the television channel selected by the user and the television signal carried on that frequency.
  • The tuner and demodulator 210 can receive signals in many ways according to the broadcasting format of the television signal, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or Internet broadcasting; according to the modulation type, it can use digital modulation or analog modulation; and according to the type of television signal received, it can demodulate both analog and digital signals.
  • the tuner demodulator 210 may also be in an external device, such as an external set-top box.
  • the set-top box outputs a television signal after modulation and demodulation, and inputs it to the display device 200 through the external device interface 240.
  • the communicator 220 is a component used to communicate with external devices or external servers according to various communication protocol types.
  • the display device 200 may transmit content data to an external device connected via the communicator 220, or browse and download content data from an external device connected via the communicator 220.
  • The communicator 220 may include a WIFI (Wireless Fidelity) module 221, a Bluetooth communication protocol module 222, a wired Ethernet communication protocol module 223, and other network communication protocol modules or near field communication protocol modules, so that under the control of the controller 250 the communicator 220 can receive the control signal of the control device 100, implemented as a WIFI signal, a Bluetooth signal, a radio frequency signal, or the like.
  • the detector 230 is a component of the display device 200 for collecting external environmental signals or signals interacting with the outside.
  • The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive the user's voice, for example the voice signal of a control instruction by which the user controls the display device 200; it may also collect environmental sound for identifying the type of environmental scene, so that the display device 200 can adjust its audio output according to the environmental sound and thereby adapt to environmental noise.
  • The detector 230 may also include an image collector 232, such as a camera, which may be used to collect external environment scenes so as to adaptively change the display parameters of the display device 200, and to collect attributes of the user or gestures used to interact with the user, so as to realize the interaction function between the display device and the user.
  • the detector 230 may also include a light receiver, which is used to collect the ambient light intensity to adapt to changes in display parameters of the display device 200.
  • the detector 230 may also include a temperature sensor.
  • the display device 200 may adaptively adjust the display color temperature of the image. Exemplarily, when the temperature is relatively high, the color temperature of the displayed image of the display device 200 can be adjusted to be relatively cool; when the temperature is relatively low, the color temperature of the display device 200 can be adjusted to be relatively warm.
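  • As a rough illustration of that adjustment, here is a small sketch with assumed thresholds and Kelvin values; the application only states that the picture is made cooler when the ambient temperature is high and warmer when it is low.

```c
#include <stdio.h>

/* Map an ambient temperature reading to a display colour temperature.
 * The 30/10 degree thresholds and the Kelvin values are illustrative only. */
static int display_color_temp_kelvin(float room_temp_c)
{
    if (room_temp_c >= 30.0f)
        return 9300;   /* warm room: cooler (bluer) image */
    if (room_temp_c <= 10.0f)
        return 5000;   /* cold room: warmer (redder) image */
    return 6500;       /* otherwise keep a neutral default */
}

int main(void)
{
    printf("%d K\n", display_color_temp_kelvin(32.0f));
    return 0;
}
```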
  • The external device interface 240 is a component that enables the controller 250 to control data transmission between the display device 200 and external devices.
  • The external device interface 240 can be connected in a wired or wireless manner to external devices such as set-top boxes, game devices, and notebook computers, and can receive data from these external devices, such as video signals (e.g. moving image data), audio signals (e.g. music data), and additional information (e.g. EPG data).
  • The external device interface 240 may include any one or more of: a high-definition multimedia interface (HDMI) terminal 241, a composite video blanking synchronization (CVBS) terminal 242, an analog or digital component terminal 243, a universal serial bus (USB) terminal 244, a component terminal (not shown in the figure), red, green and blue (RGB) terminals (not shown in the figure), and the like.
  • the controller 250 controls the work of the display device 200 and responds to user operations by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
  • the controller 250 includes a random access memory (RAM) 251, a read only memory (ROM) 252, a graphics processor 253, a processor 254, a communication interface 255, and a communication bus 256.
  • The RAM 251, the ROM 252, the graphics processor 253, the processor 254, and the communication interface 255 are connected through a communication bus 256.
  • The ROM 252 is used to store various system startup instructions. For example, when a power-on signal is received, the display device 200 starts to power up, and the processor 254 runs the system startup instructions in the ROM 252 and copies the operating system stored in the memory 260 to the RAM 251 to start the operating system. After the operating system has started, the processor 254 copies the various application programs in the memory 260 to the RAM 251 and then starts the various application programs.
  • The graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions.
  • The graphics processor 253 may include an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit; the rendered result is displayed on the display 275.
  • The processor 254, such as a CPU, is configured to execute the operating system and application program instructions stored in the memory 260 and, according to received user input instructions, to execute various applications and process data and content, so as to finally display and play various audio and video content.
  • The processor 254 may include multiple processors, for example one main processor and one or more sub-processors.
  • The main processor is configured to perform some initialization operations of the display device 200 in the display device preloading mode and/or to perform display operations in the normal mode; the one or more sub-processors are used to perform operations when the display device is in the standby mode.
  • the communication interface 255 may include the first interface to the nth interface. These interfaces may be network interfaces connected to external devices via a network.
  • the controller 250 may control the overall operation of the display device 200. For example, in response to receiving a user input command for selecting a graphical user interface GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
  • the object can be any one of the selectable objects, such as a hyperlink or an icon.
  • the operation related to the selected object such as the operation of connecting to a hyperlink page, document, image, etc., or an operation of executing a program corresponding to the object.
  • the user input command for selecting the GUI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice spoken by the user.
  • the memory 260 is used to store various types of data, software programs or application programs for driving and controlling the operation of the display device 200.
  • the memory 260 may include volatile and/or nonvolatile memory.
  • the term “storage unit” includes the memory 260, the RAM 251 and ROM 252 of the controller 250, or the memory card in the display device 200.
  • The memory 260 is specifically used to store the operating programs that drive the controller 250 in the display device 200; to store the various application programs built into the display device 200 and downloaded by the user from external devices; and to store data such as the various GUIs provided by the display 275, the various objects related to the GUIs, and the visual effect images of the selector used to select GUI objects.
  • The memory 260 is also specifically used to store the drivers and related data of the tuner and demodulator 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, and so on, as well as external data (such as audio and video data) received from the external device interface and user data (such as key information, voice information, and touch information) received from the user interface.
  • the memory 260 specifically stores software and/or programs used to represent an operating system (OS). These software and/or programs may include, for example: kernel, middleware, application programming interface (API), and/or application.
  • The kernel can control or manage system resources and the functions implemented by other programs (such as the middleware, the API, or the application programs); at the same time, the kernel can provide an interface that allows the middleware, the API, or the application programs to access the controller, so as to control or manage system resources.
  • various software modules stored in the memory 260 may include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
  • the basic module is the underlying software module used to process the signals received by each hardware element in the display device and send the processed signals to the upper application module.
  • the detection module is a management module used to collect various information from various detectors or user interfaces, and perform digital-to-analog conversion, analysis and management of the collected information.
  • the communication module is a module used to communicate control signals and data signals with external devices.
  • the display control module is a module for controlling the display to display image content, and can be used to play multimedia image content and GUI interface information.
  • the browser module is a module used to access the web server by performing web browsing operations.
  • the service module is a module used to provide various services and various applications.
  • the user interface 265 receives various user interactions. Specifically, it is used to send the input signal of the user to the controller 250, or to transmit the output signal from the controller 250 to the user.
  • For example, the remote control 100A may send input signals input by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user interface 265, which then passes them to the controller 250; or the remote control 100A may receive output signals such as audio, video, or data output from the user interface 265 via the controller 250, and display the received output signals or output them as audio or vibration.
  • the user may input a user command on a graphical user interface (GUI) displayed on the display 275, and the user interface 265 receives the user input command through the GUI.
  • the user interface 265 may receive a user input command for controlling the position of the selector in the GUI to select different objects or items.
  • the user may input a user command by inputting a specific voice or gesture, and the user interface 265 recognizes the voice or gesture through a sensor to receive the user input command.
  • The video processor 270 is used to receive external video signals and, according to the standard codec protocol of the input signal, to perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a video signal that is displayed or played directly on the display 275.
  • the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • The demultiplexing module is used to demultiplex the input audio and video data stream, for example an MPEG-2 stream (a compression standard for moving images and audio on digital storage media); the demultiplexing module separates it into a video signal, an audio signal, and so on.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • The image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal generated by the graphics generator, based on user input or generated by the device itself, with the scaled video image, to generate an image signal for display.
  • The frame rate conversion module is used to convert the frame rate of the input video, for example converting the frame rate of an input 60 Hz video to a frame rate of 120 Hz or 240 Hz, usually by means of frame insertion.
  • The display formatting module is used to change the signal output by the frame rate conversion module into a signal conforming to the format of a display device, for example formatting the signal output by the frame rate conversion module to output RGB data signals.
  • the display 275 is used to receive the image signal input from the video processor 270 and display video content, images, and a menu control interface.
  • the video content may be the video content in the broadcast signal received from the tuner and demodulator 210, or the video content input from the communicator 220 or the external device interface 240.
  • the display 275 simultaneously displays a user manipulation interface UI generated in the display device 200 and used for controlling the display device 200.
  • the display 275 may include a display screen component for presenting a picture and a driving component for driving image display.
  • the display 275 may also include a projection device and a projection screen.
  • The audio processor 280 is used to receive external audio signals and, according to the standard codec protocol of the input signal, to perform decompression and decoding, as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played.
  • The audio processor 280 may support various audio formats, such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), and High Efficiency AAC (HE-AAC).
  • the audio output interface 285 is used to receive audio signals output by the audio processor 280 under the control of the controller 250.
  • The audio output interface 285 may include a speaker 286, or an external audio output terminal 287 that outputs to an external device, such as a headphone output terminal.
  • the video processor 270 may include one or more chips.
  • the audio processor 280 may also include one or more chips.
  • the video processor 270 and the audio processor 280 may be separate chips, or may be integrated with the controller 250 in one or more chips.
  • the power supply 290 is used for supplying power to the display device 200 with power input from an external power supply under the control of the controller 250.
  • the power supply 290 may be a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200.
  • Fig. 4 exemplarily shows another configuration block diagram of the display device.
  • the display device 200' may include a power board 21, a main board 23, and a microphone module 25. These components can correspond to some components in FIG. 3 to a certain extent.
  • the power board 21 is used to supply power to components such as the main board 23 and the microphone module 25.
  • the power supply board 21 may correspond to the power supply 290 in FIG. 3.
  • The main board 23 is used to process the various signals in the display device. For example, it processes and responds to the sound signal collected by the microphone module 25, such as converting the sound signal into a voice control command; the radio frequency signal input from the tuner demodulator 210, the digital signals input from the HDMI terminal 241 and the USB terminal 244, the component signal input from the component interface 243, and other signals input from the external device interface 240 undergo format conversion processing to generate a unified signal that the display 275 can recognize, such as a low-voltage differential signal; and the audio signal input from the external device interface 240 undergoes volume control and sound effect processing and is then output to the audio output interface 285.
  • the motherboard 23 may include components such as the external device interface 240, the controller 250, the video processor 270, the audio processor 280, the audio output interface 285, the memory 260, and various integrated circuits in FIG. 3.
  • The controller 250 on the main board 23 may be implemented as a SOC (System on Chip) module 23a and an MCU (Micro Control Unit) module 23b.
  • the SOC module 23a and the MCU module 23b can be integrated together or separated.
  • the MCU module 23b in the embodiment of the present application is a chip processing module.
  • the microphone module 25 is used to collect the user's voice or collect environmental sounds for identifying environmental scenes.
  • The microphone module 25 can collect the user's voice, so that the main board 23 can convert the voice into voice control instructions to realize various functions for controlling the display device 200'; the microphone module 25 can also collect environmental sound, so that the main board 23 can adjust the audio output of the audio output interface according to the environmental sound, allowing the display device 200' to adapt to environmental noise.
  • The microphone module 25 may correspond to the detector 230 in Fig. 3.
  • the display device 200 can be controlled without operating the control device 100.
  • the display device 200 may directly collect the voice uttered by the user through the sound collector 231 in FIG. 3 or the microphone module 25 in FIG. 4, and then convert the voice into a voice control instruction to perform a function corresponding to the voice control instruction.
  • One example is the voice wake-up function: when a display device in standby mode receives a voice control instruction containing a wake-up word from the user, the display device is awakened from the standby mode and enters the operating mode.
  • In the standby mode, the microphone module 25 and the MCU module 23b are always powered on (in the on state), and the SOC module 23a remains powered off (in the off state); in the running mode, the microphone module 25, the MCU module 23b, and the SOC module 23a are all kept powered on (in the on state).
  • Specifically, the microphone module 25 collects the user's voice data containing the wake-up word and transmits the voice data to the MCU module 23b; the MCU module 23b receives the voice data and, after local recognition processing, recognizes that the voice data contains the wake-up word; the MCU module 23b then determines from the recognition result that this is a voice wake-up instruction to wake up the display device, and triggers the SOC module 23a to enter the working state, so that the display device is awakened from the standby mode into the operating mode.
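  • A sketch of this standby-mode wake-up path is shown below; contains_wake_word() stands in for whatever local keyword-spotting the MCU module actually runs, and the wake word string is only an example.

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

static bool soc_powered;

/* placeholder for a small on-MCU keyword spotter; not defined by the patent */
static bool contains_wake_word(const char *voice_data)
{
    return strstr(voice_data, "hey tv") != NULL;
}

static void soc_enter_running_mode(void)
{
    soc_powered = true;   /* SOC triggered into the working state */
}

/* MCU module: called with each chunk of audio captured by the microphone module */
static void mcu_on_voice_data(const char *voice_data)
{
    if (contains_wake_word(voice_data))
        soc_enter_running_mode();   /* wake the display device from standby */
}

int main(void)
{
    mcu_on_voice_data("hey tv turn on");
    printf("SOC powered: %d\n", soc_powered);
    return 0;
}
```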
  • Another example is the screen capture function: when the display device in the running mode receives a voice control instruction for screenshots sent by the user, it will perform screenshot processing on the current screen of the display device.
  • Specifically, the microphone module 25 collects the user's voice data containing keywords related to screenshots and transmits the voice data to the MCU module 23b; the MCU module 23b transmits the voice data to the SOC module 23a.
  • The SOC module 23a asks the voice and semantic server to perform voice recognition and result conversion on the voice data, determines from the result returned by the voice and semantic server that it is a voice control instruction for taking a screenshot of the current screen displayed by the display device, and then performs the screenshot of the current screen.
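  • The running-mode path differs from the wake-up path in that recognition happens via the server; the sketch below uses recognize_on_server() and capture_current_screen() as placeholders, since the server protocol and screenshot handling are not specified by the application.

```c
#include <stdio.h>
#include <string.h>

/* placeholder for the round trip to the voice and semantic server */
static const char *recognize_on_server(const char *voice_data)
{
    (void)voice_data;
    return "screenshot";   /* canned result standing in for the server reply */
}

/* placeholder for the actual screen capture */
static void capture_current_screen(void)
{
    puts("screenshot of the current screen taken");
}

/* SOC module: handle voice data forwarded by the MCU while in running mode */
static void soc_on_voice_data(const char *voice_data)
{
    const char *command = recognize_on_server(voice_data);
    if (strcmp(command, "screenshot") == 0)
        capture_current_screen();
}

int main(void)
{
    soc_on_voice_data("take a screenshot");
    return 0;
}
```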
  • most or all system services of the display device in the standby mode are in a closed state.
  • the relevant components of the media output function of the display device in the standby mode are in the off state, for example, the audio output of the display device is off and the screen is off.
  • the display device is in a low power consumption working state, and the power consumed at this time is the standby power consumption.
  • Most or all of the system services in the display device in the running mode are in an on state, and it can work normally.
  • the relevant components of the media output function of the display device in the running mode are in the on state, the audio output is in the on state, and the screen is in the on state.
  • the standby mode and operating mode of the display device can be switched mutually.
  • the display device in the running mode when the user operates the power button on the control device or shuts down through voice control, the display device in the running mode enters the standby mode.
  • the display device in the standby mode when the user operates the power button on the control device or turns on the device through voice control, the display device in the standby mode enters the operating mode.
  • For a display device with the voice wake-up function, in standby mode the microphone module used to collect the user's voice data and the MCU module used to process the user's voice data containing the wake-up word must remain powered on in order to work, which makes the standby power consumption of the display device in standby mode relatively large.
  • Fig. 5 exemplarily shows another configuration block diagram of the display device.
  • the display device 200" may include a power board 21, a main board 23, a microphone module 25, and an environmental sound detection module 27. These components may correspond to some components in FIG. 3 to a certain extent.
  • The difference from Fig. 4 is that an environmental sound detection module 27 is added in Fig. 5.
  • The environmental sound detection module 27 is used to detect the level of the environmental sound and to control the turning on or off of the microphone module 25 and the MCU module 23b according to that level.
  • the environmental sound detection module 27 may correspond to the detector 230 in FIG. 3.
  • the microphone module 25 and the environmental sound detection module 27 in FIG. 5 are integrated and connected to the main board 23, but they can also be connected to the main board 23 separately, which is not specifically limited.
  • the specific functions of each module will be described in detail below.
  • the SOC module 23a is used to notify the MCU module 23b of the state of the display device entering the standby mode or the operating mode, so that the MCU module 23b controls the environmental sound detection module 27 to be turned on or off.
  • The MCU module 23b is used to control the turning on or off of the environmental sound detection module 27 and, at the same time, to recognize the voice data input by the user; when it recognizes that the voice data input by the user is a voice wake-up instruction, it triggers the SOC module 23a to control the display device to enter the running mode from the standby mode.
  • the microphone module 25 is used to collect voice data input by the user and transmit the voice data to the MCU module 23b. For example, voice data containing wake words.
  • The environmental sound detection module 27 is used to detect the level of the environmental sound and to control the microphone module 25 and the MCU module 23b to turn on or off according to that level, so as to control the standby power consumption, in standby mode, of the display device with the voice wake-up function.
  • In the first embodiment, when the display device is in the standby mode, the environmental sound detection module 27 is kept powered on (i.e. in the on state); the microphone module 25 and the MCU module 23b can alternately remain powered on (i.e. in the on state) or powered off (i.e. in the off state) according to whether the condition set by the environmental sound detection module 27 is met; and the SOC module 23a remains powered off (i.e. in the off state).
  • Specifically, the MCU module first controls the environmental sound detection module to turn on by powering it on.
  • The environmental sound detection module then controls the MCU module and the microphone module to turn off by powering them off; after that, it collects the external environmental sound, detects the level of the environmental sound, and determines whether that level is within the preset range, so as to control the turning on or off of the microphone module and the MCU module by powering them on or off.
  • When the environmental sound detection module determines that the level of the environmental sound is not within the preset range, it means that the current ambient sound scene is one in which the user probably does not need to watch the display device, i.e. the probability of the user waking the display device from standby in this scene is close to zero, so there is no need for the microphone module and the MCU module to keep working in this scene. The environmental sound detection module therefore keeps the microphone module and the MCU module off by powering them off, which greatly reduces the standby power consumption caused by keeping the microphone module and the MCU module powered on all the time.
  • When the environmental sound detection module determines that the level of the environmental sound is within the preset range, it means that the current ambient sound scene is one in which the user may need to watch the display device, i.e. the user has a higher probability of waking the display device from standby in this scene, so the microphone module and the MCU module need to keep detecting in this scene. The environmental sound detection module therefore controls the microphone module and the MCU module to turn on by powering them on, so as to detect whether the user issues a voice wake-up command to wake the display device into the operating mode.
  • the preset range can be set based on experience or experiment.
  • For example, the preset range may be 40 to 70 decibels; ambient levels such as 45 decibels, 55 decibels, or 65 decibels fall within it.
  • An environment in this range is typical of, for example, a family eating dinner together or relaxing after dinner; in such an environment the user has a relatively high probability of needing to wake the display device from standby. Below 40 decibels the environment is quiet, for example when everyone is asleep at night or no one is home during the day, and in such an environment the user is unlikely to wake the display device from standby. Above 70 decibels the environment is noisy, for example when friends are invited to a party during the day, and the probability of the user waking the display device from standby in such an environment is very small.
  • the power consumption caused by the power-on of the environmental sound detection module is much less than the power consumption caused by the power-on of the microphone module and the MCU module.
  • In this way, the microphone module and the MCU module are intermittently controlled to turn on or off together, so that part of the standby power consumption of the display device in standby mode can be controlled.
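  • A minimal sketch of this first embodiment follows, assuming a periodic polling entry point and two hardware helpers, read_ambient_db() and set_mic_mcu_power(), that the application does not define.

```c
#include <stdbool.h>
#include <stdio.h>

#define PRESET_MIN_DB 40   /* example preset range given above */
#define PRESET_MAX_DB 70

/* placeholder for the detector's sound-level front end (e.g. an ADC reading) */
static int read_ambient_db(void) { return 55; }

/* placeholder for switching the power rails of the microphone and MCU modules */
static void set_mic_mcu_power(bool on)
{
    printf("microphone + MCU power: %s\n", on ? "on" : "off");
}

/* called repeatedly by the environmental sound detection module in standby */
static void ambient_detector_poll(void)
{
    int db = read_ambient_db();
    bool in_range = (db >= PRESET_MIN_DB && db <= PRESET_MAX_DB);

    /* quiet (<40 dB, e.g. night) or noisy (>70 dB, e.g. a party) scenes:
     * keep the mic and MCU powered off to cut standby power; otherwise power
     * them on so a voice wake-up command can be caught */
    set_mic_mcu_power(in_range);
}

int main(void)
{
    ambient_detector_poll();
    return 0;
}
```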
  • In the second embodiment, when the display device is in the standby mode, the environmental sound detection module 27 can periodically remain powered on (i.e. in the on state) or powered off (i.e. in the off state) according to whether a predefined condition is met; the microphone module 25 and the MCU module 23b can alternately remain powered on (i.e. in the on state) or powered off (i.e. in the off state) according to whether the condition set by the environmental sound detection module 27 is met; and the SOC module 23a remains powered off (i.e. in the off state).
  • Specifically, the MCU module controls the environmental sound detection module to turn on by powering it on.
  • The environmental sound detection module then controls the MCU module and the microphone module to turn off by powering them off; it also starts timing, and controls whether it stays on or turns off according to whether the timed duration exceeds a set duration.
  • When the timed duration of the environmental sound detection module exceeds the set duration, it turns itself off by powering off and keeps the microphone module and the MCU module turned off, which can greatly reduce the standby power consumption caused by keeping the microphone module and the MCU module powered on all the time.
  • When the timed duration of the environmental sound detection module does not exceed the set duration, it keeps itself turned on; at the same time it collects the external ambient sound, detects its level, and determines whether that level is within the preset range, so as to control the microphone module and the MCU module to turn on or off by powering them on or off. In this way, the standby power consumption caused by keeping the microphone module and the MCU module continuously powered on can be further reduced.
  • That is, the environmental sound detection module controls the turning on or off of the microphone module and the MCU module according to whether the level of the environmental sound is within the preset range.
  • the specific implementation manner can refer to the first embodiment.
  • the set duration may be the duration of the environmental sound detection module being turned on once, for example, the set duration may be 60 minutes.
  • the setting time can be set according to experience and experiments.
  • the environmental sound detection module is periodically controlled to turn on or off, and the microphone module and the MCU module are further periodically controlled to turn on or off at the same time. Therefore, it is also possible to control part of the standby power consumption of the display device in the standby mode.
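  • A sketch of this second embodiment is given below, under the assumption of a once-per-minute tick; the 60-minute set duration is the example given above, and all function names are illustrative.

```c
#include <stdbool.h>
#include <stdio.h>

#define SET_DURATION_MIN 60

static bool detector_on = true;   /* detector was just powered on by the MCU */
static unsigned on_minutes;       /* minutes since the detector switched on  */

/* placeholders for the range check and the power-rail switching */
static bool ambient_in_preset_range(void) { return true; }
static void set_mic_mcu_power(bool on)  { printf("mic+MCU: %d\n", on); }
static void set_detector_power(bool on) { printf("detector: %d\n", on); }

/* called once per minute while the display device is in standby mode */
static void detector_minute_tick(void)
{
    if (!detector_on)
        return;

    if (++on_minutes > SET_DURATION_MIN) {
        /* the on-period is over: keep mic + MCU off and power down as well */
        set_mic_mcu_power(false);
        set_detector_power(false);
        detector_on = false;
        on_minutes = 0;
        return;
    }
    /* within the set duration: gate mic + MCU by the preset range,
     * exactly as in the first embodiment */
    set_mic_mcu_power(ambient_in_preset_range());
}

int main(void)
{
    detector_minute_tick();
    return 0;
}
```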
  • In the third embodiment, when the display device is in the standby mode, the microphone module 25, the MCU module 23b, and the environmental sound detection module 27 can alternately remain powered on (i.e. in the on state) or powered off (i.e. in the off state) according to whether the preset conditions are met, while the SOC module 23a remains powered off (i.e. in the off state).
  • the differences from the first and second embodiments include the following two aspects.
  • In the first aspect, after the environmental sound detection module judges that the environmental sound is within the preset range and controls the microphone module and the MCU module to turn on by powering them on, in order to further reduce the standby power consumption caused by the microphone module and the MCU module remaining powered on for a long time without detecting the user's voice, the MCU module starts timing once the MCU module and the microphone module are turned on.
  • When the timed duration exceeds the set duration, it means that the current scene is an ambient sound scene in which the user probably does not need to watch the display device, i.e. the probability of the user waking the display device from standby is very small and the user has made no further operation, such as not speaking. There is thus no need for the microphone module and the MCU module to keep working in this scene, so the MCU module controls the environmental sound detection module to turn on by powering it on, and after the environmental sound detection module is on, it controls the MCU module and the microphone module to turn off by powering them off.
  • In this way, the standby power consumption caused by the microphone module and the MCU module being powered on for a long time without detecting the user's voice can be further reduced.
  • When the timed duration does not exceed the set duration, it means that the current scene is an ambient sound scene in which the user may need to watch the display device, i.e. the user has a higher probability of needing to wake the display device from standby, so the microphone module and the MCU module need to keep detecting in this scene. The microphone module and the MCU module are therefore kept on and the environmental sound detection module is kept off, so that the microphone module is continuously ready to collect the voice input by the user, and the MCU module keeps recognizing the voice input by the user; when the voice is recognized as a voice wake-up command, the SOC module is triggered to control the display device to enter the running mode from the standby mode.
  • the set duration may be the duration of keeping the microphone module and the MCU module turned on when it is determined that the environmental sound is within the preset range.
  • the set duration may be 15 minutes.
  • the setting time can be set according to experience and experiments.
  • In the second aspect, when the environmental sound detection module judges that the level of the environmental sound is within the preset range and controls the microphone module and the MCU module to turn on by powering them on, so as to detect whether the user issues a voice wake-up command to wake the display device into the operating mode, the environmental sound detection module itself is controlled to turn off by powering it off, so as to reduce the standby power consumption caused by the environmental sound detection module being continuously powered on.
  • In this way, when the environmental sound detection module detects that the level of the environmental sound is within the preset range, the microphone module and the MCU module are further controlled to turn on or off together according to whether the duration for which they have been on exceeds the set duration, so that part of the standby power consumption of the display device in standby mode can also be controlled.
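  • A sketch of this third embodiment, from the MCU module's point of view, again assumes a once-per-minute tick; the 15-minute set duration is the example above, and all names are illustrative.

```c
#include <stdbool.h>
#include <stdio.h>

#define SET_DURATION_MIN 15

static unsigned mcu_on_minutes;   /* time since the detector powered the MCU on */

/* placeholders for local wake-word recognition and power-rail switching */
static bool wake_word_recognized(void)   { return false; }
static void soc_enter_running_mode(void) { puts("SOC: running mode"); }
static void set_detector_power(bool on)  { printf("detector: %d\n", on); }
static void set_mic_mcu_power(bool on)   { printf("mic+MCU: %d\n", on); }

/* called once per minute while the mic and MCU are on in standby mode */
static void mcu_minute_tick(void)
{
    if (wake_word_recognized()) {
        soc_enter_running_mode();        /* wake the display device */
        return;
    }
    if (++mcu_on_minutes > SET_DURATION_MIN) {
        /* no voice detected for the whole set duration: hand control back to
         * the ambient sound detector and power the mic + MCU off again */
        set_detector_power(true);
        set_mic_mcu_power(false);
        mcu_on_minutes = 0;
    }
}

int main(void)
{
    mcu_minute_tick();
    return 0;
}
```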
On the basis of the above-mentioned Embodiment 1, Embodiment 2 and Embodiment 3, when the display device is in the running mode, the microphone module 25, the MCU module 23b and the SOC module 23a all remain powered on (that is, in the on state), while the ambient sound detection module 27 remains powered off (that is, in the off state). Specifically, the SOC module notifies the MCU module that the display device has entered the running mode; the MCU module then powers off the environmental sound detection module, while the microphone module stays continuously ready to collect the voice input by the user. The MCU module continuously receives the voice input by the user and transmits it to the SOC module, which recognizes the voice control instruction corresponding to the voice and performs the function corresponding to that instruction. A sketch of this hand-off is given below.
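The sketch below shows how the running-mode hand-off could look on the MCU side. The names (power_off_env_detect, mic_read, send_audio_to_soc, display_in_running_mode) are hypothetical; the application only describes the behavior, not an interface. The point is simply that in the running mode the MCU stops environmental sound detection and forwards raw audio, leaving recognition of voice control instructions to the SOC module.

    /* Sketch of the running-mode hand-off (hypothetical names). The SOC
     * notifies the MCU that the display device entered the running mode;
     * the MCU then powers off the environmental sound detection module and
     * forwards microphone data to the SOC, which performs the actual
     * voice-command recognition. */

    #include <stdbool.h>
    #include <stdint.h>

    extern void power_off_env_detect(void);
    extern int  mic_read(uint8_t *buf, int max_len);        /* returns bytes read */
    extern void send_audio_to_soc(const uint8_t *buf, int len);
    extern bool display_in_running_mode(void);

    void on_soc_entered_running_mode(void)
    {
        power_off_env_detect();                   /* env detection not needed now */

        uint8_t frame[256];
        while (display_in_running_mode()) {
            int n = mic_read(frame, sizeof frame);
            if (n > 0) {
                send_audio_to_soc(frame, n);      /* SOC recognizes the commands */
            }
        }
    }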
In an exemplary implementation, the MCU module, the microphone module and the ambient sound detection module can each be powered on or off, and thus turned on or off, by trigger signals at high and low levels. Fig. 6 exemplarily shows a block diagram of the circuit configuration of each module in Fig. 5.
In Fig. 6, VCC CHECK is the power supply of the environmental sound detection module, VCC MIC is the power supply of the microphone module, and VCC MCU is the power supply of the MCU module; each of them remains at a high level while the display device is in the standby mode or the running mode.
When the display device enters the standby mode, the MCU module can control the CTRL2 output terminal to output a high level, which turns on the transistor Q1 so that the CTRL4 input of the logic module is pulled low; after processing by the logic module, the CTRL3 output terminal of the logic module outputs a high level. When the CTRL3 output terminal outputs a high level, the transistor Q2 is turned on, which closes the first switch module, so the environmental sound detection module is finally powered on. At the same time, the environmental sound detection module can control the CTRL1 output terminal to output a high level, so that the MCU module and the microphone module are powered off. That is, in the standby mode the ambient sound detection module is powered on and turned on, while the MCU module and the microphone module are powered off and turned off.
When the environmental sound detection module detects in the standby mode that the environmental sound is within the preset range, it can control the CTRL1 output terminal to output a low level, which turns off the transistor Q3 and thereby turns on the transistors Q4 and Q5; these in turn close the second switch module and the third switch module, so the MCU module and the microphone module are finally powered on. The MCU module can then control the CTRL2 output terminal to output a low level, which turns off the transistor Q1 so that the CTRL4 input of the logic module is pulled high; after processing by the logic module, the CTRL3 output terminal outputs a low level. When the CTRL3 output terminal outputs a low level, the transistor Q2 is turned off, which opens the first switch module, so the environmental sound detection module is finally powered off. That is, when the environmental sound detection module detects in the standby mode that the environmental sound is within the preset range, the MCU module and the microphone module are powered on and turned on, while the environmental sound detection module is powered off and turned off.
When the MCU module's timing duration in the standby mode exceeds the set duration, the MCU module can control the CTRL2 output terminal to output a high level, which makes the CTRL3 output terminal output a high level and finally powers on the environmental sound detection module; the environmental sound detection module can then control the CTRL1 output terminal to output a high level, which finally powers off the MCU module and the microphone module. That is, when the timing duration in the standby mode exceeds the set duration, the environmental sound detection module is powered on and turned on, while the MCU module and the microphone module are powered off and turned off.
When the display device enters the running mode, the environmental sound detection module can control the CTRL1 output terminal to output a low level, which finally powers on the MCU module and the microphone module; the MCU module can then control the CTRL2 output terminal to output a low level, which makes the CTRL3 output terminal output a low level and finally powers off the environmental sound detection module. That is, in the running mode the MCU module and the microphone module are powered on and turned on, while the ambient sound detection module is powered off and turned off.
In this example, each of the first switch module, the second switch module and the third switch module can be implemented with a MOS transistor and a load resistor, and the logic module can be implemented as a two-to-one data selector. A simplified model of the resulting level logic is sketched after this paragraph.
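The following truth-table sketch models the gating behavior just described so the level relationships are easier to follow. It is a simplified model of Fig. 6 (the logic module is reduced to an inverter of CTRL4, and component values are ignored), not a substitute for the schematic itself.

    /* Simplified model of the level logic described above. CTRL2 drives Q1,
     * which inverts the level seen at CTRL4; the logic module then drives
     * CTRL3, which through Q2 gates the first switch module (environmental
     * sound detection supply). CTRL1 drives Q3, which gates Q4/Q5 and hence
     * the second and third switch modules (MCU and microphone supplies). */

    #include <stdbool.h>
    #include <stdio.h>

    typedef struct {
        bool env_detect_powered;   /* first switch module closed  */
        bool mcu_powered;          /* second switch module closed */
        bool mic_powered;          /* third switch module closed  */
    } power_state_t;

    static power_state_t resolve_power(bool ctrl1_high, bool ctrl2_high)
    {
        power_state_t s;

        /* CTRL2 high -> Q1 on -> CTRL4 low -> CTRL3 high -> Q2 on -> env powered */
        bool ctrl4_high = !ctrl2_high;      /* Q1 inverts CTRL2                   */
        bool ctrl3_high = !ctrl4_high;      /* simplified view of the logic module */
        s.env_detect_powered = ctrl3_high;  /* Q2 + first switch module           */

        /* CTRL1 high -> Q3 on -> Q4/Q5 off -> MCU and microphone unpowered */
        bool q3_on = ctrl1_high;
        s.mcu_powered = !q3_on;             /* Q4 + second switch module */
        s.mic_powered = !q3_on;             /* Q5 + third switch module  */
        return s;
    }

    int main(void)
    {
        /* Standby: MCU sets CTRL2 high, env module sets CTRL1 high.        */
        power_state_t standby = resolve_power(true, true);
        /* Ambient sound in range / running mode: CTRL1 low, CTRL2 low.     */
        power_state_t listening = resolve_power(false, false);

        printf("standby   env=%d mcu=%d mic=%d\n",
               standby.env_detect_powered, standby.mcu_powered, standby.mic_powered);
        printf("listening env=%d mcu=%d mic=%d\n",
               listening.env_detect_powered, listening.mcu_powered, listening.mic_powered);
        return 0;
    }

Running the sketch prints "standby env=1 mcu=0 mic=0" and "listening env=0 mcu=1 mic=1", which matches the two module states described above.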
It should be noted that Fig. 6 only exemplarily shows the circuit configuration of each module in the display device as a preferred embodiment of this application; the circuit configuration is not specifically limited in actual applications, as long as the standby power consumption of the display device can be controlled. Compared with keeping the microphone module and the MCU module permanently turned on, intermittently turning the microphone module and the MCU module on and off together can greatly reduce the standby power consumption of the display device in the standby mode.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a display device. The display device comprises an environmental sound detection module, a microphone module and a chip processing module. The environmental sound detection module is configured to detect the magnitude of the environmental sound of the environment in which the display device is located while in standby mode, and to control, according to the magnitude of the environmental sound, the turning on or off of the microphone module and the chip processing module. The microphone module is configured to acquire voice data input by a user. The chip processing module is configured to recognize, from the voice data acquired by the microphone module, a voice wake-up instruction input by the user and to respond to it, so as to wake up the display device into a running mode.
PCT/CN2020/078061 2019-03-15 2020-03-05 Dispositif d'affichage WO2020187050A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910199335.3 2019-03-15
CN201910199335.3A CN111698544A (zh) 2019-03-15 2019-03-15 一种显示设备

Publications (1)

Publication Number Publication Date
WO2020187050A1 true WO2020187050A1 (fr) 2020-09-24

Family

ID=72475455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/078061 WO2020187050A1 (fr) 2019-03-15 2020-03-05 Dispositif d'affichage

Country Status (2)

Country Link
CN (1) CN111698544A (fr)
WO (1) WO2020187050A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112492393A (zh) * 2020-11-25 2021-03-12 海信视像科技股份有限公司 一种mic开关关联节能模式的实现方法及显示设备
CN113742003B (zh) * 2021-09-15 2023-08-22 深圳市朗强科技有限公司 一种基于fpga芯片的程序代码执行方法及设备
CN117130462A (zh) * 2023-03-20 2023-11-28 荣耀终端有限公司 一种设备控制方法及电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009122598A (ja) * 2007-11-19 2009-06-04 Pioneer Electronic Corp 電子機器、電子機器の制御方法、音声認識装置、音声認識方法及び音声認識プログラム
US20140278443A1 (en) * 2012-10-30 2014-09-18 Motorola Mobility Llc Voice Control User Interface with Progressive Command Engagement
CN104820556A (zh) * 2015-05-06 2015-08-05 广州视源电子科技股份有限公司 唤醒语音助手的方法及装置
CN105094816A (zh) * 2015-07-09 2015-11-25 北京君正集成电路股份有限公司 一种降低智能设备功耗的方法及智能设备
CN107528971A (zh) * 2017-08-10 2017-12-29 珠海市魅族科技有限公司 语音唤醒方法及装置、计算机装置及可读存储介质
CN109218899A (zh) * 2018-08-29 2019-01-15 出门问问信息科技有限公司 一种语音交互场景的识别方法、装置及智能音箱

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103021411A (zh) * 2012-11-27 2013-04-03 威盛电子股份有限公司 语音控制装置和语音控制方法


Also Published As

Publication number Publication date
CN111698544A (zh) 2020-09-22

Similar Documents

Publication Publication Date Title
WO2021109410A1 (fr) Procédé de réveil de dispositif et dispositif
CN113747232B (zh) 一种移动终端向显示设备推送媒体文件的方法及显示设备
WO2020244266A1 (fr) Procédé de commande à distance pour téléviseur intelligent, terminal mobile et téléviseur intelligent
CN111935518B (zh) 一种视频投屏方法及显示设备
WO2020187050A1 (fr) Dispositif d'affichage
WO2022073392A1 (fr) Procédé d'affichage d'image et dispositif d'affichage
WO2021203530A1 (fr) Dispositif d'affichage et procédé de distribution sélective d'émissions de télévision
WO2021164177A1 (fr) Procédé de lecture de ressource multimédia, dispositif d'affichage et terminal mobile
CN112153447B (zh) 一种显示设备及音画同步控制方法
WO2020258711A1 (fr) Procédé et dispositif de commande de rétroéclairage, et dispositif d'affichage
CN111836109A (zh) 显示设备、服务器及自动更新栏目框的方法
CN113938724A (zh) 显示设备及录屏分享方法
WO2022021669A1 (fr) Procédé pour la commande d'un mode d'image intelligent et dispositif d'affichage
WO2021189708A1 (fr) Procédé de mise en marche de protection d'écran pour dispositif d'affichage, et dispositif d'affichage
WO2021169168A1 (fr) Procédé de prévisualisation de fichier vidéo et dispositif d'affichage
US11550527B2 (en) Media file processing method for display device and display device
CN111954059A (zh) 屏保的展示方法及显示设备
WO2021031589A1 (fr) Dispositif d'affichage et procédé de réglage d'espace de gamme dynamique de couleurs
CN112272331B (zh) 一种节目频道列表快速展示的方法及显示设备
WO2021227232A1 (fr) Procédé d'affichage d'options de langue et d'options de pays, et dispositif d'affichage
WO2021196432A1 (fr) Procédé d'affichage et dispositif d'affichage pour un contenu correspondant à une commande
WO2020248699A1 (fr) Procédé de traitement du son et appareil d'affichage
WO2020147507A1 (fr) Dispositif d'affichage et procédé d'affichage
CN114302197A (zh) 一种语音分离控制方法及显示设备
WO2021189693A1 (fr) Procédé d'affichage de commande d'album, et dispositif d'affichage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20772838

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20772838

Country of ref document: EP

Kind code of ref document: A1