CN111918132B - Display device and multi-interface device judgment method - Google Patents


Info

Publication number
CN111918132B
CN111918132B (application CN202010731416.6A)
Authority
CN
China
Prior art keywords
interface
equipment
identification information
instruction
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010731416.6A
Other languages
Chinese (zh)
Other versions
CN111918132A
Inventor
李保成
姜俊厚
司洪龙
刘晋
吴汉勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202010731416.6A priority Critical patent/CN111918132B/en
Publication of CN111918132A publication Critical patent/CN111918132A/en
Application granted granted Critical
Publication of CN111918132B publication Critical patent/CN111918132B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04N21/44231 Monitoring of peripheral device or external card, e.g. to detect processing problems in a handheld device or the failure of an external recording device
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/42203 Input-only peripherals, i.e. input devices connected to specially adapted client devices: sound input device, e.g. microphone
    • H04N21/4223 Input-only peripherals: cameras
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142 Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources

Abstract

The multi-interface device judgment method can be configured in a controller of a display device. After an interface device is connected to the display device through an external device interface, the controller reads node identification information corresponding to the interface device in response to a detection instruction. When the node identification information belongs to a preset device type, the node name of the interface device is added to a callable device list, thereby filtering out non-standard interface devices. When the display device calls an interface device, it can search the callable device list cyclically, so that the interface device is accurately identified and called and the problem of interface-device start-up failure is alleviated.

Description

Display device and multi-interface device judgment method
Technical Field
The application relates to the technical field of smart televisions, in particular to a display device and a multi-interface device judgment method.
Background
A smart television is a television product based on Internet application technology. It has an open operating system and chip and an open application platform, supports bidirectional human-machine interaction, and integrates functions such as audio, video, entertainment, and data to meet users' diversified and personalized needs. Relying on a network and various signal sources, a smart television can provide users with a wide range of media resources and satisfy different viewing requirements. External devices can also be connected through interfaces; for example, a smart television may provide a USB interface through which an external camera can be attached.
Applications can call these external devices to implement richer television functions. For example, a smart television may be provided with applications such as "magic mirror" and "video call"; after such an application starts, it can call an external camera to capture images of the user, display the captured images on the screen, or send the captured image information over the network, thereby implementing the magic-mirror and video-call functions.
Some external devices integrate multiple functions and therefore expose multiple device nodes. For example, some external cameras integrate a microphone and register two device nodes: a standard camera node and a mic (microphone) node. As a result, when an application calls such an external device, it may fail to identify the correct node, and the device may fail to start.
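On a Linux-based smart-television platform, each function of such a multi-function camera typically registers its own video4linux node, whose human-readable description can be read from sysfs. The sketch below is an illustration of that situation, not code from the patent; the sysfs path and the helper name are assumptions.

```python
import os

def list_video_nodes(sysfs_root="/sys/class/video4linux"):
    """Return {node_name: description} for each entry under sysfs_root.

    On Linux, a multi-function USB camera typically registers several
    video nodes (e.g. video0 for capture, video1 for another function);
    each entry exposes a 'name' file describing the underlying device.
    """
    nodes = {}
    if not os.path.isdir(sysfs_root):
        return nodes
    for entry in sorted(os.listdir(sysfs_root)):
        name_file = os.path.join(sysfs_root, entry, "name")
        try:
            with open(name_file) as f:
                nodes[entry] = f.read().strip()
        except OSError:
            continue  # entry without a readable name file
    return nodes
```

Listing both nodes makes the ambiguity visible: an application that simply opens the first video node it finds may pick the wrong one, which is the start fault described above.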
Disclosure of Invention
The present application provides a display device and a multi-interface device judgment method, aiming to solve the problem that a conventional display device cannot accurately identify an external device.
In one aspect, the present application provides a display device comprising a display, an external device interface, and a controller. The display is configured to display a user interface; the external device interface is configured to connect an interface device; and the controller is configured to perform the following steps:
acquiring a detection instruction for detecting the interface device;
in response to the detection instruction, reading node identification information corresponding to the interface device; and
if the node identification information belongs to a preset device type, adding the node name of the interface device to a callable device list.
With the display device provided in the first aspect of the present application, after an interface device is connected through the external device interface, the controller reads the node identification information corresponding to the interface device in response to a detection instruction, and adds the node name of the interface device to the callable device list when the node identification information belongs to a preset device type, thereby filtering out non-standard interface devices. When the display device needs to call an interface device, it can search the callable device list cyclically, so that the interface device is accurately identified and called and start-up failures are avoided.
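The three controller steps above can be sketched as follows. The preset type strings and the shape of the node-identification map are assumptions for illustration; a real controller would match whatever identification strings the platform's drivers report.

```python
# Assumed preset device types (hypothetical values, not from the patent).
PRESET_DEVICE_TYPES = ("camera",)

def build_callable_list(node_info):
    """Build the callable device list from node identification info.

    node_info maps a node name (e.g. 'video0') to the identification
    string read for that node. Only nodes whose identification matches
    a preset device type are kept, filtering out non-standard nodes
    such as the mic node of a multi-function camera.
    """
    callable_devices = []
    for node_name, ident in node_info.items():
        if any(t in ident.lower() for t in PRESET_DEVICE_TYPES):
            callable_devices.append(node_name)
    return callable_devices
```

Filtering at detection time, rather than at call time, means applications never see the non-standard nodes at all.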
Based on the foregoing display device, another aspect of the present application provides a multi-interface device judgment method, comprising:
acquiring a detection instruction for detecting the interface device;
in response to the detection instruction, reading node identification information corresponding to the interface device; and
if the node identification information belongs to a preset device type, adding the node name of the interface device to a callable device list.
As can be seen from the foregoing technical solutions, the multi-interface device judgment method provided in the second aspect of the present application runs in the controller of a display device. After the controller receives a detection instruction for detecting an interface device, it reads the node identification information corresponding to that device. If the node identification information belongs to a preset device type, the node name of the interface device is added to the callable device list, so that when an interface device is called, the callable device list can be searched cyclically to find and call the corresponding device, avoiding start-up failures.
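The call-time behaviour, cycling through the callable device list until a device actually starts, might look like the following sketch. The injected try_open callback stands in for real device access and is an assumption of this example, not part of the patent's disclosure.

```python
def call_interface_device(callable_devices, try_open):
    """Search the callable device list cyclically and return the first
    node that starts successfully.

    try_open(node_name) should return a handle for the opened device
    and raise OSError on a start fault; injecting it keeps the sketch
    testable without real hardware.
    """
    for node_name in callable_devices:
        try:
            return node_name, try_open(node_name)
        except OSError:
            continue  # this node failed to start; try the next one
    raise RuntimeError("no callable interface device could be started")
```

Because the list already excludes non-standard nodes, the cyclic search only ever probes candidates of the preset device type.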
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; it will be apparent to those skilled in the art that other drawings can be derived from these drawings without creative effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus in an embodiment of the present application;
fig. 2 is a block diagram of a hardware configuration of a display device in an embodiment of the present application;
fig. 3 is a block diagram of a hardware configuration of a control device in an embodiment of the present application;
FIG. 4 is a schematic diagram of a software configuration of a display device in an embodiment of the present application;
fig. 5 is a schematic display diagram of an icon control interface of an application program of a display device in an embodiment of the present application;
fig. 6 is a schematic flow chart illustrating a method for determining a multi-interface device according to an embodiment of the present application;
fig. 7 is a schematic flowchart of reading node identification information in the embodiment of the present application;
FIG. 8 is a schematic flowchart illustrating a process of sending a read command to a device node according to an embodiment of the present application;
FIG. 9 is a schematic flowchart illustrating a process of invoking an interface device in an embodiment of the present application;
fig. 10 is a schematic view of a connection structure between a display device and an interface device in the embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following description, the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following examples do not represent all implementations consistent with the present application; they are merely examples of systems and methods consistent with certain aspects of the application, as recited in the claims.
To make the objects, embodiments, and advantages of the present application clearer, exemplary embodiments of the present application are described below with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only a part, not all, of the embodiments of the present application.
All other embodiments that a person skilled in the art can derive from the exemplary embodiments described herein without inventive effort are intended to fall within the scope of the appended claims. In addition, while the disclosure herein is presented in terms of one or more exemplary examples, it should be appreciated that individual aspects of the disclosure may each be implemented as a complete embodiment on their own.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and are not necessarily intended to limit the order or sequence of any particular one, Unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the application are, for example, capable of operation in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as the display device disclosed in this application) that is typically wirelessly controllable over a relatively short range of distances. Typically using infrared and/or Radio Frequency (RF) signals and/or bluetooth to connect with the electronic device, and may also include WiFi, wireless USB, bluetooth, motion sensor, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and the display device 200 is controlled wirelessly or by other wired methods. The user may input user commands through keys on the remote controller, voice input, control panel input, etc., to control the display apparatus 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc., to control the display device 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 may install a software application with the display device 200 to implement connection communication through a network communication protocol for the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 300 and the display device 200 can establish a control instruction protocol, synchronize a remote control keyboard to the mobile terminal 300, and control the display device 200 by controlling a user interface on the mobile terminal 300. The audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200, so as to realize the synchronous display function.
As also shown in fig. 1, the display apparatus 200 performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. Illustratively, the display device 200 receives software program updates or accesses a remotely stored digital media library by sending and receiving information and through Electronic Program Guide (EPG) interactions. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers. The server 400 also provides other web service contents such as video on demand and advertisement services.
The display device 200 may be a liquid crystal display, an OLED display, or a projection display device. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
In addition to the broadcast-receiving television function, the display apparatus 200 may additionally provide smart network television functions with computer support, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2.
In some embodiments, at least one of the controller 250, the tuner demodulator 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, a display 275 receives image signals originating from the first processor output and displays video content and images and components of the menu manipulation interface.
In some embodiments, the display 275, includes a display screen assembly for presenting a picture, and a driving assembly that drives the display of an image.
In some embodiments, the displayed video content may come from broadcast television content or from various broadcast signals received via wired or wireless communication protocols. Alternatively, various image contents sent from a network server and received via a network communication protocol can be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
In some embodiments, a driver assembly for driving the display is also included, depending on the type of display 275.
In some embodiments, display 275 is a projection display and may also include a projection device and a projection screen.
In some embodiments, the communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example, the communicator may include at least one of a WiFi chip, a Bluetooth communication protocol chip, a wired Ethernet communication protocol chip, other network or near-field communication protocol chips, and an infrared receiver.
In some embodiments, the display apparatus 200 may establish control signal and data signal transmission and reception with the external control apparatus 100 or the content providing apparatus through the communicator 220.
In some embodiments, the user interface 265 may be configured to receive infrared control signals from a control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component used by the display device 200 to collect signals from the external environment or to interact with the outside.
In some embodiments, the detector 230 includes a light receiver, i.e., a sensor for collecting the intensity of ambient light, so that display parameters can be adapted to changes in the ambient light.
In some embodiments, the detector 230 may further include an image collector, such as a camera, which may be used to collect the external environment scene and the attributes of the user or the gestures used in interaction, adaptively change display parameters, and recognize user gestures, so as to implement interaction with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the display apparatus 200 may adaptively adjust the display color temperature of an image. For example, when the ambient temperature is high, the display device 200 may be adjusted to display images with a cooler color temperature; when the temperature is low, it may be adjusted to display a warmer tone.
In some embodiments, the detector 230 may also include a sound collector, such as a microphone, which may be used to receive the user's voice. Illustratively, it receives a voice signal containing a control instruction for controlling the display device 200, or collects ambient sound for recognizing the type of the ambient scene, so that the display device 200 can adapt to ambient noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to allow data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, or command instruction data, etc.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: the interface can be any one or more of a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port and the like. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the tuner demodulator 210 is configured to receive broadcast television signals by wired or wireless reception, perform modulation and demodulation processing such as amplification, mixing, and resonance, and demodulate, from the plurality of wireless or wired broadcast television signals, the audio/video signal carried on the television channel frequency selected by the user, as well as the EPG data signal.
In some embodiments, the frequency points demodulated by the tuner demodulator 210 are controlled by the controller 250; the controller 250 can send control signals according to the user's selection, so that the tuner demodulator responds to the television signal frequency selected by the user and demodulates the television signal carried on that frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to the broadcasting system of the television signal. Or may be classified into a digital modulation signal, an analog modulation signal, and the like according to a modulation type. Or the signals are classified into digital signals, analog signals and the like according to the types of the signals.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in separate devices; that is, the tuner demodulator 210 may be in a device external to the main device where the controller 250 is located, such as an external set-top box. The set-top box demodulates the received broadcast television signal and outputs the resulting television audio/video signal to the main device, which receives it through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
As shown in fig. 2, the controller 250 includes at least one of a Random Access Memory (RAM) 251, a Read-Only Memory (ROM) 252, a video processor 270, an audio processor 280, other processors 253 (e.g., a Graphics Processing Unit (GPU)), a Central Processing Unit (CPU) 254, and a communication interface, as well as a communication bus 256 that connects these components.
In some embodiments, the RAM 251 is used to store temporary data for the operating system or other running programs. In some embodiments, the ROM 252 is used to store instructions for various system boots.
In some embodiments, the ROM 252 is used to store a Basic Input Output System (BIOS), which completes the power-on self-test of the system, initializes each functional module in the system, provides drivers for basic input/output, and boots the operating system.
In some embodiments, upon receiving a power-on signal, the display device 200 starts up, and the CPU executes the system boot instructions in the ROM 252 and copies the temporary data of the operating system stored in memory to the RAM 251 in order to start or run the operating system. After the operating system has started, the CPU copies the temporary data of the various application programs in memory to the RAM 251, and the applications are then started or run.
In some embodiments, processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some demonstrative embodiments, the processor 254 may include a plurality of processors, comprising a main processor and one or more sub-processors: the main processor performs some operations of the display apparatus 200 in a pre-power-up mode and/or displays a picture in normal mode, while the one or more sub-processors perform operations in a standby mode or the like.
In some embodiments, the graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit so they can be displayed on the display.
In some embodiments, video processor 270 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used to demultiplex the input audio/video data stream; for example, for an input MPEG-2 stream, the demultiplexing module demultiplexes it into a video signal, an audio signal, and so on.
The video decoding module is used to process the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module is used to superimpose and mix the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is used to convert the input video frame rate, such as converting a 60Hz frame rate into a 120Hz or 240Hz frame rate; a common approach is frame interpolation.
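As an illustrative sketch of the frame interpolation approach mentioned above (not the actual FRC hardware algorithm; for simplicity, frames are represented here as flat lists of pixel intensities, and the inserted frame is a plain linear blend of its neighbors):

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames linearly; t=0.5 gives the midpoint frame."""
    return [a + t * (b - a) for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    """Convert e.g. a 60Hz sequence to 120Hz by inserting a blended
    frame between each pair of consecutive source frames."""
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        out.append(interpolate(cur, nxt))
    out.append(frames[-1])
    return out

frames_60hz = [[0, 0], [10, 20], [20, 40]]
print(double_frame_rate(frames_60hz))
# [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```

Real FRC hardware uses motion-compensated interpolation rather than blind blending, but the frame-count bookkeeping is the same.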
The display formatting module is used to convert the received frame-rate-converted video output signal into a signal that conforms to the display format, such as an output RGB data signal.
In some embodiments, the graphics processor 253 and the video processor may be integrated or separately configured. When they are integrated, they can jointly process the graphics signals output to the display; when they are separately configured, they perform different functions respectively, for example in a GPU+FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification processes to obtain an audio signal that can be played in a speaker.
In some embodiments, video processor 270 may comprise one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller in one or more chips.
In some embodiments, the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280. In addition to the speaker 286 carried by the display device 200 itself, the audio output may include an external sound output terminal that can output to a sound-generating device of an external apparatus, such as an external sound interface or an earphone interface, and may also include a near field communication module in the communication interface, such as a Bluetooth module for outputting sound through a Bluetooth speaker.
The power supply 290 supplies power to the display apparatus 200 from the input of an external power source under the control of the controller 250. The power supply 290 may include a built-in power supply circuit installed inside the display apparatus 200, or may be a power supply interface installed outside the display apparatus 200 that provides external power to the display apparatus 200.
A user interface 265 for receiving an input signal of a user and then transmitting the received user input signal to the controller 250. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
In some embodiments, the user inputs a user command through the control apparatus 100 or the mobile terminal 300, the user input interface receives the user input, and the display device 200 responds to it through the controller 250.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 stores various software modules for driving the display device 200, such as the various software modules stored in the first memory, including at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The base module is a bottom-layer software module for signal communication between the various hardware components in the display device 200 and for sending processing and control signals to the upper-layer modules. The detection module is used to collect various information from the various sensors or user input interfaces, and the management module is used to perform digital-to-analog conversion and analysis management.
For example, the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is used to control the display to present image content and can play multimedia image content, UI interfaces, and other information. The communication module is used for control and data communication with external equipment. The browser module is used for data communication with browsing servers. The service modules are used to provide various services, including various application programs. Meanwhile, the memory 260 may also store visual effect maps for receiving external data and user data, images of various items in various user interfaces, a focus object, and the like.
Fig. 3 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 3, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply source.
The control device 100 is configured to control the display device 200: it receives a user's input operation instruction and converts the operation instruction into an instruction that the display device 200 can recognize and respond to, serving as an interaction intermediary between the user and the display device 200. For example, when the user operates the channel up/down keys on the control device 100, the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display apparatus 200 according to user demands.
In some embodiments, as shown in fig. 1, a mobile terminal 300 or other intelligent electronic device may function similarly to the control device 100 after installing an application that manipulates the display device 200. For example, after installing such an application, the user may implement the functions of the physical keys of the control device 100 through the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or other intelligent electronic device.
The controller 110 includes a processor 112, a RAM 113, a ROM 114, a communication interface 130, and a communication bus. The controller is used to control the operation of the control device 100, the communication cooperation between the internal components, and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip 131, a bluetooth module 132, an NFC module 133, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and other input interfaces. For example, the user can input user instructions through voice, touch, gesture, pressing, and other actions; the input interface converts the received analog signal into a digital signal, converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared transmitting module. As another example, when the radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an input-output interface 140. The control device 100 is provided with a communication interface 130, such as a WiFi, Bluetooth, or NFC module, which may encode the user input command according to the WiFi, Bluetooth, or NFC protocol and send it to the display device 200.
A memory 190 stores various operation programs, data, and applications for driving and controlling the control apparatus 100 under the control of the controller. The memory 190 may store various control signal commands input by the user.
A power supply 180 provides operational power support to the various elements of the control device 100 under the control of the controller; it may be a battery and associated control circuitry.
In some embodiments, the system may include a kernel, a command parser (shell), a file system, and application programs. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and interprocess communication (IPC) are run and maintained. After the kernel is started, the shell and the user application programs are loaded. An application program is compiled into machine code after being started, forming a process.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application program runs in the application program layer, and the application programs can be Window (Window) programs carried by an operating system, system setting programs, clock programs, camera applications and the like; or may be an application developed by a third party developer such as a hi program, a karaoke program, a magic mirror program, or the like. In specific implementation, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which is not limited in this embodiment of the present application.
The framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions and acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application program can access the resources in the system and obtain the services of the system during execution.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider, and the like, where the manager includes at least one of the following modules: an Activity Manager, used to interact with all activities running in the system; a Location Manager, used to provide system services or applications with access to the system location service; a Package Manager, used to retrieve various information about the application packages currently installed on the device; a Notification Manager, used to control the display and clearing of notification messages; and a Window Manager, used to manage the icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the life cycle of each application program and the usual navigation back functions, such as controlling the exit of an application program (including switching the user interface currently displayed in the display window to the system desktop), opening, and going back (including switching the user interface currently displayed in the display window to the previous user interface), and the like.
In some embodiments, the window manager is configured to manage all window processes, such as obtaining a display size, determining whether a status bar is available, locking a screen, intercepting a screen, controlling a display change (e.g., zooming out, dithering, distorting, etc.) and the like.
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer; when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and so on.
In some embodiments, the kernel layer further comprises a power driver module for power management.
In some embodiments, software programs and/or modules corresponding to the software architecture of fig. 4 are stored in the first memory or the second memory shown in fig. 2 or 3.
In some embodiments, taking the magic mirror application (a photographing application) as an example: when the remote control receiving device receives a remote control input operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the input operation into a raw input event (including information such as the value of the input operation and its timestamp) and stores it at the kernel layer. The application framework layer obtains the raw input event from the kernel layer, identifies the control corresponding to the event according to the current position of the focus, and treats the input operation as a confirmation operation. The control corresponding to the confirmation operation is the control of the magic mirror application icon, so the magic mirror application calls an interface of the application framework layer to start itself, and then calls the kernel layer to start the camera driver, so that a static image or a video is captured through the camera.
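The event flow described above can be sketched roughly as follows. The `FOCUS_MAP` table, key names, and control names are hypothetical placeholders for illustration, not the actual framework API:

```python
import time

def make_raw_event(value):
    """Kernel layer: package an input operation as a raw input event
    carrying its value and timestamp."""
    return {"value": value, "timestamp": time.time()}

# Hypothetical focus table: current focus position -> control under it.
FOCUS_MAP = {(2, 3): "magic_mirror_icon"}

def dispatch(event, focus_pos, launched):
    """Framework layer: resolve the focused control and, on a confirm
    key over the magic mirror icon, start the application (which would
    then ask the kernel layer to start the camera driver)."""
    control = FOCUS_MAP.get(focus_pos)
    if event["value"] == "KEY_OK" and control == "magic_mirror_icon":
        launched.append("magic_mirror")
    return launched

print(dispatch(make_raw_event("KEY_OK"), (2, 3), []))  # ['magic_mirror']
```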
In some embodiments, for a display device with a touch function, taking a split-screen operation as an example: the display device receives an input operation (such as a split-screen operation) performed by the user on the display screen, and the kernel layer may generate a corresponding input event according to the input operation and report the event to the application framework layer. The activity manager of the application framework layer sets the window mode (such as a multi-window mode), the window position and size, and so on, corresponding to the input operation. The window management of the application framework layer draws a window according to the settings of the activity manager, then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the corresponding application interfaces in different display areas of the display screen.
In some embodiments, as shown in fig. 5, the application layer containing at least one application may display a corresponding icon control in the display, such as: the system comprises a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, a video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from the server side of cloud storage or from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various applications for multimedia content playback. For example, a media center may provide services other than live television or video on demand, allowing a user to access various images or audio through the media center application.
In some embodiments, an application center may provide storage for various applications. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
Based on the above-described display apparatus 200, a user can access various interface apparatuses 500 through the external device interface 240 of the display apparatus 200. The interface apparatus 500 may be an apparatus having a specific function, and the interface apparatus 500 may be divided into an input device and an output device according to the functions thereof. For example, a camera, a microphone, and other devices capable of acquiring video and audio signals are input devices; the equipment capable of outputting audio and video signals, such as a sound box, an external display and the like, is an output device.
The input device may generate, based on the acquired signal, an electrical signal that can be read and/or processed by the display apparatus 200, and transmits the collected electrical signal to the display apparatus 200 through the external device interface 240, so that the display apparatus 200 can implement various functions using the signal of the interface apparatus 500. The output device can convert an electrical signal in the display apparatus 200 into other signals for output, so as to enrich the ways in which audio and video signals are output.
In some embodiments of the present application, the same interface device 500 may also include multiple functions, which may be independent or associated with each other. The interface device 500 may have a plurality of functions such that the interface device 500 functions as an output device or as an input device. For example, for a part of the camera apparatuses, it is possible to have an audio input function in addition to the camera function, i.e., to integrate a lens module and a microphone module in the camera apparatus. After the camera device is connected to the display device 200 through the external device interface 240, it is equivalent to simultaneously connecting two external devices, i.e., a camera and a microphone, to the display device 200.
In the embodiment of the present application, each functional module included in the interface device 500 may serve as a device node. For example, the camera device may include two video nodes, i.e., a standard camera device node and a mic node. The number of device nodes corresponding to the interface device 500 is different according to the functions integrated by the interface device 500, and in the embodiment of the present application, such an interface device 500 having multiple functions is referred to as a multi-interface device.
Since an interface apparatus 500 having a plurality of functions generally accesses the display apparatus 200 through one external device interface 240, the device node type cannot easily be recognized when the display apparatus 200 calls the interface apparatus 500. For example, when a "magic mirror" application is installed in the display device 200, the magic mirror application may capture a user image by invoking the camera for display on the display 275 of the display device 200. However, since the camera device node and the mic node in the camera device are both identified as video nodes, the microphone may be called by mistake when the magic mirror application calls the camera, and thus the images collected by the camera device cannot be obtained.
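The ambiguity can be illustrated with a minimal sketch. Assuming both modules register as `/dev/video*` nodes (the node paths and table below are hypothetical), a naive caller that simply takes the first video node it finds depends entirely on enumeration order, which is not guaranteed:

```python
# Hypothetical node table for a camera accessory that integrates a
# lens module and a microphone: both register as /dev/video* nodes.
DEVICE_NODES = [
    {"node": "/dev/video0", "module": "lens"},
    {"node": "/dev/video1", "module": "microphone"},
]

def open_first_video_node(nodes):
    """Naive caller: grab the first /dev/video* node it finds."""
    for n in nodes:
        if n["node"].startswith("/dev/video"):
            return n
    return None

# If enumeration happens to list the mic's node first, an image-capture
# caller ends up holding the wrong module:
picked = open_first_video_node(list(reversed(DEVICE_NODES)))
print(picked["module"])  # microphone
```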
Therefore, in order to accurately call the interface device 500, in some embodiments of the present application, a multi-interface device determination method is provided, which may be applied to the controller 250 of the display device 200 to achieve accurate identification of the interface device 500. As shown in fig. 6, the judging method includes the steps of:
S1: a detection instruction for detecting the interface device is acquired.
In order to call the interface apparatus 500 to implement different functions, after the interface apparatus 500 is connected to the external device interface 240 of the display apparatus 200, the controller 250 may detect the interface apparatus 500 connected to the external device interface 240 periodically or in case of user's trigger.
The controller 250 may perform the detection of the interface device 500 by acquiring the detection instruction. The detection instruction may be actively input by the user or automatically generated under certain operating conditions. For example, for an external device interface 240 supporting hot plug, the detection instruction may be automatically generated, and the detection program executed, upon detecting that the user has connected the interface apparatus 500.
Obviously, the present application is not limited to the above-mentioned generation manner of the detection instruction, and in practical applications, a plurality of different detection instruction generation conditions may be set according to the application requirements of the display device 200, and when it is detected that the display device 200 satisfies the conditions, the detection instruction is generated. In general, the set generation condition may be related to a calling procedure of the interface device 500, and for example, the generation condition may be set as whether or not a specific application is started. And the specific application may be the application that needs to call the interface device 500.
S2: and responding to the detection instruction, and reading the node identification information corresponding to the interface equipment.
After the controller 250 acquires the detection instruction, the node identification information corresponding to the interface device 500 may be read according to the acquired detection instruction. The node identification information is information that can distinguish the corresponding functions of the device node. For example, by reading the node identification information, the device nodes in the camera device are respectively marked as "camera 01" and "mic 01", and by marking the device nodes, the functions corresponding to the device nodes can be determined to be a camera and a microphone respectively.
The node identification information may be readable hardware parameter information of the corresponding functional module, or information of a type of a signal processed by the corresponding functional module. For example, if the interface device 500 connected to the display device 200 is a camera device with a lens module and a microphone module, after receiving the detection instruction, the controller 250 may detect module hardware information in the interface device 500, and when the lens module is detected, mark a device node corresponding to the lens module; similarly, when the microphone module is detected, the device node corresponding to the microphone module is marked.
For the standard interface device 500, the detection of the hardware module information by the controller 250 may be determined by information of a module factory mark or an interface and a wiring type corresponding to the module. For some non-standard interface devices 500, it is often difficult to determine the node identification information through factory information or interface types, so in some embodiments of the present application, the node identification information may also be determined by detecting signal characteristics transmitted by the interface device 500. For example, for a microphone module, the type of signal collected by the microphone may be detected, and if the type of signal collected is detected to be an audio signal, the microphone module is correspondingly marked as an audio input device.
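A minimal sketch of classifying a node by the type of signal it transmits, as described above. The `signal_sample` structure and the type labels are hypothetical, and the actual signal probe is assumed to exist elsewhere:

```python
def classify_node(signal_sample):
    """Label a device node by the type of signal it produces.
    The probing that fills in signal_sample happens elsewhere."""
    if signal_sample["kind"] == "audio":
        return "audio_input_device"
    if signal_sample["kind"] == "video":
        return "image_capture_device"
    return "unknown"

print(classify_node({"kind": "audio"}))  # audio_input_device
```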
S3: and if the node identification information belongs to the preset equipment type, adding the node name of the interface equipment to an invokable equipment list.
After the node identification information corresponding to the interface device is read, the read node identification information can be matched, and the type of the preset device to which the node identification information belongs is determined. The preset device type can be determined according to the function to be called in the practical application. For example, a "magic mirror" application typically only calls a lens module for image capture, and does not call a microphone module, so for a magic mirror application, the device type may be preset as an image capture device.
By matching the node identification information with the preset device type, the device nodes in the interface device 500 may be filtered, thereby determining the device nodes that can be invoked. For facilitating subsequent calls, the node name of the interface device may be added to the list of callable devices after determining that the node identification information belongs to the preset device type.
For example, if the preset device type corresponding to the magic mirror application is the image acquisition device, matching determines that the device node corresponding to the lens module belongs to the image acquisition device type while the device node corresponding to the microphone module does not. The device node name corresponding to the lens module in the interface device can then be added to the list of callable devices. When the magic mirror application is started, it searches for the camera equipment in the corresponding callable device list, so the lens module in the camera equipment can be called, the microphone module cannot be called by mistake, and the normal operation of the magic mirror application is ensured.
According to the above technical solution, after the detection instruction is obtained, the multi-interface device determining method may read the node identification information corresponding to each function module in the interface device 500 according to the detection instruction, so as to determine the device type to which the function module belongs, and add the node name belonging to the preset device type to the list of the callable devices, so as to be used for searching when the interface device 500 is called subsequently, thereby alleviating the problem of starting failure of the interface device 500.
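Steps S2 and S3 can be sketched as a simple filter over the read node identification information. The node records and type names below are hypothetical illustrations, not the actual data structures of the display device:

```python
def build_callable_list(device_nodes, preset_type):
    """S2 + S3: walk each node's identification information and keep
    only the node names whose type matches the preset device type."""
    callable_devices = []
    for node in device_nodes:
        if node["type"] == preset_type:          # S3: type match
            callable_devices.append(node["name"])
    return callable_devices

# Hypothetical camera accessory exposing a lens node and a mic node.
nodes = [
    {"name": "camera01", "type": "image_capture_device"},
    {"name": "mic01", "type": "audio_input_device"},
]
print(build_callable_list(nodes, "image_capture_device"))  # ['camera01']
```

An application such as the magic mirror would then look up its device only in this filtered list rather than scanning all nodes of the interface device.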
According to the above embodiment, the display device 200 may read the node identification information corresponding to the interface device 500 in response to the detection instruction after receiving the detection instruction. The detection instruction may be automatically generated when the operation condition of the display device 200 satisfies a certain condition. Specifically, in some embodiments, the detection instruction may be generated when the interface apparatus 500 accesses the external device interface 240, that is: the step of obtaining a detection instruction for detecting the interface device further comprises:
S111: acquiring a trigger signal generated when the interface equipment accesses the external device interface;
S112: responding to the trigger signal, executing software configuration on the interface equipment, and executing the reading of the node identification information corresponding to the interface equipment.
When the interface apparatus 500 is connected to the external device interface 240, the power distribution on the main board of the display apparatus 200 is changed, so that the controller 250 can detect the connection state on the external device interface 240 by monitoring the power distribution change. When the interface device 500 is connected to the external device interface 240, a trigger signal is automatically generated, and a detection instruction is generated according to the trigger signal to determine a multi-interface device.
Since both the trigger signal and the detection instruction are processed directly in the controller 250, when the controller 250 detects that an interface device 500 is connected to the external device interface 240, it may directly execute step S2, that is, read the node identification information corresponding to the interface device.
In practical applications, some interface devices 500 require the display device 200 to perform software configuration on them before they can be used normally, for example installing a driver or setting a data transmission protocol. To this end, after the trigger signal is acquired, software configuration may be performed on the interface device in response to the trigger signal, so that the connected interface device 500 can be called.
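The hot-plug handling of steps S111 and S112 can be sketched as follows. This is a minimal Python illustration only; the names (`HotplugMonitor`, `configure`, `read_nodes`) and the callback structure are assumptions, not identifiers from the patent:

```python
class HotplugMonitor:
    """Reacts to power-distribution changes on the external device interface."""

    def __init__(self, configure, read_nodes):
        self.configure = configure    # step S112: e.g. install driver, set protocol
        self.read_nodes = read_nodes  # step S2: read node identification information
        self.connected = set()        # ports already configured

    def on_power_distribution_change(self, port, attached):
        """Called when the controller detects a power change on an interface port."""
        if attached and port not in self.connected:
            self.connected.add(port)
            self.configure(port)           # software configuration first
            return self.read_nodes(port)   # then read node identification info
        if not attached:
            self.connected.discard(port)   # forget the port on detach
        return None
```

Each port is configured exactly once per attachment, and detaching clears the recorded state so a later re-attachment is configured again.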
It can be seen that, in this embodiment, each time an interface device 500 is connected, the connected device may be detected and the device type of each device on the external device interface 240 determined, so as to maintain the callable device list. The display device 200 can subsequently call the interface device 500 according to this list, thereby alleviating erroneous calls when multiple interface devices are connected.
In some embodiments, the step of obtaining the detection instruction for detecting the interface device further includes:
s121: acquiring the application program starting instruction;
s122: and starting and operating the application program corresponding to the application program starting instruction, and executing the step of reading the node identification information corresponding to the interface equipment.
In practical applications, a user may perform interactive operations in the user interface through the control apparatus 100 to control the display device 200 to start running any application program. For example, the user may move the focus cursor in the application interface with the "up, down, left, and right" keys of the control apparatus 100 to select the icon of the application to be started, and then start and run the selected application with the "OK" key on the control apparatus 100. When the user presses the "OK" key while the focus cursor is on an application icon, the user has input an application program start instruction. After receiving the application program start instruction, the controller 250 may, on the one hand, run the corresponding application program and, on the other hand, perform the step of reading the node identification information corresponding to the interface device.
In this embodiment, the started application may be one that needs to call the interface device 500. For such applications, the components built into the display device 200 often cannot meet the functional requirements, or meet them only poorly. For example, the display device 200 may only be able to output video or audio signals but not capture them, so when an application program such as "magic mirror" or "video call" runs on the display device 200, an external interface device 500 needs to be called to implement its function.
To this end, an application list may be preset in the display device 200, recording the package names of the applications that need to call the interface device 500. When the user controls the display device 200 to start any application, the controller 250 may match the package name of the started application against the application list. If the match shows that the started application does not need to call the interface device 500, the corresponding program is started and run directly; if the match shows that it does, then in response to the application start instruction, while the corresponding application is started and run, the step of reading the node identification information corresponding to the interface device is also performed, so as to determine the device node type of the interface device 500.
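The package-name check described above can be sketched as follows; the package names, set name, and function names are hypothetical:

```python
# Hypothetical preset application list: package names of applications
# that need the interface device 500 (e.g. camera) to be called.
APPS_NEEDING_INTERFACE = {"com.example.magicmirror", "com.example.videocall"}

def on_app_start(package_name, launch_app, read_node_info):
    """Launch the application; additionally run device detection (step S2)
    only if its package name is in the preset application list."""
    launch_app(package_name)                  # always start the application
    if package_name in APPS_NEEDING_INTERFACE:
        read_node_info()                      # also read node identification info
        return True
    return False
```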
As can be seen, in this embodiment, the multi-interface device determination may be performed when the user starts an application program, so that the started application can look up the interface device 500 to call in the determined callable device list. This ensures that the interface device 500 is correctly called when the application program runs, thereby alleviating start-up failures of the interface device 500.
In some embodiments, if the detection instruction for detecting the interface device is a power-on start instruction input by a user, the step of obtaining the detection instruction for detecting the interface device further includes:
s131: acquiring the power-on start instruction;
s132: responding to the power-on start instruction, and running an operating system start-up program;
s133: and after the start-up program finishes running, executing the step of reading the node identification information corresponding to the interface equipment.
In practical applications, the power-on start command may be triggered and input by a user through a corresponding key on the control apparatus 100 or the display device 200. For example, the user may control the display device 200 to be powered on by pressing a "power on" key on the control apparatus 100; the user may also press or touch the "power on" key on the display device 200 to control the display device 200 to start up. After obtaining the power-on start command, the controller 250 may run a power-on start program of the operating system corresponding to the display device 200.
After the power-on start-up program finishes running, the controller 250 may detect the interface device 500 by automatically executing the step of reading the node identification information corresponding to the interface device, so as to maintain the callable device list in the display device 200 for calling the interface device 500 after start-up.
It can be seen that, in this embodiment, the interface device 500 is detected once in each boot process. After the display device 200 is powered on, a callable device list is maintained in the display device 200 for the running application programs to use. This detection mode not only keeps the callable device list up to date after start-up, but also reduces how often the list must be maintained, so that maintaining the callable device list does not interfere with the operation and control of the display device 200.
As can be seen from the above technical solutions, in the multi-interface device determining method provided in the embodiments of the present application, the detection instruction may be actively input by a user or generated automatically according to a set rule, so that the display device 200 can maintain the callable device list according to the interface devices 500 connected to the external device interface 240 and use it in different operating states. The detection modes provided in the above embodiments may be wholly or partially configured in the controller of the display device 200 for comprehensive judgment, making the display device 200 suitable for different operating states.
In order to be able to read the identification information of the device node corresponding to each functional module in the interface device 500, in some embodiments, as shown in fig. 7, the step of reading the node identification information corresponding to the interface device further includes:
s21: in response to the detection instruction, sending a read command to each of the interface devices;
s22: and receiving the identification information fed back by the interface equipment in response to the read command.
After acquiring the detection instruction for detecting the interface device, the controller 250 may send a read command to the interface device 500 according to the instruction. After receiving the read command, the interface device 500 may feed its identification information back to the controller 250.
For a standard device, that is, an interface device 500 capable of directly exchanging information with the display device 200, the read command may be a control instruction requesting the device to feed back information. In response, the interface device 500 may feed readable data back to the controller 250, such as device information including its name, number of function modules, and IP address. From this device information, the controller 250 can directly determine the device nodes currently included in the interface device 500 and extract the identification information.
For a non-standard or simple device that cannot directly exchange information with the display device 200, the read command may be a control instruction that directly starts the device. For example, for a non-standard camera device, the controller 250 may send a start instruction that starts the camera device for image acquisition.
After sending the control command, the controller 250 may extract the corresponding node identification information by detecting the signal generated by the interface device 500. For example, after the camera device is started, the controller 250 acquires the signal it generates and detects the type of the acquired signal, so as to determine the node identification information corresponding to the interface device 500. If the signal generated by the interface device 500 includes an audio signal, the corresponding node identification information is determined to belong to the audio input device type; if the signal also includes a video signal, the corresponding node identification information is determined to belong to the video input device type.
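The two read paths — querying a standard device for its device information versus starting a non-standard device and classifying the signals it produces — might be sketched like this, where the dictionary layout and node labels are assumptions:

```python
def read_node_identification(device):
    """Return the list of device-node identifiers for an interface device.

    A 'standard' device answers an information query directly; a
    non-standard device is started so its output signals can be inspected.
    """
    if device.get("standard"):
        # Standard device: it feeds back readable device information.
        info = device["query"]()          # e.g. {"name": ..., "nodes": [...]}
        return info["nodes"]
    # Non-standard device: start it and classify the signals it generates.
    signals = device["start"]()           # e.g. ["video", "audio"]
    nodes = []
    if "audio" in signals:
        nodes.append("audio_input")       # audio signal -> audio input node
    if "video" in signals:
        nodes.append("video_input")       # video signal -> video input node
    return nodes
```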
In order to identify the signal fed back by the interface device 500, in some embodiments, as shown in fig. 7, the identification information includes a signal type of data that can be transmitted by the interface device, and the step of reading the node identification information corresponding to the interface device further includes:
s23: extracting the signal type from the identification information;
s24: if the signal type is the same as the signal type corresponding to the preset equipment type, determining that the identification information belongs to the preset equipment type;
s25: and if the signal type is different from the signal type corresponding to the preset equipment type, determining that the identification information does not belong to the preset equipment type.
After sending the read command to the interface device 500, the controller 250 may determine the preset device type to which the identification information belongs by detecting the signal type fed back by the interface device 500.
For example, the preset device type corresponding to the magic mirror application is a video input device. The controller 250 may detect the signal fed back by the camera device and check whether it contains an image signal, for example by detecting whether a pixel can be read from the fed-back data. If a pixel can be read from the signal fed back by the lens module in the camera device, the node identification information corresponding to that module is determined to belong to the preset device type, so the device node corresponding to the lens module may be marked as a video input device, such as camera01.
Similarly, since no pixel can be read from the signal fed back by the microphone module in the camera device, the identification information corresponding to the microphone module is determined not to belong to the preset device type; the device node corresponding to the microphone module is therefore not marked and keeps its original node name.
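Steps S23 to S25 amount to comparing each node's signal type with the signal type required by the preset device type; a minimal sketch with illustrative field names (`signal_type`, `marked_as` are assumptions):

```python
def classify_nodes(nodes, preset_type):
    """Mark each node whose signal type matches the preset device type;
    leave non-matching nodes with their original names (steps S23-S25)."""
    marked = []
    for node in nodes:
        if node["signal_type"] == preset_type:
            # S24: matches the preset type, e.g. mark the lens module
            marked.append({**node, "marked_as": preset_type})
        else:
            # S25: does not match; keep the original node unmarked
            marked.append(dict(node))
    return marked
```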
Obviously, in practical applications, the multi-interface device determining method provided in this embodiment needs to identify device nodes only when the interface device 500 has multiple functions; for a single-function interface device 500, device node identification is unnecessary. Thus, in some embodiments, as shown in fig. 8, the step of sending a read command to each of the interface devices comprises:
s211: detecting the number of equipment nodes corresponding to the interface equipment;
s212: if the number of the equipment nodes is more than or equal to 2, sending a reading command to each equipment node;
s213: and if the number of the equipment nodes is equal to 1, the step of adding the node names of the interface equipment to the list of the callable equipment is executed.
Before sending the read command to the interface device 500, the controller 250 may determine the number of device nodes included in the interface device 500. This number may be determined by detecting how many types of signals the interface device 500 can output or input. For example, if detecting the signals fed back by the interface device 500 shows that it can feed back both an image signal and an audio signal, the current interface device 500 is determined to include 2 device nodes.
When the number of device nodes is equal to 1, that is, the accessed interface device 500 has only one function, its device node does not need to be marked; after the node is judged to match the preset device type, its name is added directly to the callable device list. When the number of device nodes is greater than or equal to 2, the current interface device 500 is determined to have multiple functions, so the identification information corresponding to each device node may be read by sending a read command to each device node.
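The branching of steps S211 to S213 can be sketched as follows; the function names are assumptions:

```python
def dispatch_reads(device_nodes, read_node, add_to_callable):
    """S211-S213: read each node only for multi-node devices; a single-node
    device goes straight to the callable-device-list step."""
    if len(device_nodes) >= 2:
        # S212: multi-function device -> send a read command per node
        return [read_node(n) for n in device_nodes]
    # S213: single-function device -> add its node name directly
    add_to_callable(device_nodes[0])
    return []
```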
As can be seen from the above embodiments, in the process of detecting the interface device 500, the multi-interface device determining method may determine the device node types of the interface device 500 by reading its information in advance and form a callable device list, so as to filter out some non-standard devices. When the display device 200 calls the interface device 500, it does so directly according to the callable device list, thereby alleviating erroneous calls of the interface device 500.
In some embodiments, as shown in fig. 9, when the interface device 500 needs to be called, the method for determining multiple interface devices further includes:
s4: acquiring a control instruction for starting the interface equipment;
s5: resolving the index value in the control instruction in response to the control instruction;
s6: if the index value is in the list of the callable devices, starting to operate the interface device corresponding to the index value;
s7: and if the index value is not in the callable device list, starting to operate the interface devices in the callable device list, which are the same as the index value in type.
As the user operates, a certain function may require calling the interface device 500, that is, the user inputs a control instruction for starting the interface device 500. For example, after opening the "video call" application, the user selects one or more contacts in the contact interface and presses the "OK" key on the control apparatus 100 to start a video call. At this point the display device 200 needs to start the camera device, so after the controller 250 detects the interaction command corresponding to this interaction, it determines that the user has input a control instruction for starting the interface device 500.
To facilitate the calling process, the control instruction includes an index value of the interface device to be started. The index value may be a flag string indicating which interface device 500 needs to be started; for example, an index value of "camera01" represents that the application needs to call the camera device with sequence number 01. After acquiring the control instruction for starting the interface device 500, the controller 250 may parse it to determine the included index value, and match that value in the callable device list, for example by checking whether the camera index value matches a device node name in the list, thereby determining the interface device 500 to be called.
If the index value is in the callable device list, the interface device 500 corresponding to the index value is started directly; if the index value is not in the list, an interface device in the callable device list of the same type as the index value may be started instead, so that the function can still be implemented. For example, if the upper-layer application requests the rear camera (camera00) when turning on the camera, but only video1 (camera01) exists on the display device 200, the existing device, i.e. the interface device 500 corresponding to camera01, may be started preferentially.
It can be seen that, in this embodiment, the interface device 500 may be called with a cyclic-search fault-tolerant method: when the specified interface device 500 cannot be matched in the callable device list, an existing interface device 500 of the same type in the list is opened preferentially, ensuring that the corresponding function is implemented.
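The fault-tolerant lookup of steps S4 to S7 might look like the sketch below. Treating "same type" as an identical alphabetic prefix (so that "camera00" and "camera01" match) is an assumption; the patent does not fix how types are compared:

```python
def resolve_device(index_value, callable_devices):
    """S6/S7: return the device to start for a given index value, falling
    back to an existing same-type device when the exact index is absent."""
    if index_value in callable_devices:
        return index_value                        # S6: exact match
    # S7: fall back to a device of the same type (same name prefix, an
    # assumption standing in for the patent's unspecified type comparison)
    prefix = index_value.rstrip("0123456789")     # "camera00" -> "camera"
    for name in callable_devices:
        if name.rstrip("0123456789") == prefix:
            return name
    return None                                   # no same-type device exists
```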
In some embodiments, to start and run an interface device 500 of the same type, the step of starting to operate the interface devices in the callable device list of the same type as the index value further includes:
s701: setting a priority for the interface equipment in the callable equipment list according to the control instruction;
s702: and starting the interface equipment with the highest running priority.
In practical applications, priorities may be set for the interface devices 500 in the callable device list according to the control instruction. The priorities may be evaluated comprehensively from information such as device performance, model, and interface mode, so that the interface device 500 called each time best implements the specific function.
Which camera device should be called preferentially differs between applications. For example, suppose multiple camera devices are connected to the external device interface 240, and both a video monitoring application and a video call application are installed on the display device 200. If the control instruction input by the user opens the video monitoring application, the camera device nodes covering the monitored area are set to a higher priority; if the control instruction opens the video call application, the camera device nodes facing the user area are set to a higher priority.
After the priorities are set, if the index value in the control instruction is not in the callable device list, the interface device 500 with the highest priority is started, so that, on the basis of preferentially opening an existing interface device 500, the one that best meets the functional requirements is called.
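Steps S701 and S702 reduce to selecting the highest-priority candidate; in this minimal sketch the priority mapping is assumed to have already been derived from the control instruction (e.g. from device performance, model, and interface mode):

```python
def pick_by_priority(candidates, priority_of):
    """S701/S702: given same-type candidate devices and a priority function
    (higher value wins), start the highest-priority one; None if no candidate."""
    return max(candidates, key=priority_of) if candidates else None
```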
Based on the foregoing multi-interface device determining method, some embodiments of the present application further provide a display device 200, as shown in fig. 10, including: a display 275, an external device interface 240, and a controller 250. The display 275 is configured to display a user interface; the external device interface 240 is configured to connect the interface device 500; and the controller 250 is configured to perform the following program steps:
s1: acquiring a detection instruction for detecting the interface equipment;
s2: responding to the detection instruction, and reading node identification information corresponding to the interface equipment;
s3: and if the node identification information belongs to the preset equipment type, adding the node name of the interface equipment to an invokable equipment list.
As can be seen from the foregoing technical solutions, after the interface device 500 is connected through the external device interface 240, the display device 200 provided in the present application may read the node identification information corresponding to the interface device 500 via the detection instruction and, when the node identification information belongs to the preset device type, add the node name of the interface device 500 to the callable device list, thereby filtering out non-standard interface devices. When the display device 200 calls the interface device 500, it can search cyclically in the callable device list, so that the interface device 500 is accurately identified and called, alleviating start-up failures of the interface device 500.
The embodiments provided in the present application are only a few examples of the general concept of the present application, and do not limit the scope of the present application. Any other embodiments extended according to the scheme of the present application without inventive efforts will be within the scope of protection of the present application for a person skilled in the art.

Claims (10)

1. A display device, comprising:
a display;
an external device interface configured to access the interface apparatus;
a controller configured to:
acquiring the power distribution condition of the external device interface to obtain the connection state of the external device interface;
if the connection state changes, generating a trigger signal for generating a detection instruction;
responding to the detection instruction, and if the interface equipment is detected to be standard equipment, sending a reading instruction for feeding back node identification information to the interface equipment;
acquiring characteristic data fed back by the interface equipment, and extracting node identification information corresponding to the interface equipment;
if the interface equipment is detected to be nonstandard equipment, sending a starting instruction to the interface equipment;
receiving detection data fed back by the interface equipment, and obtaining a data type according to the detection data;
matching the node identification information according to the data type;
if the node identification information belongs to a preset device type, adding the node name of the interface device to a callable device list, wherein the preset device type is determined according to the function to be called of the started application program; the callable device list corresponds to package name information of the started application.
2. The display device according to claim 1, wherein in the step of extracting the node identification information corresponding to the interface device, the controller is further configured to:
in response to the detection instruction, sending a read command to each of the interface devices;
and receiving node identification information fed back by the interface equipment in response to the read command.
3. The display device of claim 2, wherein in the step of sending a read command to each of the interface devices, the controller is further configured to:
detecting the number of equipment nodes corresponding to the interface equipment;
if the number of the equipment nodes is more than or equal to 2, sending a reading command to each equipment node;
and if the number of the equipment nodes is equal to 1, the step of adding the node names of the interface equipment to the list of the callable equipment is executed.
4. The display device according to claim 2, wherein the node identification information includes a signal type of data that can be transmitted by the interface device, and in the step of reading the node identification information corresponding to the interface device, the controller is further configured to:
extracting the signal type from the node identification information;
if the signal type is the same as the signal type corresponding to the preset equipment type, determining that the node identification information belongs to the preset equipment type;
and if the signal type is different from the signal type corresponding to the preset equipment type, determining that the node identification information does not belong to the preset equipment type.
5. The display apparatus according to claim 1, wherein the detection instruction for detecting the interface apparatus is a trigger signal generated by the interface apparatus when the interface apparatus is connected to the external device interface, and the controller is further configured to:
acquiring a trigger signal generated when the interface equipment is accessed to the interface of the external device;
responding to the trigger signal, executing software configuration on the interface equipment, and executing the step of reading the node identification information corresponding to the interface equipment.
6. The display device according to claim 1, wherein the detection instruction for detecting the interface device is an application start instruction input by a user, and the controller is further configured to:
acquiring the application program starting instruction;
and starting and operating the application program corresponding to the application program starting instruction, and executing the step of reading the node identification information corresponding to the interface equipment.
7. The display device according to claim 1, wherein the detection instruction for detecting the interface device is a power-on start instruction input by a user, and the controller is further configured to:
acquiring the power-on start instruction;
responding to the power-on start instruction, and running an operating system start-up program;
and after the start-up program finishes running, executing the step of reading the node identification information corresponding to the interface equipment.
8. The display device of claim 1, wherein the controller is further configured to:
acquiring a control instruction for starting the interface equipment, wherein the control instruction comprises an index value of the interface equipment to be started;
resolving the index value in the control instruction in response to the control instruction;
if the index value is in the list of the callable devices, starting to operate the interface device corresponding to the index value;
and if the index value is not in the callable device list, starting to operate the interface devices in the callable device list, which are the same as the index value in type.
9. The display device according to claim 8, wherein in the step of starting to run the interface device of the same type as the index value in the callable device list, the controller is further configured to:
setting a priority for the interface equipment in the callable equipment list according to the control instruction;
and starting the interface equipment with the highest running priority.
10. A multi-interface device judgment method is characterized by comprising the following steps:
acquiring the power distribution condition of an external device interface to obtain the connection state of the external device interface;
if the connection state changes, generating a trigger signal for generating a detection instruction;
responding to the detection instruction, and if the interface equipment is detected to be standard equipment, sending a reading instruction for feeding back node identification information to the interface equipment;
acquiring characteristic data fed back by the interface equipment, and extracting node identification information corresponding to the interface equipment;
if the interface equipment is detected to be nonstandard equipment, sending a starting instruction to the interface equipment;
receiving detection data fed back by the interface equipment, and obtaining a data type according to the detection data;
matching the node identification information according to the data type;
if the node identification information belongs to a preset device type, adding the node name of the interface device to a callable device list, wherein the preset device type is determined according to the function to be called of the started application program; the callable device list corresponds to package name information of the started application.
Publications (2)

Publication Number Publication Date
CN111918132A CN111918132A (en) 2020-11-10
CN111918132B true CN111918132B (en) 2022-09-23

Family

ID=73280196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010731416.6A Active CN111918132B (en) 2020-07-27 2020-07-27 Display device and multi-interface device judgment method

Country Status (1)

Country Link
CN (1) CN111918132B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114302193B (en) * 2021-01-14 2022-09-30 海信视像科技股份有限公司 Display device and protocol detection method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101900319B1 (en) * 2012-02-07 2018-09-19 삼성전자 주식회사 Method for interoperably performing service and system supporting the same
CN102819500B (en) * 2012-07-20 2016-01-20 腾讯科技(深圳)有限公司 A kind of method and device creating peripheral equipment control interface
CN105449402B (en) * 2014-08-20 2019-01-15 联想(北京)有限公司 A kind of connector and its application method, the electronic equipment for being provided with the connector
CN104216840B (en) * 2014-09-11 2018-03-23 青岛海信移动通信技术股份有限公司 The method and device that a kind of USB sets and operated to external equipment
JP6358063B2 (en) * 2014-12-02 2018-07-18 富士通株式会社 Request transmission method, information processing apparatus, and program
CN109710298A (en) * 2018-08-20 2019-05-03 平安普惠企业管理有限公司 Interface managerial method, interface management apparatus, interface management equipment and storage medium

Also Published As

Publication number Publication date
CN111918132A (en) 2020-11-10

Similar Documents

Publication Publication Date Title
CN111752518A (en) Screen projection method of display device and display device
CN112019782B (en) Control method for enhanced audio return channel and display device
CN112135180B (en) Content display method and display device
CN111836115B (en) Screen saver display method, screen saver skipping method and display device
CN112118400B (en) Image display method on a display device, and display device
CN111970549A (en) Menu display method and display device
CN112565862A (en) Display device and method for memorizing and restoring device parameters thereof
CN111954059A (en) Screen saver display method and display device
CN112199064A (en) Interaction method of browser application and system platform, and display device
CN111866498B (en) Camera abnormality processing method and display device
CN112328553A (en) Thumbnail capturing method and display device
CN112040340A (en) Resource file acquisition method and display device
CN112269668A (en) Application resource sharing and display device
CN111984167A (en) Rapid naming method and display device
CN112040535A (en) Wifi processing method and display device
CN111918132B (en) Display device and multi-interface device judgment method
CN113810747B (en) Display device and signal source setting interface interaction method
CN111988646B (en) User interface display method for an application program and display device
CN112118476B (en) Method for rapidly displaying program reservation icon and display device
CN114079827A (en) Menu display method and display device
CN114390190A (en) Display device and method for monitoring an application starting the camera
CN111931692A (en) Display device and image recognition method
CN111918056A (en) Camera state detection method and display device
CN111935519B (en) Channel switching method and display device
CN113438553B (en) Display device awakening method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant