CN111787364B - Media data acquisition method, smart television and mobile terminal - Google Patents

Media data acquisition method, smart television and mobile terminal Download PDF

Info

Publication number
CN111787364B
CN111787364B (granted publication of application CN202010669607.4A; published as CN111787364A)
Authority
CN
China
Prior art keywords
server
media identifier
resource information
media data
media
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010669607.4A
Other languages
Chinese (zh)
Other versions
CN111787364A (en
Inventor
逯林虎
王金童
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Juhaokan Technology Co Ltd
Original Assignee
Juhaokan Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Juhaokan Technology Co Ltd filed Critical Juhaokan Technology Co Ltd
Priority to CN202010669607.4A priority Critical patent/CN111787364B/en
Publication of CN111787364A publication Critical patent/CN111787364A/en
Application granted granted Critical
Publication of CN111787364B publication Critical patent/CN111787364B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4722End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/858Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
    • H04N21/8586Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot by using a URL

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The embodiment of the application discloses a media data acquisition method, an intelligent television and a mobile terminal. In the embodiment of the application, after the smart television uploads the target media data to the server, the server returns, to the smart television, resource information for acquiring the target media data, and the smart television then displays the resource information. Therefore, the mobile terminal can acquire the corresponding media data from the server by acquiring the resource information, so that the media data is transferred from the smart television to the mobile terminal.

Description

Media data acquisition method, smart television and mobile terminal
Technical Field
The embodiment of the application relates to the technical field of media, in particular to a media data acquisition method, an intelligent television and a mobile terminal.
Background
Currently, smart televisions are capable of locally generating media data such as pictures or videos. For example, when a smart television plays a video, the video picture being played can be captured and stored. Alternatively, a smart television equipped with a camera can perform image acquisition through that camera. For media data generated locally on the smart television, how to transfer the media data to a mobile terminal for viewing by the user remains a problem to be solved.
Disclosure of Invention
The embodiment of the application provides a media data acquisition method, an intelligent television and a mobile terminal, which can transfer media data stored locally on the intelligent television to the mobile terminal. The technical scheme is as follows:
in one aspect, a smart television is provided, which includes a controller and a display;
the controller is used for uploading target media data to the server; receiving resource information of the target media data returned by the server, wherein the resource information is used for a mobile terminal to obtain the target media data;
the controller is further configured to control the display to display the resource information, so that the mobile terminal obtains the target media data according to the resource information.
In another aspect, a media data acquisition method is provided, which is applied to a smart television, and the method includes:
uploading the target media data to a server;
receiving resource information of the target media data returned by the server, wherein the resource information is used for a mobile terminal to obtain the target media data;
and displaying the resource information so that the mobile terminal acquires the target media data according to the resource information.
In another aspect, a mobile terminal is provided, the mobile terminal comprising a controller and a display;
the controller is used for acquiring resource information of target media data displayed by the intelligent television, wherein the resource information is returned by the server after the intelligent television uploads the target media data to the server; acquiring target media data from the server according to the resource information;
the display is used for displaying a user interface and acquiring information.
In another aspect, a media data acquiring method is provided, which is applied to a mobile terminal, and includes:
acquiring resource information of target media data displayed by an intelligent television, wherein the resource information is returned by a server after the intelligent television uploads the target media data to the server;
and acquiring target media data from the server according to the resource information.
In another aspect, a media data acquisition method applied to an applet server is provided, the method including:
receiving a data acquisition request containing a second media identifier, wherein the data acquisition request is sent by the mobile terminal after scanning a graphic code generated by the smart television according to the received second media identifier;
determining a first media identifier according to the second media identifier and a pre-stored mapping relationship, wherein the first media identifier is issued to the smart television by the first server after receiving the target media data uploaded by the smart television, and the pre-stored mapping relationship is stored when the applet server receives the first media identifier uploaded by the smart television and generates the second media identifier according to the first media identifier;
accessing the first server according to the first media identifier so that the first server feeds back an access address of the target media data according to the first media identifier and an address mapping relation, wherein the address mapping relation is the mapping relation between the first media identifier and the access address stored when the first server generates the first media identifier according to the target media data uploaded by the smart television;
and receiving the access address, and sending the access address to the mobile terminal, so that the mobile terminal accesses the first server to acquire the target media data according to the access address.
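To make the interaction concrete, the following is a minimal Python sketch of the applet-server flow described in this aspect, assuming in-memory dictionaries in place of the pre-stored mapping and the first server's address mapping relation; all function names, identifiers, and addresses below are hypothetical illustrations rather than anything defined by this application.

    # Minimal sketch of the applet-server flow described above.
    # The dictionaries stand in for the pre-stored mapping and for the first
    # server's address mapping relation; every name and value is illustrative.

    # Stored when the applet server generated the second media identifier.
    second_to_first = {"44": "media-8f3a1c"}  # second media id -> first media id

    # Held by the first server: first media identifier -> access address.
    first_server_addresses = {
        "media-8f3a1c": "https://first-server.example/media/8f3a1c",
    }

    def query_first_server(first_media_id: str) -> str:
        """Stand-in for accessing the first server with the first media identifier."""
        return first_server_addresses[first_media_id]

    def handle_data_acquisition_request(second_media_id: str) -> str:
        """Handle the request the mobile terminal sends after scanning the graphic code."""
        first_media_id = second_to_first[second_media_id]    # look up pre-stored mapping
        access_address = query_first_server(first_media_id)  # first server feeds back the address
        return access_address                                # sent back to the mobile terminal

    if __name__ == "__main__":
        print(handle_data_acquisition_request("44"))

The mobile terminal then uses the returned access address to fetch the target media data from the first server.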
In another aspect, a computer-readable storage medium is provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the media data acquisition method described above.
In another aspect, a computer program product comprising instructions is provided, which, when run on a computer, causes the computer to perform the steps of the media data acquisition method described above.
The technical scheme provided by the embodiment of the application can at least bring the following beneficial effects:
in the embodiment of the application, after the smart television uploads the target media data to the server, the server returns, to the smart television, resource information for acquiring the target media data, and the smart television then displays the resource information. Therefore, the mobile terminal can acquire the corresponding media data from the server by acquiring the resource information, so that the media data is transferred from the smart television to the mobile terminal.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1 is an implementation environment diagram of a media data acquisition method according to an embodiment of the present application;
fig. 2 is a schematic diagram illustrating an operational scenario between a smart tv and a control device according to an exemplary embodiment;
fig. 3 is a block diagram illustrating a hardware configuration of a smart tv according to an exemplary embodiment;
fig. 4 is a block diagram illustrating a configuration of a control apparatus according to an exemplary embodiment;
fig. 5 is a schematic diagram illustrating a functional configuration of an intelligent tv according to an exemplary embodiment;
fig. 6 is a block diagram illustrating a configuration of a software system in a smart tv according to an exemplary embodiment;
fig. 7 is a block diagram illustrating a configuration of an application in a smart tv according to an exemplary embodiment;
FIG. 8 is a flow chart illustrating a method of media data acquisition according to an exemplary embodiment;
FIG. 9 is a flow chart illustrating another method of media data acquisition in accordance with an exemplary embodiment;
fig. 10 is a block diagram of a mobile terminal according to an example embodiment.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments, which can be obtained by a person skilled in the art without creative effort based on the exemplary embodiments shown in the embodiments of the present application, belong to the protection scope of the embodiments of the present application. In addition, while the disclosure in the embodiments of the present application has been presented in terms of an exemplary embodiment or embodiments, it should be appreciated that individual aspects of the disclosure may each constitute a complete technical solution on their own.
It should be understood that the terms "first," "second," "third," and the like in the description, in the claims of the embodiments of the application, and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that terms so used are interchangeable under appropriate circumstances, so that, for example, the embodiments of the application can be implemented in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" as used in the embodiments of the present application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote controller" used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the embodiments of the present application), which can be controlled wirelessly, typically in a short distance range. The touch screen remote control device is generally connected with an electronic device by using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB (Universal Serial Bus), bluetooth, and a motion sensor.
The term "gesture" used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
Before explaining the embodiments of the present application in detail, an application scenario of the embodiments of the present application will be described.
At present, the functions of the smart television are more and more abundant. For example, when a video is played, a user triggers a video capture function provided by the smart television, so that the smart television can capture a video picture currently being played for storage. Or, through the video capturing function, the smart television can capture the video clip for storage. For another example, some smart televisions are equipped with a camera, and the smart televisions can acquire user images or other environment images through the equipped camera so as to store the user images or other environment images. For the media data such as pictures and videos generated and stored locally in the smart television, the user may want to transfer the media data to the mobile terminal of the user for viewing or share the media data with other users. Under the circumstance, the intelligent television and the mobile terminal can realize the transfer of the media data from the intelligent television to the mobile terminal through the media data acquisition method provided by the embodiment of the application.
Next, an implementation environment related to the media data acquisition method provided by the embodiment of the present application is described.
Fig. 1 is an environment diagram for implementing a media data acquisition method according to an embodiment of the present disclosure. As shown in fig. 1, the implementation environment includes a server 100, a smart tv 200, and a mobile terminal 300. Wherein, both the smart tv 200 and the mobile terminal 300 can communicate with the server 100.
It should be noted that, in one possible implementation manner, referring to fig. 1, the server 100 includes a first server 101 and a second server 102. The first server 101 is an application server of a first application installed on the smart television 200. The second server 102 is a designated applet server or an application server of the second application installed on the mobile terminal 300.
In this implementation, the smart tv 200 can communicate with the first server 101 and the second server 102, the first server 101 and the second server 102 can communicate, and the second server 102 can communicate with the mobile terminal 300.
The smart television 200 uploads media data to the first server 101, the first server 101 stores the media data uploaded by the smart television 200, generates a corresponding first media identifier for the media data, and returns the first media identifier to the smart television 200. The smart tv 200 sends the first media identification to the second server 102. The second server 102 generates resource information of the media data according to the first media identifier, and then feeds the resource information back to the smart television 200. After receiving the resource information, the smart tv 200 displays the resource information. The mobile terminal 300 acquires the resource information displayed by the smart television 200, and acquires corresponding media data from the first server 101 through the second server 102 according to the resource information, so that the media data is transferred from the smart television 200 to the mobile terminal 300.
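Expressed as code, the sequence above can be traced with a few stub functions; this is only a sketch of the message flow under the first implementation, and every function name and payload below is invented for illustration.

    # Illustrative trace of the message sequence in the first implementation.
    # All functions are stubs; names and payloads are invented for clarity.

    def first_server_store(media_bytes: bytes) -> str:
        return "media-8f3a1c"  # first media identifier generated by the first server

    def second_server_make_resource_info(first_media_id: str) -> str:
        return "https://second-server.example/?ids=44"  # resource information (e.g. a URL)

    def mobile_terminal_fetch(resource_info: str) -> bytes:
        return b"<target media data>"  # retrieved from the first server via the second server

    # Smart television side
    first_id = first_server_store(b"<screenshot bytes>")        # upload media data
    resource_info = second_server_make_resource_info(first_id)  # forward id, receive resource info
    print("displayed on the smart television:", resource_info)  # display the resource information

    # Mobile terminal side
    media = mobile_terminal_fetch(resource_info)                # acquire the target media data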
Optionally, in another possible implementation manner, the server 100 is an application server corresponding to a specified application installed on the smart tv 200 and the mobile terminal 300, and the smart tv 200 and the mobile terminal 300 can communicate through the application server. In this case, the smart tv 200 uploads the media data to be transferred to the server 100, the server 100 stores the media data, generates resource information of the media data, and returns the resource information to the smart tv 200. After receiving the resource information, the smart television 200 displays the resource information. Then, the mobile terminal 300 acquires the resource information, and acquires corresponding media data from the server 100 according to the resource information.
It should be noted that the server in the foregoing various implementation manners may be a single server or a server cluster, which is not limited in this embodiment of the present application. The mobile terminal 300 is a mobile device such as a smart phone, a tablet computer, and the like, which is not limited in this embodiment.
Fig. 2 is a schematic diagram illustrating an operation scenario between a smart tv and a control device according to an exemplary embodiment. As shown in fig. 2, the user may operate the smart tv 200 through the mobile terminal 300 and the control device 400.
The control device 400 may be a remote controller, which includes infrared protocol communication or bluetooth protocol communication, and other short-distance communication methods, and controls the smart tv 200 in a wireless or other wired manner. The user may input a user command through a button on the remote controller, a voice input, a control panel input, etc. to control the smart tv 200. Such as: the user can input a corresponding control instruction through a volume up-down key, a channel control key, an up/down/left/right moving key, a voice input key, a menu key, a power on/off key and the like on the remote controller, so as to realize the function of controlling the smart television 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the smart tv 200. For example, the smart tv 200 is controlled using an application running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 300 may install a software application with the smart tv 200, implement connection communication through a network communication protocol, and implement the purpose of one-to-one control operation and data communication. Such as: the control instruction protocol can be established between the mobile terminal 300 and the intelligent television 200, the remote control keyboard is synchronized to the mobile terminal 300, and the function of controlling the intelligent television 200 is realized by controlling the user interface on the mobile terminal 300. The audio and video content displayed on the mobile terminal 300 can also be transmitted to the smart television 200, so that the synchronous display function is realized.
As also shown in fig. 2, the smart tv 200 is also in data communication, through various communication methods, with content servers other than the server shown in fig. 1. The smart tv 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. These content servers may provide various content and interactions to the smart tv 200. Illustratively, by sending and receiving information, as well as Electronic Program Guide (EPG) interactions, the smart tv 200 receives software program updates or accesses a remotely stored digital media library. The servers may be one group or multiple groups, and may be of one or more types. Other web service content such as video on demand and advertisement services is provided through these servers.
The smart tv 200 may include a liquid crystal display, an OLED (Organic Light-Emitting Diode) display, or a projection display. The specific smart tv type, size and resolution are not limited, and those skilled in the art will appreciate that the smart tv 200 may be modified in performance and configuration as desired.
The smart tv 200 may additionally provide an intelligent network tv function that provides computer support functions in addition to the broadcast receiving tv function. Illustratively, the smart tv 200 may also provide functions such as web tv, smart tv, Internet Protocol Television (IPTV), and the like.
A block diagram of a hardware configuration of the smart tv 200 according to an exemplary embodiment is exemplarily shown in fig. 3.
In some embodiments, at least one of the controller 250, the tuner demodulator 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the smart tv 200.
In some embodiments, a display 275 receives image signals originating from the first processor output and displays video content and images and components of the menu manipulation interface.
In some embodiments, the display 275, includes a display screen assembly for presenting a picture, and a driving assembly that drives the display of an image.
In some embodiments, the video content is displayed from broadcast television content, or alternatively, from various broadcast signals that may be received via wired or wireless communication protocols. Alternatively, various image contents received from the network communication protocol and sent from the network server side can be displayed.
In some embodiments, the display 275 is used to present a user manipulation UI interface generated in the smart tv 200 and used to control the smart tv 200.
In some embodiments, a driver assembly for driving the display is also included, depending on the type of display 275.
In some embodiments, display 275 is a projection display and may also include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
In some embodiments, the smart tv 200 may establish transmission and reception of control signals and data signals with the external control device 400 or a content providing device through the communicator 220.
In some embodiments, the user interface 265 may be configured to receive infrared control signals from the control device 400 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a smart tv 200 for collecting signals of an external environment or interacting with the outside.
In some embodiments, the detector 230 includes a light receiver, i.e., a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively changed according to the collected ambient light, and the like.
In some embodiments, the detector 230 may further include an image collector, such as a camera, which may be configured to collect external environment scenes and to collect attributes of the user or gestures used for interaction with the user, so as to adaptively change display parameters and recognize user gestures, thereby implementing interaction with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the smart tv 200 may adaptively adjust the display color temperature of the image. For example, when the temperature is higher, the color temperature of the image displayed by the smart tv 200 can be adjusted to be a cool tone, or when the temperature is lower, the image displayed by the smart tv 200 can be adjusted to be a warm tone.
In some embodiments, the detector 230 may also be a sound collector or the like, such as a microphone, which may be used to receive the user's voice. Illustratively, a voice signal including a control instruction of the user to control the smart tv 200, or an ambient sound is collected for identifying the ambient scene type, so that the smart tv 200 can adapt to the ambient noise.
In some embodiments, as shown in fig. 3, the input/output interface 255 is configured to allow data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, or command instruction data, etc.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: the interface can be any one or more of a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port and the like. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 3, the tuning demodulator 210 is configured to receive a broadcast television signal through a wired or wireless receiving manner, perform modulation and demodulation processing such as amplification, mixing, resonance, and the like, and demodulate an audio and video signal from a plurality of wireless or wired broadcast television signals, where the audio and video signal may include a television audio and video signal carried in a television channel frequency selected by a user and an EPG data signal.
In some embodiments, the frequency points demodulated by the tuner demodulator 210 are controlled by the controller 250, and the controller 250 can send out control signals according to user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to the broadcasting system of the television signal. Or may be classified into a digital modulation signal, an analog modulation signal, and the like according to a modulation type. Or the signals are classified into digital signals, analog signals, and the like according to the type of the signals.
In some embodiments, the controller 250 and the modem 210 may be located in different separate devices, that is, the modem 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box. Therefore, the set top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
In some embodiments, controller 250 controls the operation of the smart television and responds to user actions through various software control programs stored in memory. The controller 250 may control the overall operation of the smart tv 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input devices (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the smart tv 200 or a voice command corresponding to a voice spoken by the user.
As shown in fig. 3, the controller 250 includes at least one of a Random Access Memory 251 (RAM), a Read-Only Memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a Graphics Processing Unit (GPU)), a Central Processing Unit 254 (CPU), a Communication Interface, and a Communication Bus 256 (Bus), which connects the respective components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other running programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, the ROM 252 is used to store a Basic Input Output System (BIOS). The system is used for completing power-on self-test of the system, initialization of each functional module in the system, a driver of basic input/output of the system and booting an operating system.
In some embodiments, when the power-on signal is received, the smart tv 200 starts up, the CPU executes the system start instruction in the ROM 252, and copies the temporary data of the operating system stored in the memory into the RAM 251 so as to start or run the operating system. After the operating system has started, the CPU copies the temporary data of the various application programs in the memory to the RAM 251, and the various application programs are then started or run.
In some embodiments, CPU processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include a main processor and one or more sub-processors. The main processor is used to perform some operations of the smart tv 200 in the pre-power-up mode and/or to display a picture in the normal mode. The one or more sub-processors are used for operations in a standby mode or the like.
In some embodiments, the graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, graphics for displaying user input instructions, and the like. The graphics processor comprises an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit for display on the display.
In some embodiments, the video processor 270 is configured to receive an external video signal, and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, and the like according to a standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the smart tv 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio and video data stream; for example, for an input MPEG-2 stream, the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
The image synthesis module is used for superimposing and mixing the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the input video frame rate, for example from 60Hz to 120Hz or 240Hz, typically by means of frame interpolation.
The display format module is used for converting the received video output signal after the frame rate conversion, and changing the signal to conform to the signal of the display format, such as outputting an RGB data signal.
In some embodiments, the graphics processor 253 and the video processor may be integrated or configured separately. When integrated, they jointly process the graphics signals output to the display; when configured separately, they perform different functions, for example in a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification processes to obtain an audio signal that can be played in a speaker.
In some embodiments, video processor 270 may comprise one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280. The audio output includes the speaker 286 carried by the smart tv 200 itself, an external sound output terminal that can output sound to an external device (such as an external sound interface or an earphone interface), and may also include a near field communication module in the communication interface, for example a Bluetooth module for outputting sound to a Bluetooth loudspeaker.
The power supply 290 provides power supply support for the smart tv 200 with power input from an external power source under the control of the controller 250. The power supply 290 may include a built-in power supply circuit installed inside the smart tv 200, or may be a power supply interface installed outside the smart tv 200 to provide an external power supply in the smart tv 200.
A user interface 265 for receiving an input signal of a user and then transmitting the received user input signal to the controller 250. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
In some embodiments, the user inputs a user command through the control device 400 or the mobile terminal 300; the user input interface receives the user input, and the smart tv 200 responds to it through the controller 250.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes various software modules for driving the smart tv 200. Such as: various software modules stored in the first memory, including: at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The basic module is a bottom-layer software module used for signal communication among the hardware components in the smart television 200 and for sending processing and control signals to the upper-layer modules. The detection module is used for collecting various information from sensors or user input interfaces, and for performing digital-to-analog conversion and analysis management on the collected information.
For example, a voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is used for controlling the display to present image content, and can be used to play multimedia image content, UI interfaces and other information. The communication module is used for control and data communication with external devices. The browser module is used for performing data communication with servers to be browsed. The service module is used for providing various services and various application programs. Meanwhile, the memory 260 may also store visual effect maps for receiving external data and user data, images of items in various user interfaces, focus objects, and the like.
Referring to fig. 4, fig. 4 is a block diagram illustrating a configuration of a control apparatus according to an exemplary embodiment. The control device 400 includes a controller 410, a communication interface 430, a user input/output interface 440, a memory 490, and a power supply 480.
The control device 400 is configured to control the smart tv 200, and may receive an input operation instruction of a user, and convert the operation instruction into an instruction recognizable and responsive by the smart tv 200, and function as an interaction intermediary between the user and the smart tv 200. Such as: the user responds to the channel add/subtract operation by operating the channel add/subtract key on the control device 400.
In some embodiments, the control device 400 may be a smart device. Such as: the control apparatus 400 may install various applications that control the smart tv 200 according to user requirements.
In some embodiments, as shown in fig. 2, the mobile terminal 300 or other intelligent electronic device may function similar to the control device 400 after installing an application for operating the smart tv 200. Such as: a user may implement the functions of controlling the physical keys of the device 400 by installing applications, various function keys or virtual buttons of a graphical user interface available on the mobile terminal 300 or other intelligent electronic device.
The controller 410 includes a processor 412 and RAM413 and ROM414, a communication interface 418, and a communication bus. The controller 410 is used to control the operation of the control device 400, as well as the internal components for communication and coordination and external and internal data processing functions.
The communication interface 430 enables communication of control signals and data signals with the smart tv 200 under the control of the controller 410. Such as: and sending the received user input signal to the smart television 200. The communication interface 430 may include at least one of a WiFi chip 431, a bluetooth module 432, an NFC module 433, and other near field communication modules.
User input/output interface 440, wherein the input interface includes at least one of microphone 441, touchpad 442, sensors 443, keys 444, among other input interfaces. Such as: the user can realize the user instruction input function through actions such as voice, touch, gestures, pressing and the like, and the input interface converts the received analog signals into digital signals and converts the digital signals into corresponding instruction signals to be sent to the smart television 200.
The output interface includes an interface that transmits the received user instruction to the smart tv 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example: when the infrared signal interface is used, a user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the smart television 200 through the infrared sending module. As another example: when the radio frequency signal interface is used, a user input instruction needs to be converted into a digital signal, which is then modulated according to a radio frequency control signal modulation protocol and sent to the smart television 200 through a radio frequency sending terminal.
In some embodiments, the control device 400 includes at least one of a communication interface 430 and an output interface. The control device 400 is configured with a communication interface 430, such as: a WiFi module, a bluetooth module, an NFC (Near Field Communication) module, and the like, which may encode a user input command through a WiFi protocol, a bluetooth protocol, or an NFC protocol, and send the user input command to the smart tv 200.
And a memory 490 for storing various operation programs, data, and applications for driving and controlling the smart tv 200 under the control of the controller 410. The memory 490 may store various control signal commands input by a user.
The power supply 480, which is used to provide operational power support for the various elements of the control device 400 under the control of the controller 410, may include a battery and associated control circuitry.
Referring to fig. 5, fig. 5 is a schematic diagram illustrating a functional configuration of an intelligent television according to an exemplary embodiment.
As shown in fig. 5, the memory 290 is used to store an operating system, applications, contents, user data, and the like, and performs various operations of driving the system operation of the smart tv 200 and responding to the user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the smart tv 200, and to store various applications built in the smart tv 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used for storing System software such as an OS (Operating System) kernel, middleware, and applications, and storing input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as an audio/video processor, a display, a communication interface, a detector input/output interface, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an Application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
Referring to fig. 6, fig. 6 is a block diagram illustrating a configuration of a software system in a smart tv according to an exemplary embodiment.
As shown in fig. 6, an operating system 2911, including executing operating software for handling various basic system services and for performing hardware related tasks, acts as an intermediary for data processing performed between applications 2912 and the hardware components. In some embodiments, part of the operating system kernel may contain a series of software to manage the smart tv hardware resources and provide services for other programs or software code.
In other embodiments, a portion of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the smart television. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application 2912, to achieve accessibility of the application 2912 and operability of the content displayed thereon.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controllable process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911, within the application programs 2912, or in both at the same time. It is adapted to listen for various user input events and, depending on the event, to invoke handlers that implement one or more predefined sets of operations in response to the recognition of various types of events or sub-events.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-2 is configured to input definitions of various types of events for various user input interfaces, identify various events or sub-events, and transmit them to the process that executes the one or more corresponding sets of handlers.
The event or sub-event refers to an input detected by one or more sensors in the smart tv 200 or an input from an external control device (e.g., the control device 400), such as: sub-events input through voice, gestures input through gesture recognition, sub-events input through remote control key commands of the control device, and the like. Illustratively, the one or more sub-events from the remote control include a variety of forms, including but not limited to one or a combination of up/down/left/right key presses, the OK key, key holds, and the like, as well as non-physical key operations such as move, hold, and release.
The interface layout manager 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, and other various execution operations related to the layout of the interface.
Referring to fig. 7, fig. 7 is a block diagram illustrating a configuration of an application in a smart tv according to an exemplary embodiment.
As shown in fig. 7, the application layer 2912 includes various applications that can be executed in the smart tv 200. Applications may include, but are not limited to, one or more applications such as: live television applications, video-on-demand applications, media center applications, application program centers, gaming applications, and the like.
The live television application can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live tv application may display a video of the live tv signal on the smart tv 200.
And the video-on-demand application can provide videos from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from a server side of cloud storage, from a local hard disk storage containing stored video programs.
The media center application can provide various applications for playing multimedia contents. For example, a media center, which may be other than live television or video on demand, may provide services that a user may access to various images or audio through a media center application.
The application program center can provide and store various applications. The application may be a game, an application, or some other application associated with a computer system or other device that may be run in the smart television. The application center may obtain these applications from different sources, store them in local storage, and then be run on the smart tv 200.
The following explains the media data acquisition method provided in the embodiments of the present application in detail.
Fig. 8 is a flowchart of a media data acquisition method according to an embodiment of the present application, where the method is applied to the foregoing implementation environment shown in fig. 1. Referring to fig. 8, the method includes the following steps:
step 801: and the intelligent television uploads the target media data to the server.
In the embodiment of the application, the smart television is provided with the first application, and the user selects the target media data to be transferred on the smart television through the control device or other modes. And when the intelligent television detects the selection operation of the user, the target media data is uploaded to the server side through the designated application according to the selection operation of the user.
In one possible implementation, the server includes a first server and a second server. The first server is an application server corresponding to the first application. The second server is a designated applet server or a server of a second application different from the first application, and the second application is installed on both the smart television and the mobile terminal. In this case, the smart tv uploads the target media data to the first server, which stores the target media data.
Optionally, in another possible implementation manner, the server is an application server of the first application, in which case, the mobile terminal is also installed with the first application. Accordingly, the smart television uploads the target media data to the application server, and the application server stores the target media data.
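As a rough illustration of step 801, the sketch below uploads a locally stored media file to a hypothetical first-server endpoint using the Python requests library; the endpoint URL, form field, and response format are assumptions made for the example and are not specified by this application.

    # Hypothetical sketch of the smart television uploading target media data.
    # Endpoint URL, form field, and response field are assumed for illustration.
    import requests

    def upload_target_media(path: str) -> str:
        with open(path, "rb") as f:
            resp = requests.post(
                "https://first-server.example/media/upload",  # assumed endpoint
                files={"media": f},
            )
        resp.raise_for_status()
        # The first media identifier returned here is used in step 802.
        return resp.json()["firstMediaId"]  # assumed response field

    # first_media_id = upload_target_media("/storage/screenshots/shot_001.png")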
Step 802: and the server side sends the resource information of the target media data to the intelligent television.
When the server comprises a first server and a second server, after receiving the target media data sent by the intelligent television, the first server generates a first media identifier for the target media data, wherein the first media identifier is used for uniquely identifying the target media data in the first server. And then, the first server returns the first media identifier to the intelligent television. Accordingly, the smart television receives the first media identification. Then, the intelligent television sends the first media identification to the second server. If the second server is the applet server, the intelligent television calls an interface of the applet server and sends the first media identifier to the applet server. And if the second server is not an applet server but an application server of the second application, the smart television starts the second application and sends the first media identification to the second server through the second application.
In addition, after generating the first media identifier, the first server stores the first media identifier and the access address of the target media data correspondingly in an address mapping relationship. The access address of the target media data represents the storage position of the target media data in the first server. Different media data and their corresponding access addresses are stored in the address mapping relationship.
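For illustration only, the following is a minimal Python sketch of how a first server of this kind might generate a first media identifier and record the address mapping relationship when media data is uploaded; the names (upload_media, ADDRESS_MAP) and the file-based storage are hypothetical and are not taken from the embodiment.

    import os
    import uuid

    # Hypothetical in-memory address mapping relationship:
    # first media identifier -> access address (storage position on the first server).
    ADDRESS_MAP: dict[str, str] = {}

    def upload_media(media_bytes: bytes, storage_dir: str = "media_store") -> str:
        """Store uploaded target media data and return its first media identifier."""
        os.makedirs(storage_dir, exist_ok=True)
        first_media_id = uuid.uuid4().hex                 # uniquely identifies the data
        access_address = os.path.join(storage_dir, first_media_id + ".bin")
        with open(access_address, "wb") as f:
            f.write(media_bytes)                          # persist the target media data
        ADDRESS_MAP[first_media_id] = access_address      # address mapping relationship
        return first_media_id                             # returned to the smart television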
After the second server receives the first media identifier, considering that the first media identifier is generated by a third party and its length cannot be controlled, the second server generates a second media identifier according to the first media identifier and then stores the first media identifier and the second media identifier correspondingly in a database. For convenience of later description, the correspondingly stored first media identifier and second media identifier are referred to as the pre-stored mapping relationship.
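A minimal sketch, under the same hypothetical assumptions, of how the second server might derive a short, length-controlled second media identifier and keep the pre-stored mapping relationship; the counter-based identifier scheme is only one possible choice.

    import itertools

    # Hypothetical pre-stored mapping: second media identifier -> first media identifier.
    PRE_STORED_MAPPING: dict[str, str] = {}
    _id_counter = itertools.count(1)   # short, length-controlled identifiers, e.g. "44"

    def register_first_media_id(first_media_id: str) -> str:
        """Create a second media identifier for a third-party first media identifier."""
        second_media_id = str(next(_id_counter))
        PRE_STORED_MAPPING[second_media_id] = first_media_id   # stored correspondingly
        return second_media_id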
After generating the second media identifier, in some possible embodiments, the second server generates a URL (Uniform Resource Locator) from the second media identifier, where the URL includes the domain name of the second server and the second media identifier but does not include the access address of the target media data. Thereafter, the second server may use the URL as the resource information of the target media data, or generate, from the URL, a graphic code such as a two-dimensional code, a barcode, or a graphic code carrying color information, and use the generated graphic code as the resource information of the target media data.
Illustratively, assume that the URL generated by the second server is https://mini-mobile.hismarttv.com/minipp/jhk/?deviceId=86100300000100100000071218006841&type=6&ids=44,52, where mini-mobile.hismarttv.com is the domain name of the second server and ids=44 is the second media identifier.
Alternatively, in other possible embodiments, the second server directly generates, from the second media identifier, a graphic code such as a two-dimensional code, a barcode, or a graphic code carrying color information, and uses the graphic code as the resource information of the target media data. The graphic code contains the second media identifier but does not contain the domain name of the second server or the access address of the target media data.
Alternatively, the second server may directly use the generated second media identifier as the resource information of the target media data.
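The following sketch illustrates, with hypothetical names, how resource information of these kinds could be assembled: a URL that carries the second server's domain name and the second media identifier, and optionally a two-dimensional code rendered from that URL. The third-party 'qrcode' Python package is assumed to be available, and the /miniapp path and 'ids' parameter are assumptions modeled on the example URL above.

    def build_resource_url(domain: str, second_media_id: str) -> str:
        """Resource information as a URL: carries the second server's domain name and the
        second media identifier, but never the access address on the first server."""
        return f"https://{domain}/miniapp?ids={second_media_id}"

    def build_resource_qrcode(url: str, path: str = "resource.png") -> str:
        """Optionally render the URL as a two-dimensional code for the smart TV to display."""
        import qrcode                    # third-party 'qrcode' package, assumed installed
        qrcode.make(url).save(path)
        return path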
After obtaining the resource information of the target media data, the second server returns the resource information to the smart television, and the smart television receives it accordingly.
Optionally, when the server is the application server corresponding to the first application installed on the smart television and the mobile terminal, after receiving the target media data uploaded by the smart television, the application server generates a first media identifier for the target media data, where the first media identifier is used to uniquely identify the target media data. The application server then generates the resource information of the target media data according to the first media identifier. For the implementation of generating the resource information according to the first media identifier, reference may be made to the process, described above, in which the second server generates the resource information according to the second media identifier; this is not limited in the embodiments of the present application.
After generating the resource information of the target media data, the application server returns the resource information to the smart television.
Step 803: the smart television displays the resource information of the target media data.
Step 804: the mobile terminal acquires resource information of the target media data.
As can be seen from the foregoing description, the resource information displayed by the smart television may be a URL, a second media identifier, or a graphic code generated from the URL or from the second media identifier. Based on this, if the resource information is a URL or a second media identifier, the mobile terminal may obtain the URL or the second media identifier input by the user; if it is a graphic code, the mobile terminal may scan and recognize the graphic code to obtain the URL or the second media identifier contained in it.
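As a sketch of this step, the mobile terminal side could recover the domain name of the second server and the second media identifier from either form of resource information roughly as follows; the function name, the 'ids' parameter, and the fallback domain are hypothetical.

    from urllib.parse import urlparse, parse_qs

    def parse_resource_info(resource: str,
                            preset_domain: str = "second.server.example") -> tuple[str, str]:
        """Return (second server domain name, second media identifier) from the resource
        information the user scanned or typed in."""
        if resource.startswith(("http://", "https://")):       # resource information is a URL
            parsed = urlparse(resource)
            second_media_id = parse_qs(parsed.query).get("ids", [""])[0]
            return parsed.netloc, second_media_id
        # Only the second media identifier was shown: fall back to a pre-stored domain name.
        return preset_domain, resource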
Step 805: the mobile terminal acquires the target media data from the server according to the resource information of the target media data.
After acquiring the resource information of the target media data, the mobile terminal acquires the target media data from the server according to the resource information.
When the server includes a first server and a second server, the mobile terminal sends a data acquisition request to the second server according to the second media identifier in the acquired resource information, where the data acquisition request is used to instruct the second server to determine the first media identifier according to the second media identifier and the pre-stored mapping relationship and to acquire the access address of the target media data from the first server according to the first media identifier. The mobile terminal then receives the access address of the target media data returned by the second server, and accesses the first server according to the access address to acquire the target media data.
It should be noted that, as can be seen from the foregoing description, the second server may be an applet server or the application server corresponding to the second application installed on the mobile terminal. The following takes the case where the second server is an applet server as an example to describe the process by which the mobile terminal acquires the target media data; for the case where the second server is the application server corresponding to the second application, reference may be made to the following implementation process, and details are not repeated in the embodiments of the present application.
Illustratively, if the resource information includes only the second media identifier but not the domain name of the second server, the mobile terminal acquires the pre-stored domain name of the second server after acquiring the second media identifier. If the resource information includes both the second media identifier and the domain name of the second server, the mobile terminal acquires the domain name of the second server and the second media identifier contained in the resource information. The mobile terminal then starts the corresponding applet according to the domain name of the second server and sends a data acquisition request, carrying the second media identifier, to the second server through the applet. Because the second server stores the second media identifier and the first media identifier correspondingly after generating the second media identifier, upon receiving the data acquisition request the second server obtains the second media identifier carried in it and then obtains the corresponding first media identifier from the pre-stored mapping relationship. After obtaining the first media identifier, the second server accesses the first server according to it: the second server may generate an access request according to the first media identifier and send the access request, carrying the first media identifier, to a preset port of the first server.
After receiving the first media identifier sent by the second server, the first server obtains an access address of the target media data corresponding to the first media identifier from the stored address mapping relation according to the first media identifier, and feeds the access address back to the second server.
After receiving the access address of the target media data fed back by the first server, the second server feeds back the access address to the mobile terminal. After obtaining the access address, the mobile terminal accesses the first server according to the access address and downloads the target media data.
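Reusing the hypothetical mappings from the earlier sketches, the lookup chain described above might look roughly as follows, with the request to the first server's preset port stood in for by a direct dictionary lookup.

    def handle_data_acquisition_request(second_media_id: str) -> str:
        """Second-server view of the lookup chain: second media identifier -> first media
        identifier (pre-stored mapping) -> access address queried from the first server."""
        first_media_id = PRE_STORED_MAPPING[second_media_id]
        # In the embodiment this is a request to a preset port of the first server;
        # here the first server is stood in for by a direct lookup.
        access_address = ADDRESS_MAP[first_media_id]
        return access_address                             # fed back to the mobile terminal

    def download_target_media(access_address: str) -> bytes:
        """Mobile-terminal view of the final step: access the first server at the address."""
        with open(access_address, "rb") as f:             # stands in for an HTTP download
            return f.read()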
Optionally, when the server is the application server corresponding to the first application installed on the smart television and the mobile terminal, after acquiring the resource information of the target media data, the mobile terminal starts the first application according to the domain name of the application server included in the resource information or a preset domain name of the application server. The mobile terminal then sends an access request carrying the first media identifier to the application server through the first application. After receiving the first media identifier, the application server acquires the corresponding target media data according to it and sends the target media data to the mobile terminal.
Optionally, in some possible cases, after acquiring the resource information of the target media data, the mobile terminal may further share the resource information with the mobile terminals of other users according to a sharing operation performed by the user, so that those mobile terminals can acquire the corresponding target media data according to the resource information.
Optionally, in some embodiments, the first server, or the application server corresponding to the first application, that receives the target media data may further perform AI (artificial intelligence) detection on the received target media data to detect whether it contains illegal data content. Illegal data content may refer to private data content designated by the user or other data content that does not comply with laws and regulations. If the target media data is detected to contain illegal data content, an access-prohibition identifier is set for the target media data. In this way, when receiving an access request for the target media data from another user's mobile terminal, the first server or the application server corresponding to the first application returns a notification message indicating that access is prohibited to that terminal.
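A minimal, purely illustrative sketch of this optional check, with the AI detection model replaced by a placeholder predicate and the access-prohibition identifier kept as a simple flag set; it reuses the hypothetical ADDRESS_MAP from the earlier sketch.

    PROHIBITED_IDS: set[str] = set()   # first media identifiers flagged after AI detection

    def looks_illegal(media_bytes: bytes) -> bool:
        """Placeholder for the AI content-detection model described in the embodiment."""
        return False                    # a real implementation would classify the content

    def review_media(first_media_id: str, media_bytes: bytes) -> None:
        """Set the access-prohibition identifier when illegal content is detected."""
        if looks_illegal(media_bytes):
            PROHIBITED_IDS.add(first_media_id)

    def fetch_access_address(first_media_id: str) -> str:
        """Refuse access requests for media data marked as prohibited."""
        if first_media_id in PROHIBITED_IDS:
            raise PermissionError("access to this media data is prohibited")
        return ADDRESS_MAP[first_media_id]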
In the embodiment of the application, after the smart television uploads the target media data to the server, the server returns, to the smart television, the resource information for acquiring the target media data, and the smart television then displays the resource information. In this way, by obtaining the resource information, the mobile terminal can acquire the corresponding media data from the server, thereby transferring the media data from the smart television to the mobile terminal.
In addition, in the embodiment of the application, the mobile terminal uses an applet to acquire the media data stored locally on the smart television, so the user does not need to download and install another application, which is convenient and fast. For the service provider, no dedicated application needs to be developed, which reduces development cost.
To better illustrate the above process of acquiring media data, fig. 9 shows the implementation flow of acquiring media data when the server includes a first server and a second server and the second server is an applet server. Referring to fig. 9, the process includes the following steps, and a consolidated code sketch of the whole flow follows the numbered steps:
1. The smart television sends the target media data to the first server.
2. The first server generates a first media identifier and stores the first media identifier and the access address of the target media data correspondingly in an address mapping relationship.
3. The first server feeds back the first media identifier of the target media data to the smart television.
4. The smart television sends the first media identifier to the second server.
5. The second server generates a second media identifier according to the first media identifier, generates a graphic code containing the URL built from the second media identifier, and stores the first media identifier and the second media identifier correspondingly.
6. The second server sends the graphic code to the smart television.
7. The smart television displays the graphic code.
8. The mobile terminal scans and recognizes the graphic code to obtain the URL, acquires the domain name of the second server and the second media identifier contained in the URL, and starts the applet according to the domain name of the second server.
9. The mobile terminal sends a data acquisition request carrying the second media identifier to the second server.
10. The second server acquires the first media identifier according to the second media identifier in the data acquisition request and the pre-stored mapping relationship.
11. The second server sends an access request carrying the first media identifier to the first server.
12. The first server returns the access address of the target media data to the second server according to the first media identifier in the access request and the address mapping relationship.
13. The second server returns the access address of the target media data to the mobile terminal.
14. The mobile terminal accesses the first server according to the access address and downloads the target media data.
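Tying the earlier hypothetical sketches together, the fourteen steps above can be walked through end to end roughly as follows, assuming the sketched functions are collected into one module; the domain name is taken from the example URL earlier in this description.

    # End-to-end walk-through of the numbered flow, using the hypothetical sketches above.
    if __name__ == "__main__":
        # Steps 1-3: the smart TV uploads the data; the first server returns the first media identifier.
        first_id = upload_media(b"example target media data")
        # Steps 4-6: the smart TV forwards it; the second server builds the resource information.
        second_id = register_first_media_id(first_id)
        url = build_resource_url("mini-mobile.hismarttv.com", second_id)
        # Steps 7-8: the smart TV displays the URL (or a graphic code); the mobile terminal scans it.
        domain, scanned_id = parse_resource_info(url)
        # Steps 9-13: the mobile terminal asks the second server, which resolves the access address.
        address = handle_data_acquisition_request(scanned_id)
        # Step 14: the mobile terminal accesses the first server and downloads the target media data.
        print(len(download_target_media(address)), "bytes downloaded")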
Fig. 10 is a block diagram of a mobile terminal 1000 according to an embodiment of the present disclosure. The functions of the mobile terminal in the above embodiments can be implemented by the mobile terminal shown in fig. 10. The mobile terminal 1000 may be a portable mobile terminal such as: smart phones, tablet computers, and the like. The mobile terminal 1000 may also be referred to by other names such as user equipment, portable mobile terminal, laptop mobile terminal, etc.
Generally, the mobile terminal 1000 includes: a controller 1001 and a memory 1002.
The controller 1001 may include one or more processing cores, for example a 4-core controller or an 8-core controller. The controller 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The controller 1001 may also include a main controller and a coprocessor, where the main controller is the controller for processing data in the awake state, also referred to as a CPU (Central Processing Unit), and the coprocessor is a low-power controller for processing data in the standby state. In some embodiments, the controller 1001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display needs to display. In some embodiments, the controller 1001 may further include an AI (Artificial Intelligence) controller for processing calculation operations related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1002 is used to store at least one instruction for execution by controller 1001 to implement the media data acquisition method provided by the method embodiments herein.
In some embodiments, the mobile terminal 1000 may further optionally include: a peripheral interface 1003 and at least one peripheral. The controller 1001, the memory 1002, and the peripheral interface 1003 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 1003 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, touch display 1005, camera 1006, audio circuitry 1007, positioning components 1008, and power supply 1009.
The peripheral interface 1003 may be used to connect at least one peripheral related to I/O (Input/Output) to the controller 1001 and the memory 1002. In some embodiments, controller 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the controller 1001, the memory 1002, and the peripheral interface 1003 may be implemented on a separate chip or circuit board, which is not limited by the embodiment.
The Radio Frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1004 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal controller, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1004 may communicate with other mobile terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1004 may further include NFC (Near Field Communication) related circuit, which is not limited by the embodiments of the present application.
The display 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 1005 is a touch display, the display 1005 also has the ability to capture touch signals on or over the surface of the display 1005. The touch signal may be input to the controller 1001 as a control signal to be processed. At this point, the display 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display 1005 may be one, providing a front panel of mobile terminal 1000; in other embodiments, the displays 1005 may be at least two, respectively disposed on different surfaces of the mobile terminal 1000 or in a folded design; in still other embodiments, the display 1005 may be a flexible display disposed on a curved surface or on a folded surface of the mobile terminal 1000. Even further, the display 1005 may be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The Display 1005 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of a mobile terminal, and a rear camera is disposed at a rear surface of the mobile terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1006 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the controller 1001 for processing, or inputting the electric signals to the radio frequency circuit 1004 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be respectively disposed at different portions of the mobile terminal 1000. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert the electrical signals from the controller 1001 or the radio frequency circuit 1004 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic location of the mobile terminal 1000 for navigation or LBS (Location Based Service). The positioning component 1008 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1009 is used to supply power to various components in mobile terminal 1000. The power source 1009 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 1009 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, mobile terminal 1000 can also include one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: acceleration sensor 1011, gyro sensor 1012, pressure sensor 1013, fingerprint sensor 1014, optical sensor 1015, and proximity sensor 1016.
The acceleration sensor 1011 may detect magnitudes of accelerations on three coordinate axes of a coordinate system established with the mobile terminal 1000. For example, the acceleration sensor 1011 may be used to detect components of the gravitational acceleration in three coordinate axes. The controller 1001 may control the touch display 1005 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the mobile terminal 1000, and the gyro sensor 1012 and the acceleration sensor 1011 may cooperate to collect a 3D motion of the user on the mobile terminal 1000. The controller 1001 can implement the following functions according to the data collected by the gyro sensor 1012: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1013 may be disposed on a side bezel of mobile terminal 1000 and/or on a lower layer of touch display 1005. When the pressure sensor 1013 is disposed at a side frame of the mobile terminal 1000, a user's holding signal of the mobile terminal 1000 may be detected, and the controller 1001 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1013. When the pressure sensor 1013 is disposed at a lower layer of the touch display 1005, the controller 1001 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 1005. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1014 is used to collect a fingerprint of the user, and the controller 1001 identifies the user based on the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the user based on the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the controller 1001 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1014 may be disposed on the front, back, or side of the mobile terminal 1000. When a physical key or vendor Logo is provided on the mobile terminal 1000, the fingerprint sensor 1014 may be integrated with the physical key or vendor Logo.
The optical sensor 1015 is used to collect the ambient light intensity. In one embodiment, the controller 1001 may control the display brightness of the touch display 1005 according to the intensity of the ambient light collected by the optical sensor 1015. Specifically, when the ambient light intensity is high, the display brightness of the touch display 1005 is turned up; when the ambient light intensity is low, the display brightness of touch display 1005 is turned down. In another embodiment, the controller 1001 may also dynamically adjust the shooting parameters of the camera assembly 1006 according to the intensity of the ambient light collected by the optical sensor 1015.
A proximity sensor 1016, also known as a distance sensor, is typically disposed on the front panel of the mobile terminal 1000. The proximity sensor 1016 is used to collect the distance between the user and the front of the mobile terminal 1000. In one embodiment, when the proximity sensor 1016 detects that the distance between the user and the front surface of the mobile terminal 1000 gradually decreases, the controller 1001 controls the touch display 1005 to switch from the screen-on state to the screen-off state; when the proximity sensor 1016 detects that the distance gradually increases, the controller 1001 controls the touch display 1005 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in FIG. 10 is not intended to be limiting of mobile terminal 1000, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In some embodiments, a computer-readable storage medium is also provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the media data acquisition method in the above embodiments. For example, the computer-readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It is noted that the computer-readable storage medium referred to in the embodiments of the present application may be a non-volatile storage medium, in other words, a non-transitory storage medium.
It should be understood that all or part of the steps for implementing the above embodiments may be implemented by software, hardware, firmware or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The computer instructions may be stored in the computer-readable storage medium described above.
That is, in some embodiments, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the steps of the media data acquisition method described above.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the embodiments of the present application, and are not limited thereto; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (13)

1. The intelligent television is characterized by comprising a display and a controller;
the controller is configured to:
uploading the target media data to a first server;
receiving a first media identifier sent by the first server, wherein the first media identifier is an identifier which is generated by the first server and used for uniquely identifying the target media data;
sending the first media identification to a second server;
receiving resource information of the target media data returned by the second server, wherein the resource information is used for a mobile terminal to obtain the target media data, the resource information includes a second media identifier but does not include an access address, the access address represents a storage position of the target media data in the first server, and the second media identifier is generated by the second server according to the first media identifier;
and controlling the display to display the resource information so that the mobile terminal generates and sends a data acquisition request containing the second media identifier to the second server according to the resource information, wherein the data acquisition request is used for enabling the second server to acquire an access address corresponding to the target media data from the first server according to the second media identifier and a pre-stored mapping relation and feed back the access address to the mobile terminal, and the access address is used for enabling the mobile terminal to access the first server according to the access address to acquire the target media data.
2. The smart television of claim 1, wherein the second server is an applet server, and wherein the resource information further comprises a domain name of the second server.
3. The smart television of claim 2, wherein the controller controls the display to display the resource information, comprising:
the controller controls the display to display a graphic code according to the resource information, so that the mobile terminal obtains the second media identifier and carries the second media identifier to access the applet server, where the applet server is used for determining the first media identifier according to the second media identifier and a pre-stored mapping relationship, and, after obtaining an access address corresponding to the target media data from the first server according to the first media identifier, issuing the access address to the mobile terminal.
4. A media data acquisition method is applied to a smart television, and comprises the following steps:
uploading the target media data to a first server;
receiving a first media identifier sent by the first server, wherein the first media identifier is an identifier which is generated by the first server and used for uniquely identifying the target media data;
sending the first media identification to a second server;
receiving resource information of the target media data returned by the second server, wherein the resource information is used for a mobile terminal to obtain the target media data, the resource information includes a second media identifier but does not include an access address, the access address represents a storage position of the target media data in the first server, and the second media identifier is generated by the second server according to the first media identifier;
and displaying the resource information so that the mobile terminal generates and sends a data acquisition request containing the second media identifier to the second server according to the resource information, wherein the data acquisition request is used for enabling the second server to acquire an access address corresponding to the target media data from the first server according to the second media identifier and a pre-stored mapping relation and feed back the access address to the mobile terminal, and the access address is used for enabling the mobile terminal to access the first server according to the access address to acquire the target media data.
5. The method of claim 4, wherein the second server is an applet server, and wherein the resource information further comprises a domain name of the second server.
6. The method of claim 5, wherein the displaying the resource information comprises:
and displaying a graphic code according to the resource information so that the mobile terminal acquires the second media identifier and carries the second media identifier to access the applet server, where the applet server is used for determining the first media identifier according to the second media identifier and a pre-stored mapping relationship, and issuing the access address to the mobile terminal after acquiring the access address corresponding to the target media data from the first server according to the first media identifier.
7. A mobile terminal, characterized in that the mobile terminal comprises: a controller and a display;
the controller is used for acquiring resource information of target media data displayed by the smart television, the resource information is returned by the server after the smart television uploads the target media data to the server, the server comprises a first server and a second server, the resource information comprises a second media identifier but does not comprise an access address, the second media identifier is generated by the second server according to a first media identifier received from the smart television, the first media identifier is issued to the smart television by the first server after receiving the target media data uploaded by the smart television, and the access address represents a storage position of the target media data in the first server;
the controller is configured to: sending a data acquisition request to a second server according to a second media identifier in the resource information, wherein the data acquisition request is used for indicating the second server to determine a first media identifier according to the second media identifier and a pre-stored mapping relationship, and acquiring an access address of the target media data from the first server according to the first media identifier;
receiving an access address of the target media data returned by the second server; accessing the first server and acquiring the target media data according to the access address of the target media data;
the display is used for displaying a user interface and acquiring information.
8. The mobile terminal of claim 7, wherein the second server is an applet server, and wherein the resource information further includes a domain name of the second server;
the controller is configured to:
starting an applet corresponding to the second server according to the domain name of the second server;
and sending the data acquisition request to the second server through the applet, wherein the data acquisition request carries the second media identifier, and the second media identifier is used for the second server to acquire the first media identifier.
9. The mobile terminal according to any one of claims 7 to 8, wherein the resource information is displayed in a form of a graphic code, and the obtaining resource information of the target media data displayed by the smart tv includes:
and scanning and identifying the graphic code to obtain the resource information.
10. A media data acquisition method is applied to a mobile terminal, and the method comprises the following steps:
the method comprises the steps that resource information of target media data displayed by the intelligent television is obtained, wherein the resource information is returned by a server after the target media data are uploaded to the server by the intelligent television, the server comprises a first server and a second server, the resource information comprises a second media identifier but does not comprise an access address, the second media identifier is generated by the second server according to a first media identifier received from the intelligent television, the first media identifier is issued to the intelligent television after the first server receives the target media data uploaded by the intelligent television, and the access address represents the storage position of the target media data in the first server;
sending a data acquisition request to the second server according to a second media identifier in the resource information, wherein the data acquisition request is used for indicating the second server to determine a first media identifier according to the second media identifier and a pre-stored mapping relationship, and acquiring an access address of the target media data from the first server according to the first media identifier;
receiving an access address of the target media data returned by the second server;
and accessing the first server and acquiring the target media data according to the access address of the target media data.
11. The method of claim 10, wherein the second server is an applet server, and wherein the resource information further includes a domain name of the second server;
the sending a data acquisition request to the second server according to the second media identifier in the resource information includes:
according to the domain name of the second server, starting an applet corresponding to the second server;
and sending the data acquisition request to the second server through the applet, wherein the data acquisition request carries the second media identifier, and the second media identifier is used for the second server to acquire the first media identifier.
12. The method according to any one of claims 10 to 11, wherein the resource information is displayed in a form of a graphic code, and the acquiring resource information of target media data displayed by the smart television comprises:
and scanning and identifying the graphic code to obtain the resource information.
13. A media data acquisition method applied to an applet server, the method comprising:
receiving a data acquisition request containing a second media identifier, wherein the data acquisition request is sent by the mobile terminal after scanning a graphic code generated by the smart television according to the received second media identifier;
determining a first media identifier according to the second media identifier and a pre-stored mapping relationship, wherein the first media identifier is issued to the smart television by a first server after receiving target media data uploaded by the smart television, and the pre-stored mapping relationship is stored when the applet server receives the first media identifier uploaded by the smart television and generates the second media identifier according to the first media identifier;
accessing the first server according to the first media identifier, so that the first server feeds back an access address of the target media data according to the first media identifier and an address mapping relation, wherein the address mapping relation is the mapping relation between the first media identifier and the access address, which is stored when the first server generates the first media identifier according to the target media data uploaded by the smart television;
and receiving the access address, and sending the access address to the mobile terminal, so that the mobile terminal accesses the first server according to the access address to acquire the target media data.
CN202010669607.4A 2020-07-13 2020-07-13 Media data acquisition method, smart television and mobile terminal Active CN111787364B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010669607.4A CN111787364B (en) 2020-07-13 2020-07-13 Media data acquisition method, smart television and mobile terminal


Publications (2)

Publication Number Publication Date
CN111787364A CN111787364A (en) 2020-10-16
CN111787364B true CN111787364B (en) 2022-05-06

Family

ID=72768101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010669607.4A Active CN111787364B (en) 2020-07-13 2020-07-13 Media data acquisition method, smart television and mobile terminal

Country Status (1)

Country Link
CN (1) CN111787364B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113180729B (en) * 2021-03-31 2023-07-14 上海深至信息科技有限公司 Ultrasonic data transmission method and system
CN114928608B (en) * 2022-04-21 2024-08-06 北京达佳互联信息技术有限公司 Method, device, equipment and storage medium for processing multimedia resources

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1578958A (en) * 2002-07-10 2005-02-09 夏普株式会社 Multi-medium information providing system and multi-medium information providing method
JP2006237753A (en) * 2005-02-22 2006-09-07 Pioneer Electronic Corp Program reserving system, method for reserving program, portable terminal device with camera and video device
CN104508689A (en) * 2014-04-29 2015-04-08 华为终端有限公司 A two-dimension code processing method and a terminal
CN105117836A (en) * 2015-08-19 2015-12-02 国网山东省电力公司烟台供电公司 Power grid management system of power grid geographical wiring diagram
CN105142000A (en) * 2015-08-14 2015-12-09 三星电子(中国)研发中心 Information pushing method and system based on television playing content
CN107067056A (en) * 2017-02-14 2017-08-18 阿里巴巴集团控股有限公司 Two-dimensional code generation method and its equipment and two-dimensional code identification method and its equipment
CN107547934A (en) * 2016-10-12 2018-01-05 腾讯科技(北京)有限公司 Information transferring method and device based on video
CN107770574A (en) * 2016-08-18 2018-03-06 深圳市劲嘉数媒科技有限公司 The method and apparatus of video transmission
CN109062930A (en) * 2018-06-12 2018-12-21 叶龙海 A kind of method, apparatus and system based on two dimensional code mark video
CN109803111A (en) * 2019-01-17 2019-05-24 视联动力信息技术股份有限公司 A kind of method for watching after the meeting and device of video conference

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130334300A1 (en) * 2011-01-03 2013-12-19 Curt Evans Text-synchronized media utilization and manipulation based on an embedded barcode
CN107463834A (en) * 2017-08-02 2017-12-12 成都九十度工业产品设计有限公司 A kind of information interacting method and interactive system based on mark
CN111047313B (en) * 2020-03-12 2020-12-04 支付宝(杭州)信息技术有限公司 Code scanning payment, information sending and key management method, device and equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A large-data-volume transmission method based on two-dimensional codes; Ma Junjun et al.; Software Guide (软件导刊); 2018-05-31; Vol. 17, No. 5; full text *

Also Published As

Publication number Publication date
CN111787364A (en) 2020-10-16

Similar Documents

Publication Publication Date Title
CN109982102B (en) Interface display method and system for live broadcast room, live broadcast server and anchor terminal
CN112118477B (en) Virtual gift display method, device, equipment and storage medium
CN111208969A (en) Selection control method of sound output equipment and display equipment
CN113784220B (en) Method for playing media resources, display device and mobile device
CN112055240B (en) Display device and operation prompt display method for pairing display device with remote controller
CN111970549B (en) Menu display method and display device
CN112073762B (en) Information acquisition method based on multi-system display equipment and multi-system display equipment
CN111343495A (en) Display device and method for playing music in terminal
CN112399232A (en) Display equipment, camera priority use control method and device
CN112135280A (en) Bluetooth device playing control method and display device
CN111542031B (en) Display device and Bluetooth device pairing method
CN116235522A (en) Display method and display device
CN111970548A (en) Display device and method for adjusting angle of camera
CN111787364B (en) Media data acquisition method, smart television and mobile terminal
CN111176603A (en) Image display method for display equipment and display equipment
CN111954059A (en) Screen saver display method and display device
CN111984167B (en) Quick naming method and display device
CN112218145A (en) Smart television, VR display device and related methods
CN112214190A (en) Display equipment resource playing method and display equipment
CN112017415A (en) Recommendation method of virtual remote controller, display device and mobile terminal
CN112040340A (en) Resource file acquisition method and display device
CN110996115B (en) Live video playing method, device, equipment, storage medium and program product
CN111931692A (en) Display device and image recognition method
CN111918132A (en) Display device and multi-interface device judgment method
CN113467651A (en) Display method and display equipment for content corresponding to control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant