CN111787350B - Display device and screenshot method in video call


Info

Publication number
CN111787350B
Authority
CN
China
Prior art keywords
screenshot, display, control, screen, bitmap data
Prior art date
Legal status
Active
Application number
CN202010769342.5A
Other languages
Chinese (zh)
Other versions
CN111787350A (en)
Inventor
高琨
路锋
Current Assignee
Juhaokan Technology Co Ltd
Original Assignee
Juhaokan Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Juhaokan Technology Co Ltd
Priority to CN202010769342.5A
Publication of CN111787350A
Application granted
Publication of CN111787350B
Status: Active
Anticipated expiration

Classifications

    • H04N21/231 Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
    • G06F16/5866 Retrieval characterised by using metadata, using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G06F16/9554 Retrieval from the web using information identifiers, e.g. uniform resource locators [URL], by using bar codes
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/04845 Interaction techniques based on GUIs for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • H04N21/2743 Video hosting of uploaded data from client
    • H04N21/4126 The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/42653 Internal components of the client for processing graphics
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4318 Generation of visual interfaces by altering the content in the rendering process, e.g. blanking, blurring or masking an image region
    • H04N21/47205 End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting
    • H04N21/8352 Generation of protective data involving content or source identification data, e.g. Unique Material Identifier [UMID]

Abstract

The embodiment of the application provides a display device and a screenshot method for use in a video call. The display device comprises a display; a camera configured to capture video or pictures; and a controller connected with the display and the camera. The controller is configured to: in response to receiving, on a video call interface, a trigger signal for a screenshot control input by a user, control the display to hide the control menu and acquire bitmap data; control the display to show a screenshot popup according to the bitmap data, and display the screenshot corresponding to the bitmap data in the popup; send the bitmap data to a server, and obtain a picture identifier, returned by the server, that corresponds to the bitmap data; obtain, according to the picture identifier, the network address at which the server stores the bitmap data; and generate a two-dimensional code from the network address and control the display to show the two-dimensional code in the screenshot popup. The method and the device solve the problem that a screenshot cannot easily be taken during a video call.

Description

Display device and screenshot method in video call
Technical Field
The application relates to the technical field of display devices, and in particular to a display device and a screenshot method in a video call.
Background
As smart televisions gain ever richer configurations, more and more functions of mobile terminals can be realized on a smart television, and thanks to its uniquely large screen a smart television can even surpass mobile devices in the experience of some applications. For example, a smart television equipped with a camera can run a video call application so that a user can video chat on the television. In some functional experiences, however, smart televisions still fall short. When a user video chats on a mobile terminal, the screen can be captured by the terminal's preset capture gesture, such as pressing the power key and the volume-down key simultaneously. A television, because of its size, cannot conveniently be captured in this way, and at present there is no good scheme for taking screenshots during a video call on a smart television.
Disclosure of Invention
In order to solve the technical problem, the application provides a display device and a screenshot method in a video call.
In a first aspect, the present application provides a display device comprising:
a display;
a camera configured to capture video or pictures;
a controller connected with the display and camera, the controller configured to:
in response to receiving, on a video call interface, a trigger signal for a screenshot control input by a user, control the display to hide the control menu and acquire bitmap data;
control the display to show a screenshot popup according to the bitmap data, and display the screenshot corresponding to the bitmap data in the screenshot popup;
send the bitmap data to a server, and obtain a picture identifier, returned by the server, that corresponds to the bitmap data;
obtain, according to the picture identifier, the network address at which the server stores the bitmap data;
and generate a two-dimensional code from the network address, and control the display to show the two-dimensional code in the screenshot popup.
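The controller steps above can be sketched end to end. The snippet below is a minimal illustration only, not the patented implementation: `FakeScreenshotServer`, its URL scheme, and the identifier derivation are hypothetical stand-ins for the server behavior, which the disclosure does not specify.

```python
import hashlib

class FakeScreenshotServer:
    """Hypothetical in-memory stand-in for the screenshot server: it stores
    uploaded bitmap data under a generated picture identifier and maps that
    identifier to a network address (URL)."""

    def __init__(self, base_url="https://example.com/screenshots"):
        self.base_url = base_url
        self.store = {}

    def upload(self, bitmap_data: bytes) -> str:
        # Derive a picture identifier from the content; a real server would
        # likely use a database key or UUID instead.
        picture_id = hashlib.sha1(bitmap_data).hexdigest()[:12]
        self.store[picture_id] = bitmap_data
        return picture_id

    def address_for(self, picture_id: str) -> str:
        # Network address at which the stored screenshot can be fetched.
        return f"{self.base_url}/{picture_id}"

def screenshot_flow(bitmap_data: bytes, server: FakeScreenshotServer) -> str:
    """Controller-side steps: upload the bitmap, obtain the picture identifier,
    resolve the network address, and return the string to encode as a QR code."""
    picture_id = server.upload(bitmap_data)
    url = server.address_for(picture_id)
    # A real device would now render `url` as a two-dimensional code in the
    # screenshot popup (e.g. with a QR library such as ZXing); here the URL
    # itself is returned.
    return url
```

A usage sketch: `screenshot_flow(captured_bytes, FakeScreenshotServer())` returns the URL that would be shown as a two-dimensional code in the popup.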
In some embodiments, the acquiring of bitmap data comprises: invoking the screenshot method of SurfaceControl via reflection to acquire bitmap data of all the pictures currently displayed by the system.
In some embodiments, the acquiring of bitmap data comprises: acquiring bitmap data of all the pictures currently displayed by the system through the MediaProjectionManager of the Android framework.
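Both capture paths are Android framework APIs (a hidden SurfaceControl method reached via reflection, and MediaProjectionManager), so neither can run outside a device. The sketch below models only a plausible try-one-then-fall-back structure over such backends, using hypothetical stubs; it is not code from the disclosure.

```python
class CaptureError(Exception):
    """Raised when a capture backend is unavailable."""

def capture_via_surfacecontrol_stub() -> bytes:
    # Stand-in for reflectively invoking SurfaceControl's screenshot method;
    # hidden APIs may be blocked on newer Android versions, which this stub
    # simulates by failing.
    raise CaptureError("hidden API unavailable")

def capture_via_mediaprojection_stub() -> bytes:
    # Stand-in for a MediaProjectionManager-based capture, which requires a
    # user-granted projection permission on a real device.
    return b"\x00\x01fake-bitmap"

def acquire_bitmap(backends) -> bytes:
    """Try each capture backend in order and return the first bitmap obtained."""
    errors = []
    for backend in backends:
        try:
            return backend()
        except CaptureError as exc:
            errors.append(exc)
    raise CaptureError(f"all backends failed: {errors}")
```

The ordering of backends is a design choice; the disclosure presents the two mechanisms as alternative embodiments rather than a mandated fallback chain.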
In some embodiments, the controller is further configured to:
and in response to a trigger signal for a re-capture control input by the user in the screenshot popup, controlling the display to exit the screenshot popup and display the control menu again.
In a second aspect, an embodiment of the present application provides a screenshot method in a video call, where the method includes:
in response to receiving, on a video call interface, a trigger signal for a screenshot control, controlling a display to hide the control menu and acquiring bitmap data;
controlling the display to show a screenshot popup according to the bitmap data, and displaying the screenshot corresponding to the bitmap data in the screenshot popup;
sending the bitmap data to a server, and obtaining a picture identifier, returned by the server, that corresponds to the bitmap data;
obtaining, according to the picture identifier, the network address at which the server stores the bitmap data;
and generating a two-dimensional code from the network address, and controlling the display to show the two-dimensional code in the screenshot popup.
In some embodiments, further comprising:
and in response to a trigger signal for a re-capture control input by the user in the screenshot popup, controlling the display to exit the screenshot popup and display the control menu again.
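The menu/popup interaction described in both aspects amounts to a small piece of UI state. The following sketch is illustrative only; the class and method names are invented, and a real implementation would drive actual Android views rather than booleans.

```python
class VideoCallScreen:
    """Minimal model of the UI states described above: a control menu that is
    hidden when a screenshot is taken, and a screenshot popup that is exited
    when the re-capture control is triggered."""

    def __init__(self):
        self.menu_visible = True
        self.popup_visible = False
        self.screenshot = None

    def on_screenshot_control(self, bitmap_data: bytes):
        # Hide the menu first so it does not appear in the captured frame,
        # then show the popup containing the captured screenshot.
        self.menu_visible = False
        self.screenshot = bitmap_data
        self.popup_visible = True

    def on_recapture_control(self):
        # Exit the popup, drop the old screenshot, and restore the control
        # menu so the user can capture again.
        self.popup_visible = False
        self.screenshot = None
        self.menu_visible = True
```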
The display device and the in-call screenshot method provided by the application have the following beneficial effects:
A screenshot control is placed on the video call interface. After the user triggers it, the control menu is hidden before the screen is captured, so the menu does not appear in the screenshot; the screenshot is then shown on the display of the display device for the user to check. Further, the bitmap data acquired during capture is uploaded to the server, which generates a picture identifier and a network address for storing the screenshot corresponding to the bitmap data. The display device obtains the network address according to the picture identifier, generates a two-dimensional code from that address, and displays the code on the screenshot. By scanning the code with a mobile terminal, the user can obtain the screenshot from the network address and save it on the mobile device, which makes it convenient for the user to store and edit the screenshot.
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; it will be apparent to those skilled in the art that other drawings can be derived from these drawings without creative effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to some embodiments;
Fig. 2 is a block diagram of the hardware configuration of the display device 200 according to some embodiments;
Fig. 3 is a block diagram of the hardware configuration of the control device 100 according to some embodiments;
Fig. 4 is a schematic diagram of the software configuration in the display device 200 according to some embodiments;
Fig. 5 is a display diagram of an icon control interface for applications in the display device 200 according to some embodiments;
Fig. 6 is a schematic diagram of a video call interface according to some embodiments;
Fig. 7 is a screenshot interaction diagram according to some embodiments;
Fig. 8 is an exemplary breakdown of the interaction of Fig. 7;
Fig. 9 is a schematic diagram of a screenshot interface according to some embodiments;
Fig. 10 is a flow diagram of a screenshot method in a video call according to some embodiments.
Detailed Description
To make the objects, embodiments and advantages of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only a part, and not all, of the embodiments of the present application.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments described herein without inventive effort fall within the scope of the appended claims. In addition, while the disclosure is presented in terms of one or more exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be utilized separately in a variety of forms and embodiments.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description, claims and drawings of this application are used to distinguish between similar or analogous objects or entities and do not necessarily imply an order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments described herein can, for example, be practiced in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term module, as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as the display device disclosed in this application) that can typically control the device wirelessly over a relatively short range. The connection with the electronic device typically uses infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also use WiFi, wireless USB, or motion sensing. For example, a hand-held touch remote control replaces most of the physical hard keys of a conventional remote control device with a user interface on a touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller that communicates with the display device through infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and controls the display device 200 wirelessly or by other wired methods. The user may input user commands through keys on the remote controller, voice input, control panel input, etc. to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller, to control the functions of the display device 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 and the display device 200 may each install a software application, so that connection and communication between them are implemented through a network communication protocol, achieving one-to-one control operation and data communication. For example, a control instruction protocol can be established between the mobile terminal 300 and the display device 200, a remote-control keyboard can be synchronized to the mobile terminal 300, and the display device 200 can be controlled through the user interface on the mobile terminal 300. Audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200 to realize a synchronized display function.
As also shown in fig. 1, the display device 200 also performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and performing electronic program guide (EPG) interactions. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers. Other web service contents such as video on demand and advertisement services are also provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
In addition to the broadcast-receiving television function, the display device 200 may additionally provide a smart network television function with computer support, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2.
In some embodiments, at least one of the controller 250, the tuner demodulator 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the output of the first processor and to display video content and images and components of the menu manipulation interface.
In some embodiments, the display 275, includes a display screen component for presenting a picture, and a driving component that drives the display of an image.
In some embodiments, the displayed video content may come from broadcast television content, or from various broadcast signals received via wired or wireless communication protocols; alternatively, various image contents sent from a network server and received via a network communication protocol may be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
In some embodiments, a drive assembly for driving the display is also included, depending on the type of display 275.
In some embodiments, display 275 is a projection display and may also include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
In some embodiments, the display apparatus 200 may establish control signal and data signal transmission and reception with the external control device 100 or the content providing apparatus through the communicator 220.
In some embodiments, the user interface 265 may be configured to receive infrared control signals from a control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component used by the display device 200 to collect signals from the external environment or for interaction with the outside.
In some embodiments, the detector 230 includes a light receiver, i.e. a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively changed according to the collected ambient light, and the like.
In some embodiments, the detector 230 may further include an image collector, such as a camera, which may be configured to collect external environment scenes and to collect attributes of the user or gestures used to interact with the user, so as to adaptively change display parameters and recognize user gestures, thereby implementing interaction with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of an image. For example, the display device 200 may be adjusted to display a cool color tone in a high-temperature environment, or a warm color tone in a low-temperature environment.
In some embodiments, the detector 230 may also include a sound collector, such as a microphone, which may be used to receive the user's voice, for example a voice signal containing a control instruction for the display device 200, or to collect ambient sound for recognizing the type of the ambient scene, so that the display device 200 can adapt to the ambient noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to allow data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, or command instruction data, etc.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: the interface can be any one or more of a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port and the like. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the tuning demodulator 210 is configured to receive a broadcast television signal through a wired or wireless receiving manner, perform modulation and demodulation processing such as amplification, mixing, resonance, and the like, and demodulate an audio and video signal from a plurality of wireless or wired broadcast television signals, where the audio and video signal may include a television audio and video signal carried in a television channel frequency selected by a user and an EPG data signal.
In some embodiments, the frequency points demodulated by the tuner demodulator 210 are controlled by the controller 250, and the controller 250 can send out control signals according to user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to the broadcasting system of the television signal. Or may be classified into a digital modulation signal, an analog modulation signal, and the like according to a modulation type. Or the signals are classified into digital signals, analog signals and the like according to the types of the signals.
In some embodiments, the controller 250 and the modem 210 may be located in different separate devices, that is, the modem 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box. In this case, the set-top box outputs the television audio and video signals demodulated from the received broadcast television signal to the main device, and the main device receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
As shown in fig. 2, the controller 250 includes at least one of a Random Access Memory 251 (RAM), a Read-Only Memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a Graphics Processing Unit (GPU)), a Central Processing Unit 254 (CPU), a Communication Interface, and a Communication Bus 256 (Bus), which connects the respective components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other programs that are running.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, the ROM 252 is used to store a Basic Input Output System (BIOS). The BIOS is used for completing the power-on self-test of the system, initializing each functional module in the system, providing drivers for basic input/output of the system, and booting the operating system.
In some embodiments, when the power-on signal is received, the display device 200 starts to power up, the CPU executes the system boot instruction in the ROM 252, and copies the temporary data of the operating system stored in the memory to the RAM 251 so as to start or run the operating system. After the start of the operating system is completed, the CPU copies the temporary data of the various application programs in the memory to the RAM 251, and then, the various application programs are started or run.
In some embodiments, CPU processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include a main processor and one or more sub-processors: a main processor for performing some operations of the display apparatus 200 in a pre-power-up mode and/or for displaying a screen in a normal mode, and one or more sub-processors for performing operations in a standby mode or the like.
In some embodiments, the graphics processor 253 is used for generating various graphics objects, such as icons, operation menus, and graphics for displaying user input instructions. The graphics processor includes an arithmetic unit, which performs operations by receiving various interactive instructions input by the user and displays various objects according to display attributes, and a renderer, which renders the objects obtained by the arithmetic unit for display on the display.
In some embodiments, video processor 270 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio and video data stream; for example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal and an audio signal.
The video decoding module is used for processing the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module is used for superimposing and mixing the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is used for converting the frame rate of the input video, for example converting a 60 Hz frame rate to 120 Hz or 240 Hz, commonly by means of frame interpolation.
The display formatting module is used for converting the received video signal after frame rate conversion into an output signal conforming to the display format, such as an RGB data signal.
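The frame interpolation mentioned above for frame rate conversion can be sketched minimally as follows. This is an illustrative simplification that inserts the arithmetic average of each adjacent pair of frames to double the frame rate; a real FRC block typically uses motion-compensated interpolation on full video frames. All class and method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

public class FrameRateConverter {
    /**
     * Doubles the frame rate by inserting, between each adjacent pair of
     * frames, a frame whose pixels are the average of that pair. Frames are
     * modeled as plain int arrays of pixel intensities for illustration.
     */
    public static List<int[]> interpolate(List<int[]> frames) {
        List<int[]> out = new ArrayList<>();
        for (int i = 0; i < frames.size(); i++) {
            out.add(frames.get(i));
            if (i + 1 < frames.size()) {
                int[] a = frames.get(i), b = frames.get(i + 1);
                int[] mid = new int[a.length];
                for (int p = 0; p < a.length; p++) {
                    mid[p] = (a[p] + b[p]) / 2; // interpolated pixel
                }
                out.add(mid); // inserted frame between i and i+1
            }
        }
        return out;
    }
}
```

Applied to a 60 Hz stream, each second of n frames becomes roughly 2n frames, which is the effect the text describes for 60 Hz to 120 Hz conversion.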
In some embodiments, the graphics processor 253 and the video processor may be integrated or configured separately. When integrated, they jointly process the graphics signals output to the display; when configured separately, they perform different functions, for example in a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification processes to obtain an audio signal that can be played in a speaker.
In some embodiments, video processor 270 may comprise one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller in one or more chips.
In some embodiments, the audio output receives, under the control of controller 250, the sound signal output by audio processor 280. Besides the speaker 286 carried by the display device 200 itself, the audio output may be an external sound output terminal that outputs to a sound generating device of an external device, such as an external sound interface or an earphone interface, and may also include a near field communication module in the communication interface, for example a Bluetooth module for outputting sound to a Bluetooth speaker.
The power supply 290 supplies power to the display device 200 with power input from an external power source under the control of the controller 250. The power supply 290 may include a built-in power supply circuit installed inside the display apparatus 200, or may be a power supply interface installed outside the display apparatus 200 that provides an external power supply to the display apparatus 200.
A user interface 265 for receiving an input signal of a user and then transmitting the received user input signal to the controller 250. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
In some embodiments, the user inputs a user command through the control apparatus 100 or the mobile terminal 300; the user input interface receives the user input, and the display device 200 responds to it through the controller 250.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes a memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The base module is a bottom layer software module for signal communication between various hardware in the display device 200 and for sending processing and control signals to the upper layer module. The detection module is used for collecting various information from various sensors or user input interfaces, and the management module is used for performing digital-to-analog conversion and analysis management.
For example, the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is used for controlling the display to display image content, and can be used for playing information such as multimedia image content and UI interfaces. The communication module is used for control and data communication with external devices. The browser module is used for data communication with browsing servers. The service module is used for providing various services and includes various application programs. Meanwhile, the memory 260 may store visual effect maps for receiving external data and user data, images of various items in various user interfaces, focus objects, and the like.
Fig. 3 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 3, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply.
The control apparatus 100 is configured to control the display device 200 and may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200. Such as: the user operates the channel up/down key on the control device 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, a mobile terminal 300 or other intelligent electronic device may function similarly to the control apparatus 100 after an application for manipulating the display device 200 is installed. For example, by installing such an application, the user may implement the functions of the physical keys of the control apparatus 100 through various function keys or virtual buttons of a graphical user interface available on the mobile terminal 300 or other intelligent electronic device.
The controller 110 includes a processor 112, RAM 113, ROM 114, a communication interface 130, and a communication bus. The controller is used for controlling the operation of the control device 100, the communication cooperation among internal components, and external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip 131, a bluetooth module 132, an NFC module 133, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input command needs to be converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, the user input command needs to be converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an input-output interface 140. The control device 100 is configured with a communication interface 130, such as: the WiFi, bluetooth, NFC, etc. modules may encode the user input command according to the WiFi protocol, or the bluetooth protocol, or the NFC protocol, and send the encoded user input command to the display device 200.
A memory 190 is used for storing various operation programs, data, and applications for driving and controlling the control apparatus 100 under the control of the controller. The memory 190 may store various control signal commands input by a user.
A power supply 180 provides operational power support for each element of the control device 100 under the control of the controller, and may comprise a battery and associated control circuitry.
In some embodiments, the system may include a Kernel, a command parser (shell), a file system, and applications. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are operated and maintained. After the kernel is started, the shell and user applications are loaded. An application is compiled into machine code after being started, forming a process.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer from top to bottom.
In some embodiments, at least one application program runs in the application program layer, and the application programs can be Window (Window) programs carried by an operating system, system setting programs, clock programs, camera applications and the like; or may be an application developed by a third party developer such as a hi program, a karaoke program, a magic mirror program, or the like. In specific implementation, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which is not limited in this embodiment of the present application.
The framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides to let the applications in the application layer act. The application program can access the resources in the system and obtain the services of the system in execution through the API interface.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; a Location Manager (Location Manager) for providing access to the system Location service to the system service or application; a Package Manager (Package Manager) for retrieving various information about an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used for managing the life cycle of each application and the general navigation back function, such as controlling applications to exit (including switching the user interface currently displayed in the display window to the system desktop), open, or go back (including switching the user interface currently displayed in the display window to the previous user interface), and the like.
In some embodiments, the window manager is configured to manage all window processes, for example obtaining the display size, determining whether a status bar is available, locking the screen, capturing the screen, controlling display changes (e.g., zooming, dithering, distorting, etc.), and the like.
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the core layer includes at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (such as fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and so on.
In some embodiments, the kernel layer further comprises a power driver module for power management.
In some embodiments, software programs and/or modules corresponding to the software architecture of fig. 4 are stored in the first memory or the second memory as shown in fig. 2 or fig. 3.
In some embodiments, for a display device with a touch function, taking a split-screen operation as an example, the display device receives an input operation (such as a split-screen operation) performed by a user on the display screen, and the kernel layer may generate a corresponding input event according to the input operation and report the event to the application framework layer. The window mode (such as multi-window mode) corresponding to the input operation, and the position and size of the window, are set by the activity manager of the application framework layer. The window manager of the application framework layer draws the window according to the settings of the activity manager, then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the corresponding application interfaces in different display areas of the display screen.
In some embodiments, as shown in fig. 5, the application layer containing at least one application may display a corresponding icon control in the display, such as: the system comprises a live television application icon control, a video-on-demand application icon control, a media center application icon control, an application center icon control, a game application icon control and the like.
In some embodiments, the live television application may provide live television from different sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, a video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides video displays from some storage source. For example, the video on demand may come from a server side of cloud storage, from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various applications for multimedia content playback. For example, a media center, which may be other than live television or video on demand, may provide services for a user to access various images or audio through a media center application.
In some embodiments, an application center may provide storage for various applications. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
The hardware or software architecture in some embodiments may be based on the description in the above embodiments, and in other embodiments may be based on other hardware or software architectures similar to the above, as long as the technical solution of the present application can be implemented.
In some embodiments, a video call application may be set in the application center, and when a user clicks the video call application icon control, the display device may open the video call application, control the display to display a main interface of the video call application, and the user may select a called terminal to perform video call on the main interface; or when the display device displays the video call invitation, the user serves as a called terminal, can accept the video call invitation initiated by the calling terminal, and carries out video call with the calling terminal.
During a video call, a user may have a need to capture a screen of a display device. When the display device is a mobile terminal, a user can perform screenshot by pressing a preset combination key on the mobile terminal, such as a power key and a volume reduction key, or perform screenshot by clicking a screen, such as finger joint screenshot, or perform screenshot by touching the screen, such as three-finger screenshot; when the display device is a television, the user cannot capture the screen of the display picture in the commonly-used screen capture mode due to the fact that the television is large in size, few in keys, non-touch screen and the like.
In order to solve the technical problem, the screen capture control is arranged on the video call interface, and screen capture is achieved by triggering the screen capture control.
Referring to fig. 6, which is a schematic view of a video call interface according to some embodiments of the present application, as shown in fig. 6, during a video call between two users, the display device may control the display to display two call windows, in which the images collected by the cameras of the display devices of the two parties are respectively displayed. One of the call windows can be a main window that fills the display of the display device, i.e., is displayed full screen, and the other can be a secondary window displayed as a floating window above the main window. A control menu can be arranged in the layer above the main window, and the control menu can comprise a plurality of call controls, such as "close microphone", "close camera", "open picture focus", "magic prop", "hang up", "switch layout", "small window chat", and "screen capture". In fig. 6, "screen capture" indicates the screen capture control; when the screen capture control is selected, it is highlighted, for example by setting the background color of the call control to white. In some embodiments, to avoid the image in the call window being blocked by the control menu and to improve the user's video call experience, the control menu may be configured to be hidden if no user operation is received within a preset time, which may be 3 to 5 seconds.
When the control menu is hidden and the user needs to operate a control in the control menu, a preset instruction can be input to the display device to call up the control menu. The preset instruction can be a trigger signal generated by operating a preset key on the control device 100; when the display is full screen, the direction keys can be response keys of the underlying video source, so the preset key can be a direction key. Of course, the preset key can also be set as another key, such as a "screen display" key. When the control menu is displayed and the user does not need to operate the controls in it, the control menu can be hidden by triggering the preset key or the like. Further, a prompt such as "press the up key to retract the operation buttons" can be displayed on the call window to prompt the user to operate the up direction key.
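The menu behavior described above — a preset key toggles the control menu, and an idle timeout hides it again — can be modeled with a small, deterministic sketch. The class, method names, and timestamp-based API below are illustrative, not the patent's actual implementation.

```java
// Hypothetical model of the control-menu visibility logic: a preset key
// (e.g. a direction key) toggles the menu, and the menu auto-hides when no
// user operation arrives within a preset timeout (e.g. 3000-5000 ms).
public class ControlMenuModel {
    private final long timeoutMs;
    private boolean visible = false;
    private long lastOpMs = 0;

    public ControlMenuModel(long timeoutMs) {
        this.timeoutMs = timeoutMs;
    }

    /** A preset key press toggles the menu and resets the idle timer. */
    public void onPresetKey(long nowMs) {
        visible = !visible;
        lastOpMs = nowMs;
    }

    /** Polled with the current time; hides the menu after the idle timeout. */
    public boolean isVisible(long nowMs) {
        if (visible && nowMs - lastOpMs >= timeoutMs) {
            visible = false; // no operation within the preset time
        }
        return visible;
    }
}
```

Driving the model with explicit timestamps keeps the logic testable; a real implementation would instead post a delayed hide message on the UI thread.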
Referring to fig. 7, a screen capture interaction diagram according to some embodiments of the present application is shown. As shown in fig. 7, the user may interact with the display device to cause the display device to display the screenshot, or to control the display device to capture the screen again; the display device can interact with the server and then display the two-dimensional code on the screenshot.
In some embodiments, the user may trigger the screen capture control by selecting and clicking the screen capture control by operating the control device 100, so as to generate a trigger signal of the screen capture control.
In some embodiments, the trigger signal for the screen capture control may also be other signals, such as a voice capture instruction, which may be an instruction including a voice "screen capture".
After the display device receives the trigger signal of the screenshot control, if the control menu is in a display state, the control menu is hidden; then the screenshot is generated and displayed on the display.
Furthermore, the display device can send the bitmap data of the screenshot to the server, generate a two-dimensional code from the network address returned by the server at which the bitmap data is stored, and display the two-dimensional code on the screenshot, so that the user can access the server by scanning the two-dimensional code and download the screenshot.
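The upload-then-encode flow just described can be sketched as follows. This is a hypothetical outline: the upload step and the QR encoding step are abstracted behind interfaces, since the server API and the QR library (e.g. ZXing) are not specified in the text, and all names are illustrative.

```java
import java.util.function.Function;

public class ScreenshotShareFlow {
    /** Placeholder for a QR encoder such as a ZXing-backed implementation. */
    public interface QrEncoder {
        byte[] encode(String payload);
    }

    /**
     * Uploads the screenshot's bitmap data, receives the network address at
     * which it is stored, and encodes that address as a QR image for the UI
     * to overlay on the screenshot.
     */
    public static byte[] makeQrForScreenshot(byte[] bitmapData,
                                             Function<byte[], String> upload,
                                             QrEncoder qr) {
        String url = upload.apply(bitmapData); // server returns storage address
        return qr.encode(url);                 // QR image encoding that address
    }
}
```

Keeping the two steps behind interfaces means the flow can be exercised with stubs before the real server endpoint and QR library are wired in.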
In some embodiments, the display device may further display a re-screenshot control on the screenshot. After the user clicks the re-screenshot control or issues a voice screenshot instruction, the display device may dismiss the screenshot display and re-display the control menu with the screenshot control selected, so that the user can directly click the screenshot control to capture the screen again.
To further explain the process by which the user acquires the screenshot, the embodiment of the present application further provides an interactive decomposition diagram of fig. 7. Referring to fig. 8, the display device may be provided with a UI manager and a screenshot manager, both of which are function modules of the controller of the display device. The UI manager performs UI drawing, and the drawn image is displayed by the display of the display device. The screenshot manager interacts with the UI manager to provide the screenshot and the two-dimensional code required for UI drawing; it obtains the network address at which the screenshot is stored through interaction with the server and generates the two-dimensional code from that network address.
In some embodiments, after the user triggers the screen capture control by clicking it or by sending a voice screenshot instruction, the UI manager may generate a screenshot acquisition request according to the trigger signal of the screen capture control. The screenshot acquisition request may be used, on the one hand, to hide the displayed control menu and, on the other hand, to start a screenshot acquisition thread.
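The two effects of the trigger signal — hiding the control menu and starting a capture thread — might be handled as sketched below. This is a minimal illustration, not the patent's implementation; all names are hypothetical, and a daemon worker thread stands in for the screenshot acquisition thread.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class UiManagerSketch {
    // Daemon worker so the background thread does not keep the JVM alive.
    private final ExecutorService worker =
            Executors.newSingleThreadExecutor(r -> {
                Thread t = new Thread(r);
                t.setDaemon(true);
                return t;
            });
    boolean menuVisible = true;

    /** Handles the trigger: hide the menu first, then start the capture. */
    public Future<byte[]> onScreenshotTriggered(Callable<byte[]> capture) {
        menuVisible = false;           // hide menu so it is not captured
        return worker.submit(capture); // screenshot-acquisition thread
    }

    /** Blocks until the capture finishes, wrapping checked exceptions. */
    public static byte[] await(Future<byte[]> f) {
        try {
            return f.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Hiding the menu synchronously before submitting the capture task mirrors the ordering the text requires: the menu must already be gone when the screen contents are read.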
After the screenshot acquisition request is generated, the UI manager controls the UI to hide the control menu, so that an image not blocked by the control menu can be captured.
In some embodiments, hiding the control menu may be implemented by removing the layer; in some embodiments, by setting the layer to be fully transparent; and in other embodiments, by setting the layer to be invisible.
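The three hiding strategies above can be sketched without the Android framework. This is a framework-free model whose class and field names are illustrative only; on a real device the three branches would correspond to, for example, removing the layer from the window, View.setAlpha(0f), and View.setVisibility(INVISIBLE).

```java
// Framework-free sketch of the three menu-hiding strategies described above.
// All names are illustrative; they stand in for Android view-layer operations.
class MenuLayer {
    enum HideMode { REMOVE_LAYER, FULLY_TRANSPARENT, INVISIBLE }

    private boolean attached = true; // layer is present in the view tree
    private float alpha = 1.0f;      // 0.0f means fully transparent
    private boolean visible = true;  // equivalent of View.INVISIBLE when false

    void hide(HideMode mode) {
        switch (mode) {
            case REMOVE_LAYER:      attached = false; break; // e.g. remove the layer
            case FULLY_TRANSPARENT: alpha = 0.0f;     break; // e.g. setAlpha(0f)
            case INVISIBLE:         visible = false;  break; // e.g. setVisibility(INVISIBLE)
        }
    }

    /** True if the menu would still appear in a captured frame. */
    boolean appearsInCapture() {
        return attached && visible && alpha > 0.0f;
    }
}
```

Any of the three modes is sufficient for the screenshot path: each one makes the menu absent from the frame that is subsequently captured.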
The UI manager then sends the screenshot acquisition request to the screenshot manager, which generates a screenshot and feeds it back to the UI manager. The screenshot manager can generate the screenshot by acquiring bitmap (BitMap) data.
In some embodiments, the screenshot manager may obtain the bitmap data of all frames currently displayed by the system through a reflection method, for example by a reflective call to the screenshot method of SurfaceControl. A reflective call invokes a class or an object through the reflection mechanism, which is a mechanism of the Java language: in the running state, all attributes and methods of any class can be obtained, and any method or attribute of any object (including private methods and attributes) can be called. This ability to dynamically obtain information and dynamically invoke an object's methods is called the reflection mechanism of the Java language, and a reflective call can be made without a compile-time dependency as long as the class name and method name are known. Since the Android system is developed on the basis of the Java language, a reflective call can be made on a display device running that system. SurfaceControl is a surface control class of the Android framework. When the controller holds the system permission, that is, when it runs with the system uid, the controller may reflectively call the screenshot method of that class to obtain the data currently being rendered by the screen, such as bitmap data, so as to generate a screenshot. Precisely because the reflective call to the screenshot method obtains all the data currently being rendered, the final screenshot would include the control menu if the menu in the layer were not hidden, which is undesirable when the user subsequently views, shares, or otherwise uses the screenshot; therefore the control menu is hidden in the scheme provided by the present application. The process of acquiring the screenshot is as follows:
1. A SurfaceControl instance is created in advance. The bitmap object is made available by creating the SurfaceControl instance. The creation process may include defining reflective acquisition of the SurfaceSession class, defining reflective acquisition of the SurfaceControl class, and defining reflective acquisition of the screenshot method. When reflective acquisition of the SurfaceControl class is defined, the parameters of the bitmap-data acquisition function may be defined, such as an image width of 1280, an image height of 720, and a pixel format of RGB_565; when reflective acquisition of the screenshot method is defined, the method acquired from the SurfaceControl class is defined as the screenshot method.
2. The bitmap object is reflected to the screenshot method in advance. The bitmap object is bound to the screenshot method by the reflection mechanism, and the screenshot method is used to obtain the data currently being rendered by the screen, such as bitmap data.
3. In response to a trigger signal of the screenshot control, the screenshot method is reflectively called through the acquisition function to obtain the bitmap data. The acquisition function is a pre-created function that provides bitmap data to the controller. According to the trigger signal of the screenshot control, the screenshot manager starts the acquisition function, which may be configured to judge whether the system permission can be obtained; if so, the bitmap object is called and reflected to the screenshot method to output the bitmap data.
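The reflective call to the hidden SurfaceControl.screenshot entry point only works on an Android device, so the sketch below demonstrates the same three-step pattern (resolve the class by name, resolve the method with its parameter types, invoke it) against a stand-in class defined in the same file. FakeSurfaceControl and its screenshot method are hypothetical stand-ins, not the real framework API.

```java
import java.lang.reflect.Method;

// Stand-in for android.view.SurfaceControl, which exists only on-device;
// its hidden screenshot entry point is modeled by a plain static method.
class FakeSurfaceControl {
    static int[] screenshot(int width, int height) {
        return new int[width * height]; // pretend per-pixel buffer
    }
}

class ReflectiveCapture {
    // Mirrors the preparation steps described above: resolve the class by
    // name, resolve the screenshot method with its parameter types, then
    // invoke it reflectively to obtain the pixel data.
    static int[] capture(String className, int width, int height) {
        try {
            Class<?> cls = Class.forName(className);                // reflective class lookup
            Method screenshot = cls.getDeclaredMethod("screenshot", // reflective method lookup
                    int.class, int.class);
            screenshot.setAccessible(true);                         // hidden APIs are not public
            return (int[]) screenshot.invoke(null, width, height);  // static call: null receiver
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }
}
```

On a real device the class name would be "android.view.SurfaceControl" and the invocation would succeed only for a caller holding the system permission, as the description notes.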
In some embodiments, the acquisition function cannot obtain the system permission; in this case, the screenshot manager may obtain the bitmap data of all frames currently displayed by the system through MediaProjectionManager. MediaProjectionManager is a screen-recording service uniformly exposed to the application layer under the Android framework, and the current display frame can be captured by applying for this service. The implementation is as follows:
1. Acquire a MediaProjectionManager instance through getSystemService;
2. Call createScreenCaptureIntent of the manager to obtain the intent for applying for screen capture, start an Activity according to the intent, and obtain the returned intent in the onActivityResult callback;
3. Acquire a MediaProjection from the manager instance, thereby obtaining the bitmap data.
It can be seen that the reflection method is preferentially used to obtain the bitmap data when the video call application has the system permission, and the bitmap data is obtained through MediaProjectionManager when it does not.
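The preference order above can be sketched as a try-then-fall-back selection. This is a minimal sketch, not the patent's actual implementation: the two suppliers are placeholders for the reflective path and the MediaProjection path, and the assumption that the reflective path signals a missing privilege with SecurityException is illustrative.

```java
import java.util.function.Supplier;

// Sketch of the fallback described above: try the reflective path first,
// and fall back to the MediaProjection-style path when the system
// permission is unavailable. Both suppliers stand in for the real work.
class BitmapAcquirer {
    static byte[] acquire(Supplier<byte[]> reflectionPath,
                          Supplier<byte[]> mediaProjectionPath) {
        try {
            return reflectionPath.get();           // preferred when system permission is held
        } catch (SecurityException noPermission) { // assumed failure signal without privilege
            return mediaProjectionPath.get();      // screen-recording service fallback
        }
    }
}
```

The fallback keeps screenshot capture working for ordinary (non-system) installs of the video call application, at the cost of the user-consent flow that MediaProjection requires.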
In some embodiments, after obtaining the bitmap data, the screenshot manager may generate a screenshot according to the bitmap data, and send the screenshot to the UI manager.
In some embodiments, after generating the screenshot, the screenshot manager can also store it in a storage location accessible to the UI manager, so that the UI manager can retrieve it.
After acquiring the screenshot, the UI manager generates a screenshot popup window, loads the screenshot into it, and controls the UI to pop up the window, so that the screenshot is displayed and the user can see it on the display device.
In some embodiments, to facilitate saving the screenshot, the controller may save the bitmap data under a preset path of the display device to generate a local file, and then upload the local file to the server as a data stream, for example via an HTTP POST request whose Content-Type is application/octet-stream.
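The save-then-upload step can be sketched as follows. The file name, directory, and server URL are illustrative assumptions; only the request configuration is shown, and the actual network send is omitted.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch of saving bitmap bytes under a preset path and preparing an
// octet-stream POST toward the server. Names and URL are illustrative.
class ScreenshotUploader {
    // Save the raw bitmap bytes under a preset path, producing the local file.
    static Path saveLocal(Path presetDir, byte[] bitmapData) {
        try {
            Files.createDirectories(presetDir);
            Path file = presetDir.resolve("screenshot.png"); // illustrative file name
            return Files.write(file, bitmapData);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Configure (but do not send) the octet-stream POST described above.
    static HttpURLConnection prepareUpload(String serverUrl) {
        try {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(serverUrl).openConnection(); // no I/O yet
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/octet-stream");
            conn.setDoOutput(true); // the request body will carry the file bytes
            return conn;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Writing the file first gives the device a durable copy even if the upload fails and can be retried later.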
In some embodiments, the screenshot manager can perform the two branches in parallel: on one hand, generating the screenshot from the bitmap data and sending it to the UI manager; on the other hand, storing the bitmap data as a local file and uploading it to the server. This improves screenshot generation efficiency.
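The parallel branches can be sketched with a small thread pool. This is a minimal sketch assuming a two-task split; the Runnables are placeholders for the real branch bodies (show the screenshot; store and upload the file).

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of the parallel processing described above: one task feeds the
// screenshot to the UI manager, the other stores and uploads the file.
class ParallelScreenshot {
    static void process(Runnable showScreenshot, Runnable storeAndUpload) {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        try {
            Future<?> show = pool.submit(showScreenshot);
            Future<?> upload = pool.submit(storeAndUpload);
            show.get();   // wait for both branches to finish
            upload.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }
}
```

Running the branches concurrently means the user sees the screenshot popup without waiting for the file write and server upload to complete.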
After receiving the screenshot file, the server allocates a picture identifier to it, feeds the identifier back to the display device, processes the file, and generates an H5 page loaded with the screenshot corresponding to the bitmap data. The picture identifier may include a MediaID, which may be a random character string. After receiving the MediaID, the screenshot manager of the display device generates a picture access request containing the MediaID and sends it to the server; upon receiving the request, the server extracts the MediaID, finds the H5 page corresponding to it, and feeds the network address of the H5 page back to the display device. The screenshot manager can then generate the two-dimensional code according to the network address of the H5 page and send it to the UI manager, or store the two-dimensional code in a location accessible to the UI manager so that the UI manager can retrieve it.
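The MediaID exchange above can be modeled with a toy in-memory server. Everything here is illustrative: the random-string MediaID format and the H5 page URL scheme are assumptions, and the map stands in for the server's real storage and page generation.

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

// Toy model of the server-side exchange: allocate a random MediaID for an
// uploaded screenshot, then resolve it to the network address of the
// generated H5 page. URL scheme and naming are illustrative only.
class PictureServerStub {
    private final Map<String, String> pageByMediaId = new ConcurrentHashMap<>();

    // Upload handler: returns the MediaID fed back to the display device.
    String allocateMediaId(byte[] screenshotFile) {
        String mediaId = UUID.randomUUID().toString().replace("-", ""); // random string ID
        pageByMediaId.put(mediaId, "https://example.com/h5/" + mediaId); // "generated" H5 page
        return mediaId;
    }

    // Picture access request handler: MediaID -> H5 page network address.
    String resolveNetworkAddress(String mediaId) {
        return pageByMediaId.get(mediaId); // null if the MediaID is unknown
    }
}
```

Indirecting through a MediaID rather than returning the page address directly lets the server finish generating the H5 page asynchronously after the upload acknowledgment.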
After the UI manager acquires the two-dimensional code, it superimposes the code above the screenshot in the screenshot popup window. The two-dimensional code is configured to jump to the H5 page in response to a scan signal.
A user can scan the two-dimensional code with a mobile terminal to access the H5 page; long-pressing the screenshot on the H5 page pops up a save option, and triggering that option saves the screenshot to the mobile terminal.
In some embodiments, the UI manager may also display a re-screenshot control. Fig. 9 is a schematic diagram of a screenshot interface according to some embodiments; as shown in fig. 9, the screenshot displayed on the interface may include two conversation windows, a re-screenshot control, and a two-dimensional code. Below the two-dimensional code, a prompt such as "scan the code to obtain the picture" may be displayed to prompt the user to scan the code. In some embodiments, when the user is not satisfied with the screenshot image, the re-screenshot control can be triggered by clicking it or by sending a voice screenshot instruction, generating a trigger signal of the re-screenshot control.
According to the control signal indicating re-capture, the UI manager controls the UI to exit the screenshot popup window and displays the control menu shown in fig. 6, so that the user can trigger the screenshot control again to capture the screen.
Further, when the control menu is displayed, the UI manager may select the screenshot control in the control menu by default. The selection may include setting the focus of the display interface on the screenshot control, that is, highlighting it, so that the user can conveniently trigger it through the control device 100, for example by clicking a "confirm" button on the control device 100.
To further explain the screenshot process, an embodiment of the present application provides a screenshot method in a video call, referring to fig. 10, where the screenshot method includes the following steps:
step S110: and responding to a trigger signal of a screen capture control received in a video call interface, controlling the display to hide a control menu, and acquiring bitmap data.
In some embodiments, the trigger signal of the screenshot control may be a signal that the control is selected and clicked, or a voice screenshot instruction. Within a preset time after the video call application is started, or after the user calls out the control menu, the user may trigger the screenshot control in the control menu, for example by selecting it with the control device 100 and then clicking an "OK" button on the control device 100, or by sending a voice screenshot instruction to the display device.
After the user triggers the screen capture control, the controller of the display device can hide a control menu in the video call interface and acquire bitmap data of the current display interface.
In some embodiments, the controller may obtain the bitmap data of all frames currently displayed by the system through the reflection method; in some embodiments, the controller may also obtain it through MediaProjectionManager.
Step S120: and controlling the display to display a screenshot popup window according to the bitmap data, and displaying a screenshot corresponding to the bitmap data in the screenshot popup window.
After the controller acquires the bitmap data, a screenshot can be generated according to the bitmap data, and the screenshot is displayed on the display. In some embodiments, the controller may pop up a screenshot pop-up on the current system presentation screen, displaying the screenshot within the screenshot pop-up, which may be smaller than a full screen window of the display.
Step S130: and sending the bitmap data to a server, and acquiring a picture identifier which is returned by the server and corresponds to the bitmap data.
In some embodiments, the controller may store the bitmap data in a preset path of the display device, generate a screenshot file, and then transmit the screenshot file to the server.
After receiving the screenshot file, the server allocates a MediaID to it, feeds the MediaID back to the display device, processes the file, and generates an H5 page loaded with the screenshot corresponding to the bitmap data.
Step S140: and acquiring a network address of the server for storing the bitmap data according to the picture identification.
In some embodiments, the controller of the display device may generate a picture access request including the MediaID after receiving the MediaID, send the picture access request to the server, and after receiving the picture access request, the server extracts the MediaID in the picture access request, searches for the H5 page corresponding to the MediaID, and feeds back the network address of the H5 page to the display device.
Step S150: and generating a two-dimensional code according to the network address, and controlling the display to display the two-dimensional code in the screenshot popup window.
In some embodiments, the controller of the display device may generate a two-dimensional code based on the network address of the H5 page, the two-dimensional code being configured to jump to the H5 page. The controller refreshes the two-dimensional code into the screenshot popup window of the display, so that the user can scan the code with a mobile terminal to access the H5 page and save the picture from the H5 page to the mobile terminal.
Furthermore, a re-screenshot control can be displayed in the screenshot popup window. When the user is not satisfied with the screenshot image, the re-screenshot control can be triggered to generate a control signal indicating re-capture; according to that signal, the controller controls the display to exit the screenshot popup window and displays the control menu shown in fig. 6, so that the user can trigger the screenshot control again to capture the screen.
Further, the controller may set the focus of the display interface on a screen capture control in the control menu, that is, highlight the screen capture control, so that the user may trigger the screen capture control through the control device 100, for example, click a "confirm" button on the control device 100 to trigger the screen capture control.
As can be seen from the above embodiments, in the embodiments of the present application, a screenshot control is set on the video call interface. When the user triggers the screenshot control, the control menu is hidden before the screen is captured, and the resulting screenshot is displayed on the display of the display device so that the user can view it. Further, the bitmap data acquired during screen capture is stored as a screenshot file and uploaded to the server; the server generates an H5 page and a MediaID from the screenshot file, the display device acquires the network address of the H5 page according to the MediaID, generates a two-dimensional code from the network address, and displays the code on the screenshot. The user can then obtain the screenshot from the H5 page on a mobile terminal by scanning the two-dimensional code and save it to the mobile device, which facilitates saving and editing the screenshot.
Since the above embodiments are all described by referring to and combining with other embodiments, the same portions are provided between different embodiments, and the same and similar portions between the various embodiments in this specification may be referred to each other. And will not be described in detail herein.
It is noted that, in this specification, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Furthermore, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a circuit structure, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such circuit structure, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a..." does not exclude the presence of additional like elements in the circuit structure, article, or device comprising that element.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims. The above embodiments of the present application do not limit the scope of the present application.

Claims (6)

1. A display device, comprising:
a display;
a camera configured to capture video or pictures;
a controller connected with the display and camera, the controller configured to:
displaying a video call interface, wherein the video call interface comprises an opposite-end video window and a home-end video window;
receiving a call-out instruction input by a remote controller, and calling out a control menu on the video call interface, wherein the control menu is provided with a screen capture control;
controlling the display to hide the control menu in response to receiving a trigger signal of the screen capture control input by the remote controller in the video call interface;
when the video call application has the system permission, in response to a trigger signal of the screenshot control, a screenshot manager calls a screenshot method through acquisition-function reflection to obtain bitmap data currently being rendered by the screen, wherein a SurfaceControl instance is created in advance before the trigger signal of the screenshot control is received, the creation process of the SurfaceControl instance comprising defining reflective acquisition of a SurfaceSession class, defining reflective acquisition of a SurfaceControl class, and defining reflective acquisition of a screenshot method, wherein when reflective acquisition of the SurfaceControl class is defined, an acquisition function, an image width, an image height and a pixel format of the bitmap data are defined, and when reflective acquisition of the screenshot method is defined, the method acquired from the SurfaceControl class is defined as the screenshot method; a bitmap object is reflected to the screenshot method in advance by utilizing a reflection mechanism; the acquisition function is configured to judge whether the system permission can be acquired, and if the system permission can be acquired, the bitmap object is called and reflected to the screenshot method to output the bitmap data;
when the video call application does not have the system permission, acquiring a MediaProjectionManager instance through getSystemService; calling createScreenCaptureIntent of the manager to obtain an intent applying for screen capture, starting an Activity according to the intent, and obtaining the returned intent in the onActivityResult callback; and acquiring a MediaProjection through the manager instance, thereby obtaining the bitmap data;
controlling the display to display a screenshot popup window according to the bitmap data, and displaying a screenshot corresponding to the bitmap data in the screenshot popup window, wherein the screenshot popup window covers the whole video call interface;
sending the bitmap data to a server, acquiring a picture identifier which is returned by the server and corresponds to the bitmap data, and acquiring a network address of the server for storing the bitmap data according to the picture identifier;
generating a two-dimensional code according to the network address, and controlling the display to display the two-dimensional code and a re-screen-capturing control on the upper layer of the screenshot;
receiving a trigger signal input by the remote controller to the re-screen capture control;
responding to a trigger signal of the screen re-capturing control, exiting the screen capturing popup, re-displaying the video call interface and the control menu, and selecting the screen capturing control in the control menu by default;
and receiving a triggering instruction of the default selected screen capturing control input by the remote controller to re-capture the screen.
2. The display device of claim 1, wherein the controlling the display to display the two-dimensional code and the re-screen control in an upper layer of the screenshot comprises:
and superposing the two-dimensional code and the re-screen capture control above the screen capture in the screen capture popup window.
3. The display device according to claim 1, wherein the sending the bitmap data to a server comprises:
storing the bitmap data in a preset path of the display equipment to generate a screenshot file;
and sending the screenshot file to a server.
4. The display device of claim 1, wherein the network address comprises an ip address of an H5 page.
5. The display device of claim 1, wherein the controller is further configured to:
and responding to a trigger signal of a preset key input by a user received in a video call interface, and controlling the display to display a control menu, wherein the control menu comprises a screen capture control.
6. A screenshot method in a video call, comprising:
receiving a call-out instruction input by a remote controller in a video call interface, and calling out a control menu on the video call interface, wherein the video call interface comprises an opposite-end video window and a local-end video window, and the control menu is provided with a screenshot control;
responding to a trigger signal of the screen capture control input by the remote controller received in the video call interface, and controlling a display to hide the control menu;
when the video call application has the system permission, in response to a trigger signal of the screenshot control, a screenshot manager calls a screenshot method through acquisition-function reflection to obtain bitmap data currently being rendered by the screen, wherein a SurfaceControl instance is created in advance before the trigger signal of the screenshot control is received, the creation process of the SurfaceControl instance comprising defining reflective acquisition of a SurfaceSession class, defining reflective acquisition of a SurfaceControl class, and defining reflective acquisition of a screenshot method, wherein when reflective acquisition of the SurfaceControl class is defined, an acquisition function, an image width, an image height and a pixel format of the bitmap data are defined, and when reflective acquisition of the screenshot method is defined, the method acquired from the SurfaceControl class is defined as the screenshot method; a bitmap object is reflected to the screenshot method in advance by utilizing a reflection mechanism; the acquisition function is configured to judge whether the system permission can be acquired, and if the system permission can be acquired, the bitmap object is called and reflected to the screenshot method to output the bitmap data;
when the video call application does not have the system permission, acquiring a MediaProjectionManager instance through getSystemService; calling createScreenCaptureIntent of the manager to obtain an intent applying for screen capture, starting an Activity according to the intent, and obtaining the returned intent in the onActivityResult callback; and acquiring a MediaProjection through the manager instance, thereby obtaining the bitmap data;
controlling the display to display a screenshot popup window according to the bitmap data, and displaying a screenshot corresponding to the bitmap data in the screenshot popup window, wherein the screenshot popup window covers the whole video call interface;
sending the bitmap data to a server, acquiring a picture identifier which is returned by the server and corresponds to the bitmap data, and acquiring a network address of the server for storing the bitmap data according to the picture identifier;
generating a two-dimensional code according to the network address, and controlling the display to display the two-dimensional code and a re-screen-capturing control in an upper layer of the screenshot in the screenshot popup window;
receiving a trigger signal input by the remote controller to the re-screen capturing control;
responding to a trigger signal of the screen re-capturing control, exiting the screen capturing popup, re-displaying the video call interface and the control menu, and selecting the screen capturing control in the control menu by default;
and receiving a triggering instruction of the default selected screen capturing control input by the remote controller to re-capture the screen.
CN202010769342.5A 2020-08-03 2020-08-03 Display device and screenshot method in video call Active CN111787350B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010769342.5A CN111787350B (en) 2020-08-03 2020-08-03 Display device and screenshot method in video call


Publications (2)

Publication Number Publication Date
CN111787350A CN111787350A (en) 2020-10-16
CN111787350B true CN111787350B (en) 2023-01-20

Family

ID=72765760

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010769342.5A Active CN111787350B (en) 2020-08-03 2020-08-03 Display device and screenshot method in video call

Country Status (1)

Country Link
CN (1) CN111787350B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114979042B (en) * 2021-02-23 2023-11-24 北京和缓医疗科技有限公司 Method and system for uploading picture in video call

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104320440A (en) * 2014-09-30 2015-01-28 珠海市君天电子科技有限公司 Picture remote storage method and picture remote storage processing device
CN104407804A (en) * 2014-11-25 2015-03-11 广州酷狗计算机科技有限公司 Screen capturing method and screen capturing device as well as electronic device
CN107896279A (en) * 2017-11-16 2018-04-10 维沃移动通信有限公司 Screenshotss processing method, device and the mobile terminal of a kind of mobile terminal
CN109168069A (en) * 2018-09-03 2019-01-08 聚好看科技股份有限公司 A kind of recognition result subregion display methods, device and smart television

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10237621B2 (en) * 2016-03-24 2019-03-19 Dish Technologies Llc Direct capture and sharing of screenshots from video programming
CN109413490A (en) * 2018-11-08 2019-03-01 四川长虹电器股份有限公司 Method based on mobile terminal interception television image




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant