CN112269554B - Display system and display method - Google Patents

Display system and display method

Info

Publication number
CN112269554B
CN112269554B (application CN202011218309.XA)
Authority
CN
China
Prior art keywords
information
laser projection
mobile terminal
display
projection equipment
Prior art date
Legal status
Active
Application number
CN202011218309.XA
Other languages
Chinese (zh)
Other versions
CN112269554A (en)
Inventor
李泽广
郭大勃
肖纪臣
吴超
王学磊
邱若强
穆聪聪
Current Assignee
Qingdao Hisense Laser Display Co Ltd
Original Assignee
Qingdao Hisense Laser Display Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Hisense Laser Display Co Ltd filed Critical Qingdao Hisense Laser Display Co Ltd
Publication of CN112269554A publication Critical patent/CN112269554A/en
Application granted granted Critical
Publication of CN112269554B publication Critical patent/CN112269554B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements

Abstract

Embodiments of the present application provide a display system and a display method. The display system comprises a laser projection device and a mobile terminal that are communicatively connected. The laser projection device is configured to: determine first information and second information according to acquired multimedia information; display the first information; and send the second information to the mobile terminal. The first information is the information, in the multimedia information, to be displayed by the laser projection device; the second information is the information, in the multimedia information, to be displayed by the mobile terminal. The mobile terminal is configured to receive and display the second information. By showing on the mobile terminal the other information that would otherwise be displayed on the laser projection device, the display system lets the user view the picture displayed by the laser projection device without occlusion while viewing that other information on the mobile terminal, which greatly improves the user experience.

Description

Display system and display method
This application claims priority to Chinese patent application No. 201911067968.5, filed on November 4, 2019 and entitled "Display device, display method and computing device"; Chinese patent application No. 201911067355.1, filed on November 4, 2019 and entitled "Display device and display method of a menu interface"; and Chinese patent application No. 201911067372.5, filed on November 4, 2019 and entitled "Display device"; the entire contents of which are incorporated herein by reference.
Technical Field
The application relates to the technical field of smart televisions, in particular to a display system and a display method.
Background
With the continuous development of the smart television field, laser projection devices can play and display audio, video, pictures, and other content for users.
While a user plays a video on a laser projection device, there is usually some information, other than the video being played, that needs to be displayed on the display screen of the laser projection device, for example a menu interface, new-device access information, system push messages, image search information, or a video incoming-call reminder. When such information is displayed over the playback interface of the laser projection device, it blocks the video the user is watching, interferes with the user's viewing, and degrades the viewing experience.
Disclosure of Invention
The present application provides a display system and a display method, which are used to improve the viewing experience when watching television. The technical solutions are as follows:
in a first aspect, the present application provides a display system comprising:
the system comprises a laser projection device and a mobile terminal; the laser projection equipment is in communication connection with the mobile terminal.
The laser projection device configured to: determining first information and second information according to the acquired multimedia information; displaying the first information and sending the second information to the mobile terminal; the first information is information used for displaying by the laser projection equipment in the multimedia information; the second information is information used for displaying by the mobile terminal in the multimedia information.
The mobile terminal is configured to: and receiving and displaying the second information.
Further, the first information includes at least one of: video stream information and television home page information.
Further, the second information includes at least one of: information related to the laser projection device, information related to the first information, information received by the laser projection device, play reminder information of a reserved program, split-screen play information of a reserved program, an image search recognition result, and association information of a target object in the image search recognition result.
Further, the second information comprises an image search recognition result, and the image search recognition result is the recognition result of the picture currently displayed by the laser projection device, obtained after the laser projection device receives an image search instruction.
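Purely as an illustrative sketch of this split (the class names, method names, and message format below are hypothetical and not taken from the patent), the laser projection device can route each item of acquired multimedia information either to its own display (first information) or to the mobile terminal (second information):

```python
# Hypothetical sketch of the first/second information split; all names are illustrative.
from dataclasses import dataclass

# Kinds treated as "first information" (displayed by the laser projection device itself).
FIRST_INFO_KINDS = {"video_stream", "tv_home_page"}
# Anything else (menus, push messages, play reminders, image-search results, ...)
# is treated as "second information" and forwarded to the mobile terminal.

@dataclass
class MediaItem:
    kind: str      # e.g. "video_stream", "menu", "push_message", "image_search_result"
    payload: dict  # the content to render or forward


class LaserProjectionDevice:
    def __init__(self, display, terminal_link):
        self.display = display              # drives the projection picture
        self.terminal_link = terminal_link  # communication link to the mobile terminal

    def handle(self, item: MediaItem) -> None:
        if item.kind in FIRST_INFO_KINDS:
            self.display.show(item)                # first information: show locally
        else:
            self.terminal_link.send("show", item)  # second information: forward
```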
Further, the mobile terminal is further configured to: determine a first operation, wherein the first operation is used to select a target object in the recognition result; determine association information of the target object in response to the first operation; and display the association information of the target object.
Further, the mobile terminal is further configured to: generate a second instruction when the laser projection device is displaying the first information and the mobile terminal is displaying the second information, the second instruction being used to instruct the laser projection device to send the first information to the mobile terminal and to display the second information; send the second instruction to the laser projection device; and receive and display the first information.
Further, the laser projection device is further configured to: receive the second instruction from the mobile terminal; and, in response to the second instruction, display the second information and send the first information to the mobile terminal.
Further, the mobile terminal is further configured to: generate a third instruction, the third instruction being used to instruct the laser projection device to determine the second information and to send the second information to the mobile terminal; and send the third instruction to the laser projection device.
The laser projection device is further configured to: receive the third instruction; and determine the second information in response to the third instruction.
In a second aspect, the present application provides a display method, comprising:
determining first information and second information according to acquired multimedia information, wherein the first information is the information, in the multimedia information, to be displayed by the laser projection device, and the second information is the information, in the multimedia information, to be displayed by the mobile terminal;
displaying the first information; and
sending the second information to the mobile terminal.
Further, the method further comprises: receiving a second instruction from the mobile terminal, the second instruction being used to instruct the laser projection device to send the first information to the mobile terminal and to display the second information; and, in response to the second instruction, displaying the second information and sending the first information to the mobile terminal.
In a third aspect, the present application provides a display method, comprising:
receiving and displaying second information from the laser projection device, wherein the second information comprises an image search recognition result, and the image search recognition result is the recognition result of the picture currently displayed by the laser projection device, obtained after the laser projection device receives an image search instruction;
determining a first operation, wherein the first operation is used to select a target object in the recognition result; and
determining association information of the target object in response to the first operation, and displaying the association information of the target object.
Further, the method further comprises: generating a second instruction when the laser projection device is displaying the first information and the mobile terminal is displaying the second information, the second instruction being used to instruct the laser projection device to send the first information to the mobile terminal and to display the second information; sending the second instruction to the laser projection device; and receiving and displaying the first information from the laser projection device.
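On the mobile terminal side, the flow of this third aspect could be sketched roughly as follows; the lookup service and all identifiers are assumptions introduced only for illustration:

```python
# Hypothetical mobile-terminal handling of an image-search recognition result.
class MobileTerminal:
    def __init__(self, screen, device_link, lookup_service):
        self.screen = screen                  # the terminal's own display
        self.device_link = device_link        # link back to the laser projection device
        self.lookup_service = lookup_service  # resolves a target object to its details

    def on_second_info(self, recognition_result: list) -> None:
        # The second information here is the image-search recognition result:
        # objects recognized in the picture currently projected by the device.
        self.screen.show(recognition_result)

    def on_user_select(self, target_object: dict) -> None:
        # "First operation": the user selects one target object in the result.
        association_info = self.lookup_service.query(target_object["id"])
        self.screen.show(association_info)    # display the association information

    def request_swap(self) -> None:
        # Optional "second instruction": ask the device to swap what each screen shows.
        self.device_link.send("second_instruction")
```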
In a fourth aspect, a computing device is provided, comprising:
a memory for storing program instructions.
And the processor is used for calling the program instructions stored in the memory and executing the method of the second aspect according to the obtained program.
In a fifth aspect, a computing device is provided, comprising:
a memory for storing program instructions.
And the processor is used for calling the program instructions stored in the memory and executing the method of the third aspect according to the obtained program.
Based on the above technical solutions, the display system, display method, and computing device provided by the present application can display information such as a menu interface, streaming media information, and system push messages on the mobile terminal while the user is watching a video program on the laser projection device. This information therefore does not cover the picture displayed by the laser projection device, so the user can watch that picture without occlusion while viewing the information on the mobile terminal, which greatly improves the user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic diagram of a display system provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a laser projection apparatus according to an embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of another laser projection apparatus provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 5 is a schematic view of an application scenario in which a display system interacts with a control device and a server according to an embodiment of the present application;
fig. 6 is a block diagram of a configuration of a control device according to an embodiment of the present application;
fig. 7 is a schematic application layer diagram of a laser projection apparatus according to an embodiment of the present disclosure;
fig. 8 is a schematic flowchart of a display method according to an embodiment of the present disclosure;
fig. 9-15 illustrate user interface interaction with a user in a display system according to an exemplary embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following descriptions of the concepts are only for the purpose of facilitating understanding of the contents of the present application, and do not represent limitations on the scope of the present application.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in various embodiments of the present application refers to a component of an electronic device, such as a laser projection device, or a mobile terminal as disclosed herein, that is capable of wirelessly controlling the electronic device, typically over a relatively short distance. The component may be generally connected to an electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as wireless fidelity (WIFI), wireless, universal Serial Bus (USB), bluetooth, and motion sensor. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
As shown in fig. 1, a display system 10 provided in the embodiment of the present application includes: a laser projection device 100, and a mobile terminal 200.
The laser projection device 100 and the mobile terminal 200 are communicatively connected.
The laser projection device 100 is configured to analyze and process multimedia information, distinguish between different types of multimedia information and distribute them, and determine whether each type of multimedia information is to be displayed by the laser projection device 100 (first information) or by the mobile terminal 200 (second information). After splitting the multimedia information, the laser projection device 100 displays the first information and sends the second information to the mobile terminal 200.
The laser projection device 100 comprises a first controller 101 and a first display 102. The first controller 101 is configured to implement the above-mentioned functions of analyzing and processing multimedia information, determining and offloading different multimedia information, determining first information and second information, and sending the second information to the mobile terminal 200. The first controller 101 is further configured to drive and control the first display 102 to display the first information. The first display 102 is used to display first information in response to control driving of the first controller 101.
The mobile terminal 200 includes a second controller 201 and a second display 202. The second controller 201 is configured to receive second information sent by the laser projection apparatus 100, and drive and control the second display 202 to display the second information. The second display 202 is used to display second information in response to control driving of the second controller 201.
The first display 102 and the second display 202 may be used to display different display screens. For example, the first display 102 may be used to display a picture of a television program and the second display 202 may be used to display a picture of a notification-like message, a voice assistant, etc.
Alternatively, the content displayed by the first display 102 and the content displayed by the second display 202 may be independent of each other and not affected by each other. For example, while the first display 102 is playing a television program, the second display 202 may display information such as time, weather, temperature, reminder messages, etc. that are not related to the television program.
Optionally, there may also be an association between the content displayed by the first display 102 and the content displayed by the second display 202. For example, when the first display 102 plays the main screen of a video chat, the second display 202 may display information such as the head portrait, the chat duration, and the like of the user currently accessing the video chat.
Optionally, some or all of the content displayed by the second display 202 may be adjusted to be displayed by the first display 102. For example, the time, weather, temperature, reminder messages, and other content displayed by the second display 202 may be adjusted to be displayed by the first display 102, while other information continues to be displayed by the second display 202.
In addition, the first display 102 can display a multi-party interactive picture while displaying a traditional television program picture, and the multi-party interactive picture does not block the traditional television program picture. The present application does not limit the display mode of the traditional television program picture and the multi-party interactive picture; for example, their positions and sizes can be set according to the priority of the traditional television program picture and that of the multi-party interactive picture.
Taking the case where the priority of the traditional television program picture is higher than that of the multi-party interactive picture as an example, the area of the traditional television program picture is larger than that of the multi-party interactive picture, and the multi-party interactive picture can be positioned at one side of the traditional television program picture or suspended at one corner of the traditional television program picture.
As shown in fig. 1, a camera may be connected or disposed on the first display 102, and is used to present a picture taken by the camera on a display interface of a laser projection device, a mobile terminal, or other display devices, so as to implement an interactive chat between users. Specifically, the picture shot by the camera may be displayed on the laser projection device in a full screen, a half screen, or any selectable area.
As an optional connection mode, the camera is connected with the rear shell of the laser projection device through the connecting plate, and is fixedly installed in the middle of the upper side of the rear shell of the laser projection device.
As another optional connection mode, the camera is connected to the rear housing of the laser projection device through a connection board or another conceivable connector. The connector is provided with a lifting motor: when the user wants to use the camera, or an application program needs to use it, the camera rises out of the laser projection device; when the camera is not needed, it can be retracted into the rear housing, which protects the camera from damage and protects the user's privacy.
As an embodiment, the camera used in the present application may have 16 million pixels, so as to achieve ultra-high-definition display. In actual use, cameras with more or fewer than 16 million pixels may also be used.
After the camera is installed on the laser projection device, the content displayed by the laser projection device in different application scenarios can be fused in various ways, so as to achieve functions that traditional laser projection devices cannot realize.
Illustratively, a user may conduct a video chat with at least one other user while watching a video program. The presentation of the video program may be as a background frame with a window of video chat displayed over the background frame. The function is called 'chat while watching'.
Optionally, in a scene of "chat while watching", at least one video chat is performed across terminals while watching a live video or a network video.
In another example, a user can conduct a video chat with at least one other user while entering the educational application for learning. For example, a student may interact remotely with a teacher while learning content in an educational application. Vividly, this function can be called "chatting while learning".
In another example, a user conducts a video chat with a player entering a card game while playing the game. For example, a player may enable remote interaction with other players when entering a gaming application to participate in a game. Figuratively, this function may be referred to as "watch while playing".
Optionally, the game scene is fused with the video picture: the portrait in the video picture is matted out and displayed in the game picture, which improves the user experience.
Optionally, in motion-sensing games (such as ball games, boxing, running, and dancing), human posture and motion, limb detection and tracking, and human skeleton key-point data are obtained through the camera and then fused with the animation in the game, so that games in scenarios such as sports and dancing can be realized.
In another example, a user may interact with at least one other user in a karaoke application in video and voice. Vividly, this function can be called "sing while watching". Optionally, when at least one user enters the application in a chat scenario, multiple users may jointly complete recording of a song.
In another example, a user may turn on the camera locally to take pictures and videos. Figuratively, this function may be referred to as "looking in the mirror".
In other examples, more or less functionality may be added. The function of the laser projection device is not particularly limited in this application.
It should be noted that fig. 1 only illustrates the case where the camera is disposed on the housing of the first display, and in a specific implementation, the position where the camera is disposed may be determined according to actual requirements. For example, the first controller housing, the first display housing, the second controller housing, the second display housing, or a separate housing, which is not limited in this application.
As shown in fig. 2, a hardware structure diagram of a laser projection apparatus provided in an embodiment of the present application includes:
the system comprises a first multimedia unit 201, a first projection display control unit 202, a first control unit 203, an eye protection plate control unit 204, a first imaging projection display unit 205, a power supply module 206, a light source driving unit 207 and an audio processing unit 208.
The functions of the unit modules are described in detail below:
the first multimedia unit 201 is configured to receive an external input signal, and send the corresponding input signal to the first projection display unit 202 and the first control unit 203, respectively. The first multimedia unit 201 may specifically include at least one of the following: the system comprises a WIFI module, a wireless input module, an Ethernet input module, a USB input module, a High Definition Multimedia Interface (HDMI) input module, a touch key input module, a light sensing module and a far-field voice module, wherein the first Multimedia unit 201 can receive an external input signal according to at least one module.
In addition, the first multimedia control unit 201 is further configured to perform an integrated circuit bus (I2C) communication with the first control unit 203; and supplies the VB1 video signal to the first projection display control unit 202.
The first projection display control unit 202 is configured to send a Low-Voltage Differential Signaling (LVDS) or High-Speed Serial Interface (HSSI) video signal to the first imaging projection display unit 205, together with control signals for driving and controlling the first imaging projection display unit 205 to display a video picture and for controlling the operating state of the first imaging projection display unit 205. The first projection display control unit 202 is further configured to provide a Pulse Width Modulation (PWM) signal and a Duty signal to the light source driving unit 211, and to perform I2C communication with the first control unit 203.
The first control unit 203 is used for controlling the working state of the heat dissipation device of the laser projection device, monitoring the ambient temperature and the laser temperature, controlling the rotating speed of the speckle-eliminating wheel, and controlling the power-on and power-off of the light source driving unit 211. The first control unit 203 is further configured to perform I2C communication or serial communication with the eye protection plate.
And the eye protection plate control unit 204 is used for controlling the working mode of the eye protection plate of the laser projection device.
The first imaging projection display unit 205 includes a lens portion of the laser projection device, or an optical system composed of a light valve, an illumination lens, and a projection lens of the laser projection device. The first imaging projection display unit 205 is configured to receive the LVDS or HSSI video signal or the like from the first projection display control unit 202 and project the video signal onto the first display.
And a power supply module 206 for supplying power to the display system. For example, the power module 206 may provide 12V power for the first control unit; providing a 12V power supply for the first projection display control unit; an 18V power supply is provided for the audio processing unit and the first multimedia unit.
The light source driving unit 207 is used to provide an energy source for the laser.
The lasers include a blue laser 212, a green laser 213, and a red laser 214, which provide laser light of the three primary colors and serve as the laser light sources of the laser projection device.
Fig. 3 shows another schematic structural diagram of a laser projection apparatus provided in an embodiment of the present application, and as shown in fig. 3, the laser projection apparatus includes: a Television (TV) board 310, a display panel 320, a light source 330, and a light source driving circuit 340.
Hereinafter, each device related to fig. 3 will be described in detail:
the TV board 310 is mainly used to receive and decode external audio and video signals. The TV board module 310 is provided with a System on Chip (SoC) capable of decoding data of different data formats into a normalized format and transmitting the data of the normalized format to the display panel 320 through, for example, a connector (connector).
The display panel 320 may be provided with a Field Programmable Gate Array (FPGA) 321, a control processing module 322, and a light modulation device 323.
The FPGA 321 is used for processing an input video image signal, such as performing Motion Estimation and Motion Compensation (MEMC) frequency multiplication processing, or implementing an image enhancement function such as image correction.
The control processing module 322 is connected to the algorithm processing module FPGA and is configured to receive the processed video image signal data as the image data to be projected. The control processing module 322 outputs a current PWM brightness adjustment signal and an enable control signal according to the image data to be projected, and implements timing and lighting control of the light source 330 through the light source driving circuit 340.
The light modulation device 323 can receive the video image signal output by the TV board 310 and parse it to obtain the regional luminance signal and the image components of the video image. Alternatively, the light modulation device 323 may receive the image signal to be projected output by the FPGA 321, which may include an image brightness signal and the parsed image components.
The light source 330 may include a red light source, a blue light source, and a green light source, and the light sources of the three colors may emit light simultaneously or in time sequence. The light source 330 is driven to light up according to the image display timing indicated by the control instructions of the control processing module 322.
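As a rough, hypothetical illustration of the dimming and sequencing role of the control processing module 322 (the scaling rule and the driver interface below are assumptions, not taken from the patent), the brightness of the frame to be projected can be mapped to a PWM duty cycle and the three colour light sources enabled in time sequence:

```python
# Hypothetical sketch: derive a PWM dimming level and enable timing for the
# light source 330 from the frame to be projected.
def frame_to_pwm_duty(frame_luma_mean: float, max_luma: float = 255.0) -> float:
    """Map the mean luminance of the frame (0..max_luma) to a PWM duty cycle (0..1)."""
    return max(0.0, min(1.0, frame_luma_mean / max_luma))

def drive_light_source(driver, frame_luma_mean: float,
                       field_order=("red", "green", "blue")) -> None:
    duty = frame_to_pwm_duty(frame_luma_mean)
    for colour in field_order:
        driver.set_pwm(colour, duty)   # PWM brightness adjustment signal (hypothetical API)
        driver.enable(colour)          # enable control signal, lit in time sequence
    # In a real system the enable timing is locked to the image display timing
    # indicated by the control processing module.
```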
The mobile terminal described in the embodiments of the present application may be any mobile terminal that can communicate with other devices and display a screen, such as a mobile phone, a wearable device (e.g., a smart watch), a tablet computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or an Augmented Reality (AR)/Virtual Reality (VR) device. The specific form of the mobile terminal 200 is not particularly limited in the embodiments of the present application.
Fig. 4 is a schematic structural diagram of a mobile terminal 200 according to an embodiment of the present disclosure.
The mobile terminal 200 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the mobile terminal 200. In other embodiments of the present application, mobile terminal 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the mobile terminal 200. The charging management module 140 may also supply power to the mobile terminal through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in mobile terminal 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the mobile terminal 200. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the mobile terminal 200, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the mobile terminal 200 can communicate with networks and other devices via wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The mobile terminal 200 implements a display function through the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile terminal 200 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The mobile terminal 200 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, mobile terminal 200 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the mobile terminal 200 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The mobile terminal 200 may support one or more video codecs. In this way, the mobile terminal 200 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU may implement applications such as intelligent recognition of the mobile terminal 200, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 200. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 121 may be used to store computer-executable program instructions, including instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like. The processor 110 executes various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The mobile terminal 200 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the mobile terminal 200 receives a call or voice information, it can receive voice by bringing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The mobile terminal 200 may be provided with at least one microphone 170C. In other embodiments, the mobile terminal 200 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, three, four or more microphones 170C may be further disposed on the mobile terminal 200 to collect voice signals, reduce noise, identify voice sources, and implement directional recording functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The mobile terminal 200 may receive a key input, and generate a key signal input related to user setting and function control of the mobile terminal 200.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card.
Fig. 5 is a schematic diagram of an application scenario in which the display system 10 of the embodiment of the present application interacts with the control device 400 and the server 500.
The control device 400 may be a remote controller 400A, which can communicate with the laser projection device 100 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods, and is used to control the laser projection device 100 wirelessly or through other wired methods. The user may input user instructions via keys on the remote controller 400A, voice input, control panel input, etc., to control the laser projection device 100. For example, the user may input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller 400A to control the functions of the laser projection device 100.
The control device 400 may also be an intelligent device, such as the mobile terminal 200, which may communicate with the laser projection device 100 through a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), or another network, and may control the laser projection device 100 through an application program corresponding to the laser projection device 100. For example, the laser projection device 100 is controlled using an application running on the smart device. The application may provide the user with various controls through an intuitive user interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 200 and the laser projection apparatus 100 may each be installed with a software application, so that connection communication between the two may be realized through a network communication protocol, and thus, the purpose of one-to-one control operation and data communication may be realized. Such as: a control instruction protocol can be established between the mobile terminal 200 and the laser projection device 100, a remote control keyboard is synchronized to the mobile terminal 200, and the function of controlling the laser projection device 100 is realized by controlling a user interface on the mobile terminal 200; the audio and video contents displayed on the mobile terminal 200 may also be transmitted to the laser projection device 100, so as to implement a synchronous display function.
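For illustration only, such a one-to-one control channel between the application on the mobile terminal 200 and the laser projection device 100 could be as simple as JSON commands over a socket; the command names, port, and reply format below are assumptions, not part of the patent:

```python
# Hypothetical sketch of a network control-instruction protocol between the
# mobile terminal application and the laser projection device.
import json
import socket

def send_control_command(device_ip: str, key: str, port: int = 5555) -> dict:
    """Send one remote-control style command (e.g. 'volume_up') and read the reply."""
    command = {"type": "key_event", "key": key}
    with socket.create_connection((device_ip, port), timeout=3.0) as sock:
        sock.sendall((json.dumps(command) + "\n").encode("utf-8"))
        reply = sock.makefile().readline()
    return json.loads(reply)

# Example: a virtual key on the remote-control keyboard synchronized to the mobile terminal.
# send_control_command("192.168.1.50", "volume_up")
```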
The server 500 may be a video server, an Electronic Program Guide (EPG) server, a cloud server, or the like.
The laser projection device 100 may be in data communication with the server 500 through a variety of communication means. In various embodiments of the present application, the laser projection device 100 may be allowed to be in wired or wireless communication with the server 500 via a local area network, a wireless local area network, or other network. Server 500 may provide various content and interactions to laser projection device 100.
Illustratively, by sending and receiving information and interacting with the EPG, the laser projection device 100 can receive software program updates or access a remotely stored digital media library. The servers 500 may be one group or multiple groups of servers, and may be one or more types of servers. Other web service content, such as video on demand and advertisement services, is provided through the server 500.
A block diagram of the configuration of the control device 400 according to an exemplary embodiment is exemplarily shown in fig. 6. As shown in fig. 6, the control device 400 includes a controller 410, a communicator 430, a user input/output interface 440, a memory 490, and a power supply 480.
The control device 400 is configured to control the laser projection device 100: it receives input operation instructions from the user and converts the operation instructions into instructions that the laser projection device 100 can recognize and respond to, mediating the interaction between the user and the laser projection device 100. For example, when the user operates the channel up/down keys on the control device 400, the laser projection device 100 responds to the channel up/down operation.
In some embodiments, the control apparatus 400 may be a smart device. Such as: the control device 400 may be installed to control various applications of the laser projection apparatus 100 according to user requirements.
In some embodiments, the mobile terminal 200 or another intelligent electronic device may perform a function similar to that of the control apparatus 400 after an application for operating the laser projection device 100 is installed on it. For example, by installing such an application, the user can use the various function keys or virtual buttons of the graphical user interface provided on the mobile terminal 200 or the other intelligent electronic device to implement the functions of the physical keys of the control apparatus 400.
The controller 410 includes a processor 412, a RAM 413, a ROM 414, a communication interface, and a communication bus. The controller 410 is used to control the operation of the control device 400, the communication and coordination among its internal components, and the external and internal data processing functions.
The communicator 430 enables communication of control signals and data signals with the laser projection apparatus 100 under the control of the controller 410. Such as: the received user input signal is transmitted to the laser projection device 100. The communicator 430 may include at least one of a WIFI module 431, a bluetooth module 432, a Near Field Communication (NFC) module 433, and the like.
A user input/output interface 440, wherein the input interface includes at least one of a microphone 441, a touch pad 442, a sensor 443, keys 444, a camera 445, and the like. Such as: the user may implement a user instruction input function through voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the laser projection apparatus 100.
The output interface includes an interface that transmits the received user instruction to the laser projection apparatus 100. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, when an infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the laser projection device 100 through the infrared sending module. As another example, when a radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the laser projection apparatus 100 through the radio frequency transmitting terminal.
In some embodiments, the control device 400 includes at least one of a communicator 430 and an output interface. When the control device 400 is configured with a communicator 430, for example, WiFi, Bluetooth, or NFC modules, the user input instruction may be encoded and sent to the laser projection device 100 through a WiFi protocol, a Bluetooth protocol, or an NFC protocol.
A memory 490 for storing various operation programs, data and applications for driving and controlling the control apparatus 400 under the control of the controller 410. The memory 490 may store various control signal commands input by a user.
And a power supply 480 for providing operation power support for each electrical component of the control device 400 under the control of the controller 410. The power supply 480 may be implemented by a battery and related control circuitry.
As shown in fig. 7, the application layer of the laser projection device includes various applications that may be executed at the laser projection device 100.
The application layer 1912 of the laser-projection device 100 may include, but is not limited to, one or more applications such as: live television applications, video-on-demand applications, media center applications, application centers, gaming applications, and the like.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on laser projection device 100.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides video display from a storage source. For example, the video on demand may come from cloud storage on the server side, or from a local hard disk storage containing stored video programs.
The media center application can provide various applications for playing multimedia content. For example, a media center may provide services other than live television or video on demand, so that a user can access various images or audio through the media center application.
The application center can provide and store various applications. An application may be a game, an application program, or some other application related to a computer system or another device that can be run on the display device. The application center may obtain these applications from different sources, store them in local storage, and then run them on the laser projection device 100.
As shown in fig. 8, a display method provided in an embodiment of the present application includes:
S801, the laser projection equipment acquires multimedia information.
In a possible implementation manner, the multimedia information may include at least one of the following: video streaming information, television homepage information, a menu interface, picture search information, system information of the display system, new device access reminding information, push information of the display system, bullet screen information of the video streaming information, video call incoming call reminding information, video call information, weather information, time information, and advertisement push information.
Specifically, the multimedia information may include at least one of: video information sent by the EPG server to the laser projection device according to the playing information selected by the user; system information, determined by the laser projection device, that the display system needs to push to the user; weather information, time information, and advertisement push information acquired by the laser projection device; push information sent by the network side to the laser projection device; new device access reminding information obtained after a new device accesses the display system; a menu interface determined according to the user's operation; bullet screen information of the video currently being played; and video call incoming call reminding information and video call information when the display system has a video call function.
S802, the laser projection equipment determines first information and second information according to the multimedia information.
The first information is information used for displaying by the laser projection equipment in the multimedia information; the second information is information used for displaying by the mobile terminal in the multimedia information;
It should be noted that the laser projection device is the main playback device and is used for displaying the video playback information selected by the user, or content such as the main menu of the system, for example, a live video program or a video-on-demand program selected by the user. Accordingly, the laser projection device determines the information to be played by the laser projection device itself as the first information.
The mobile terminal is an auxiliary playback device and is used for displaying information that does not need to be played on the laser projection device, for example: a menu interface, system information of the display system, new device access reminding information, push information of the display system, bullet screen information of the video stream information, video call incoming call reminding information, video call information, weather information, time information, and advertisement push information. Accordingly, the laser projection device determines the information to be displayed by the mobile terminal as the second information.
In one possible implementation manner, mapping relationships between various types of information and the first information or the second information are preset in the laser projection device.
After the laser projection equipment acquires the multimedia information, determining whether the multimedia information is the first information or the second information according to the information type of each information in the multimedia information and the mapping relation.
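A minimal sketch of such a mapping and the resulting split is shown below; the type names and data structure are illustrative assumptions, not the exact mapping used by the laser projection device.

```python
# Assumed mapping from information type to display target, mirroring the
# preset mapping relationship described above.
FIRST_INFO = "laser_projection"   # information displayed by the laser projection device
SECOND_INFO = "mobile_terminal"   # information displayed by the mobile terminal

TYPE_TO_TARGET = {
    "video_stream": FIRST_INFO,
    "tv_homepage": FIRST_INFO,
    "menu_interface": SECOND_INFO,
    "system_message": SECOND_INFO,
    "new_device_access": SECOND_INFO,
    "system_push": SECOND_INFO,
    "bullet_screen": SECOND_INFO,
    "video_call_incoming": SECOND_INFO,
    "video_call": SECOND_INFO,
    "weather": SECOND_INFO,
    "time": SECOND_INFO,
    "advertisement_push": SECOND_INFO,
}


def split_multimedia_info(items):
    """Split acquired multimedia items into first information and second information."""
    first, second = [], []
    for item in items:
        target = TYPE_TO_TARGET.get(item["type"], FIRST_INFO)
        (first if target == FIRST_INFO else second).append(item)
    return first, second


# Example: the video stream stays on the projector while a weather push
# is routed to the mobile terminal as second information.
first, second = split_multimedia_info([
    {"type": "video_stream", "data": "movie.ts"},
    {"type": "weather", "data": "Qingdao, 18°C, cloudy"},
])
```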
And S803, the laser projection equipment displays the first information.
And S804, the laser projection equipment sends the second information to the mobile terminal. Correspondingly, the mobile terminal receives the second information from the laser projection device.
And S805, the mobile terminal displays the second information.
Based on the above technical solution, the display system, display method, and computing device provided by the present application can display information such as a menu interface, streaming media information, and system push messages on the mobile terminal while the user watches a video program on the laser projection device. Therefore, such information does not cover the picture displayed by the laser projection device, so that the user can watch the picture displayed by the laser projection device without occlusion while viewing the information on the mobile terminal, which greatly improves the user experience.
It should be noted that, when the laser projection device is displaying the playing information, depending on the second information, the display method provided in the embodiment of the present application may specifically include the following scenes: scene 1, the second information is network side information; scene 2, the second information is new device access information; scene 3, the second information is system push information; scene 4, the second information is video call incoming call reminding information; scene 5, the second information is video call information; scene 6, the second information is picture search information; and scene 7, the second information is reserved program playing reminding information.
The above-described scenarios are explained in detail below:
Scene 1, the second information is network side information.
In scene 1, the multimedia information received by the laser projection device includes: video information from a video playing server, and network side information sent by a network side.
In an example, the network-side information may include: weather information, time information, text information, news information, advertisement push information, and the like.
In the following, the network side information is taken as time information and weather information as an example for explanation:
as shown in fig. 9, when the first information is video information and the second information is weather information pushed by the network side, the display interface of the laser projection device and the display interface of the mobile terminal are shown.
Specifically, when the laser projection device is playing a video, the laser projection device receives the weather information pushed by the network side. At this time, the multimedia information determined by the laser projection device includes both video information and weather information. The laser projection equipment determines that video information in the multimedia information is first information and weather information is second information.
The laser projection device displays the video information.
The laser projection equipment sends weather information to the mobile terminal, and the mobile terminal displays the weather information after receiving the weather information.
Based on this scene, compared with the prior-art solution in which the network side information must be overlaid on the picture of the laser projection device when it needs to be displayed, the display system and display method provided in the embodiment of the present application display the network side information on the mobile terminal, so that the network side information does not block the playing picture of the laser projection device, which improves the viewing experience of the user.
Scene 2, the second information is new device access information.
In scenario 2, the multimedia information received by the laser projection device includes: video information from a video playing server, and new device access information detected by the laser projection device.
The new device may access the display system in any one of the following ways: through a USB interface, through an HDMI interface, through a WiFi module, through a wireless network, through Ethernet, and the like, which is not limited in this application.
In the following, the new device is connected to the display system through the HDMI interface as an example:
as shown in fig. 10, when the first information is video information, and the second information is new device access information, the display interface of the laser projection device and the display interface of the mobile terminal are shown.
In an example, the new device access information may include: the type of the new device, the type of the access interface, and the access time.
Specifically, the laser projection device transmits main play information of the laser projection device, such as video stream information, system homepage information, and the like, to the first controller of the laser projection device according to the current state of the laser projection device. The first controller processes the information and then controls the first display to display the information in real time.
And in the process of playing the information by the laser projection equipment, the laser projection equipment detects that new equipment is accessed into the display system through the USB interface. The USB interface triggers the laser projection equipment to generate new equipment access information.
The laser projection equipment determines that the video playing information is first information, and the new equipment access information is second information.
The laser projection device displays the video information.
The laser projection equipment sends new equipment access information to the mobile terminal, and the mobile terminal displays the new equipment access information after receiving the new equipment access information.
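As an illustrative sketch, the new device access information forwarded to the mobile terminal might be assembled as follows; the field names are assumptions rather than the patent's actual data format.

```python
# Hypothetical structure of the new device access information (device type,
# access interface, access time) generated by the laser projection device.
from datetime import datetime


def build_new_device_notice(device_type: str, interface: str) -> dict:
    return {
        "type": "new_device_access",
        "device_type": device_type,   # e.g. "USB flash drive"
        "interface": interface,       # e.g. "USB" or "HDMI"
        "access_time": datetime.now().isoformat(timespec="seconds"),
    }


# notice = build_new_device_notice("USB flash drive", "USB")
# The notice is then sent to the mobile terminal as second information.
```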
Based on this scene, compared with the prior art in which the new device access information must be overlaid on the picture of the laser projection device when it needs to be displayed, the display system and display method provided in the embodiment of the present application display the new device access information on the mobile terminal, so that the new device access information does not block the playing picture of the laser projection device, which improves the viewing experience of the user.
Scene 3, the second information is system push information.
In scenario 3, the multimedia information received by the laser projection device includes: video information from a video playing server, and system push information.
In an example, the system push information may include: system update reminding, system advertisement push, scheduled program start reminding, and other information.
The following description takes system update information as an example of the system push information:
As shown in fig. 11, when the first information is video information and the second information is system update information, the display interface of the laser projection device and the display interface of the mobile terminal are shown.
Specifically, while the laser projection device is playing video, the laser projection device receives a system update package. And the laser projection equipment generates system updating information to prompt a user whether to update the system. At this time, the multimedia information determined by the laser projection device includes both video information and system update information. The laser projection equipment determines that video information in the multimedia information is first information, and system updating information is second information.
The laser projection device displays the video information.
And the laser projection equipment sends system updating information to the mobile terminal, and the mobile terminal displays the system updating information after receiving the system updating information.
Based on this scene, compared with the prior art in which the system update information must be overlaid on the picture of the laser projection device when it needs to be displayed, the display system and display method provided in the embodiment of the present application display the system update information on the mobile terminal, so that the system update information does not block the playing picture of the laser projection device, which improves the viewing experience of the user.
Scene 4, the second information is video call incoming call reminding information.
In scene 4, the multimedia information received by the laser projection device includes: video information from a video playing server, and video call incoming call reminding information.
As shown in fig. 12, the display interface of the laser projection device and the display interface of the mobile terminal are shown when the first information is video information and the second information is video call incoming call reminding information.
Specifically, when the laser projection device is playing a video, the laser projection device receives a video call incoming from other devices, and the laser projection device generates corresponding video call incoming call reminding information. At this time, the multimedia information determined by the laser projection device includes both video information and video call incoming call reminding information. The laser projection equipment determines that video information in the multimedia information is first information, and video call incoming call reminding information is second information.
The laser projection device displays the video information.
The laser projection equipment sends video call incoming call reminding information to the mobile terminal, and the mobile terminal displays the video call incoming call reminding information after receiving the video call incoming call reminding information.
Based on this scene, compared with the prior art in which the video call incoming call reminding information must be overlaid on the picture of the laser projection device when it needs to be displayed, the display system and display method provided in the embodiment of the present application display the video call incoming call reminding information on the mobile terminal, so that the reminding information does not block the playing picture of the laser projection device, which improves the viewing experience of the user.
Scene 5, the second information is video call information.
It should be noted that the scene 5 is a scene subsequent to the scene 4, and is a scene after the user determines to answer the video call according to the content displayed by the mobile terminal and sends an answer instruction to the laser projection device.
As shown in fig. 13, when the first information is video information and the second information is video call information, the display interface of the laser projection device and the display interface of the mobile terminal are shown.
Specifically, after the mobile terminal displays the video call incoming call reminding information, if the user decides to answer the incoming call, the user performs a first operation. After detecting the first operation, the laser projection device receives video call information from the device that initiated the video call.
At this time, the multimedia information determined by the laser projection device includes both video information and video call information. The laser projection equipment determines that video information in the multimedia information is first information and video call information is second information.
The laser projection device displays the video information.
The laser projection equipment sends video call information to the mobile terminal, and the mobile terminal displays the video call information after receiving the video call information.
It is noted that the first operation may be an operation in which the user answers the video call by clicking an "answer" button on the display system control device.
Alternatively, the first operation may be an operation in which the user inputs an "answer" instruction through a voice input function to answer a video call.
Alternatively, the user may also choose to receive the video call in other ways. Any operation performed by the user to select to answer the video call can be regarded as the first operation.
Based on this scene, compared with the prior art in which the video call picture must be overlaid on the picture of the laser projection device when it needs to be displayed, the display system and display method provided in the embodiment of the present application display the video call on the mobile terminal, so that the video call does not block the playing picture of the laser projection device, which improves the viewing experience of the user.
In one possible implementation, the laser projection device may further generate a display selection reminding message to remind the user to select a device for displaying the video call message.
In the implementation mode, the display selection reminding information determined by the laser projection equipment is the second information, and the display selection reminding information is sent to the mobile terminal. The mobile terminal displays the display selection reminding information.
When the laser projection device receives the user's selection of the mobile terminal as the device to display the video call, the laser projection device determines the video call information as the second information.
When the laser projection device receives the user's selection of the laser projection device as the device to display the video call, the laser projection device determines the video call information as the first information. Correspondingly, the laser projection device may determine the video currently being played as the second information, and send it to the mobile terminal for playing.
Based on the method, the display system can flexibly determine the equipment for displaying the video playing and the equipment for playing the video call based on the selection of the user.
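A hedged sketch of this selection logic follows; the device identifiers are assumptions for illustration.

```python
# Depending on which device the user selects to display the video call, the
# laser projection device either keeps the playing video as first information
# or swaps the two pieces of information.
def route_video_call(user_choice: str, playing_video: dict, video_call: dict):
    """Return (first_info, second_info) according to the user's display choice."""
    if user_choice == "mobile_terminal":
        # The video call goes to the mobile terminal; the projector keeps the video.
        return playing_video, video_call
    # The user chose the projector for the call; the currently playing video is
    # handed over to the mobile terminal instead.
    return video_call, playing_video
```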
Scene 6, the second information is image search information.
In scenario 6, the multimedia information received by the laser projection device includes: the video information from the video playing server and the image information obtained after screen capture of the display interface of the current laser projection equipment.
As shown in fig. 14, when the first information is video information, and the second information is image search information, the display interface of the laser projection device and the display interface of the mobile terminal are shown.
Specifically, when the laser projection apparatus is playing video information, if the user sees a picture of interest, the user may click the image search function button on the control device. The control device correspondingly generates an image search instruction and sends the instruction to the laser projection device. After receiving the instruction, the laser projection device captures the image it is currently displaying and sends the image to the mobile terminal.
The mobile terminal displays the image, recognizes it, segments and analyzes each person and/or object in the image to obtain a recognition result of the image, and displays the recognition result.
The user selects a target object of interest according to the recognition result displayed by the mobile terminal. The operation of selecting the target object by the user is recorded as a first operation. After detecting the first operation of the user, the mobile terminal determines the target object and searches the network side for the associated information of the target object. After the mobile terminal finds the associated information of the target object, it displays the associated information of the target object.
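The following sketch outlines this flow on the mobile terminal side. The recognition and network-side search back-ends are hypothetical stubs standing in for whatever services the system actually uses.

```python
# Illustrative image-search flow on the mobile terminal: recognize objects in
# the received screenshot, let the user pick a target, then show its info.
def recognize_objects(image_bytes: bytes) -> list:
    """Segment and label persons/objects in the screenshot (stubbed here)."""
    return [{"label": "handbag", "category": "item", "bbox": (120, 200, 80, 60)}]


def search_network_side(target: dict) -> dict:
    """Look up associated information for the selected target (stubbed here)."""
    return {"label": target["label"], "introduction": "...", "purchase_link": "..."}


def handle_image_search(image_bytes: bytes, pick) -> dict:
    """pick is a callback representing the user's first operation (target selection)."""
    results = recognize_objects(image_bytes)   # recognition result shown to the user
    target = pick(results)                     # the user selects a target object
    return search_network_side(target)         # associated information to display
```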
For example, as shown in fig. 14, the mobile terminal displays introduction information of a lady's handbag in the image and two-dimensional code information for purchasing the handbag.
It should be noted that the association information may indicate different meanings based on different categories of the target object in the recognition result.
In a first example, if the target object in the recognition result is a person, the related information may be the name, occupation, date of birth, native place, representative works, and the like of the person.
In a second example, if the target object in the recognition result is a television channel, the related information may be a name of the television channel, a program being played, a program announcement, and the like.
In a third example, if the target object in the recognition result is an item, the related information may be a same-style product of the item, a purchase link, and the like.
In a fourth example, if the target object in the recognition result is a building, the related information may be a location, a feature, and the like of the building.
In a fifth example, if the target object in the recognition result is a plant, the related information may be the name, variety, introduction, and the like of the plant.
In the sixth example, if the target object in the recognition result is an animal, the related information may be the name, the living area, the habit, and the like of the animal.
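Summarizing the six examples above, an illustrative category-to-fields mapping on the mobile terminal could look like the following; the field names are assumptions for illustration only.

```python
# Kinds of associated information shown per target-object category,
# following the examples listed above.
ASSOCIATION_FIELDS = {
    "person":     ["name", "occupation", "date_of_birth", "native_place", "representative_works"],
    "tv_channel": ["channel_name", "current_program", "program_guide"],
    "item":       ["same_style_product", "purchase_link"],
    "building":   ["location", "features"],
    "plant":      ["name", "variety", "introduction"],
    "animal":     ["name", "habitat", "habits"],
}
```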
Based on this scene, compared with the prior-art solution in which, when the user selects the image search function, the recognition result and the associated information of the target object must be overlaid on the laser projection device, the display system and display method provided in the embodiment of the present application directly send the image captured by the image search function from the laser projection device to the mobile terminal; the mobile terminal displays the recognition result, interacts with the user to determine the target object, and displays the associated information of the target object. Therefore, the recognition result of the image search function and the associated information of the target object do not block the playing picture of the laser projection device, which improves the viewing experience of the user.
Scene 7, the second information is scheduled program playing reminding information.
In scenario 7, the multimedia information received by the laser projection device includes: and the laser projection equipment generates reserved program playing reminding information according to the reserved program information set by the user.
As shown in fig. 15, when the first information is video information and the second information is scheduled program playing reminding information, the display interface of the laser projection device and the display interface of the mobile terminal are shown.
Specifically, while the laser projection device is playing a video, the user may select a video to be scheduled for playing from the program guide of the laser projection device through the mobile terminal or the control device. After the laser projection device determines the video scheduled to be played, it further determines the start time of the video. When the laser projection device determines that the difference between the current time and the start time of the video is less than a preset duration (for example, 5 minutes), the laser projection device generates scheduled program playing reminding information and sends it to the mobile terminal. After receiving the scheduled program playing reminding information, the mobile terminal displays it to remind the user that the scheduled program is about to start playing.
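A minimal sketch of this timing check is shown below, assuming a 5-minute lead time as in the example; the data structure and field names are illustrative assumptions.

```python
# Generate the reserved-program reminder once the start time is less than the
# preset duration (5 minutes here) away from the current time.
from datetime import datetime, timedelta

PRESET_LEAD_TIME = timedelta(minutes=5)


def should_remind(now: datetime, program_start: datetime) -> bool:
    return timedelta(0) <= program_start - now < PRESET_LEAD_TIME


def build_reservation_reminder(program: dict) -> dict:
    return {
        "type": "reserved_program_reminder",
        "title": program["title"],
        "start_time": program["start_time"].isoformat(timespec="minutes"),
    }


# if should_remind(datetime.now(), program["start_time"]):
#     the reminder is sent to the mobile terminal as second information
```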
The exemplary embodiments of the present application provide a display terminal, which may specifically be a desktop computer, a portable computer, a smart phone, a tablet computer, a Personal Digital Assistant (PDA), and the like. The display terminal may include a Central Processing Unit (CPU), a memory, input/output devices, and the like; the input devices may include a keyboard, a mouse, a touch screen, and the like, and the output devices may include a display device, such as a Liquid Crystal Display (LCD) or a Cathode Ray Tube (CRT).
For different display terminals, in some exemplary embodiments, the user interfaces 620, 820 may be interfaces capable of connecting to required external devices, including but not limited to a keypad, a display, a speaker, a microphone, a joystick, and the like.
The processor is responsible for managing the bus architecture and general processing, and the memory may store data used by the processor 600 in performing operations.
In some exemplary embodiments, the processor may be a CPU (central processing unit), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a CPLD (Complex Programmable Logic Device).
The memory may include Read Only Memory (ROM) and Random Access Memory (RAM), and provides the processor with program instructions and data stored in the memory. In the present embodiment, the memory may be used to store a program of any of the methods provided in the present exemplary embodiments.
The processor is configured to execute any of the methods provided in the exemplary embodiments of the present application in accordance with the obtained program instructions by calling the program instructions stored in the memory.
The present application provides, in an exemplary embodiment, a computer storage medium for storing computer program instructions for an apparatus provided in the above-mentioned embodiment of the present application, which includes a program for executing any one of the methods provided in the above-mentioned embodiment of the present application.
The computer storage media may be any available media or data storage device that can be accessed by a computer, including, but not limited to, magnetic memory (e.g., floppy disks, hard disks, magnetic tape, magneto-optical disks (MOs), etc.), optical memory (e.g., CDs, DVDs, BDs, HVDs, etc.), and semiconductor memory (e.g., ROM, EPROM, EEPROM, non-volatile memory (NAND flash), solid-state disks (SSDs)), etc.
As will be appreciated by one skilled in the art, the present exemplary embodiments may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
All other embodiments, which can be derived by a person skilled in the art from the exemplary embodiments shown in the present application without inventive step, are within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of exemplary one or more examples, it is to be understood that each aspect of the disclosure can be utilized independently and separately from other aspects of the disclosure to provide a complete disclosure.
It should be understood that the terms "first," "second," "third," and the like in the description and claims of this application and in the foregoing drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances and can, for example, be implemented in sequences other than those illustrated or otherwise described herein with reference to the embodiments of the application.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and these modifications or substitutions do not depart from the scope of the technical solutions of the embodiments of the present application.

Claims (11)

1. A display system, comprising:
the system comprises a laser projection device and a mobile terminal; the laser projection equipment is in communication connection with the mobile terminal;
the laser projection device configured to: determining first information and second information according to the acquired multimedia information; displaying the first information and sending the second information to the mobile terminal; the first information is information used for displaying by the laser projection equipment in the multimedia information; the second information is information used for displaying by the mobile terminal in the multimedia information;
the mobile terminal is configured to: receiving and displaying the second information;
the second information comprises a graph search identification result; and the image search identification result is the identification result of the image currently displayed by the laser projection equipment after the laser projection equipment receives the image search instruction.
2. The display system of claim 1, wherein the first information comprises at least one of: video streaming information, television home page information.
3. The display system according to claim 1 or 2, wherein the second information comprises at least one of: the relevant information of the laser projection equipment, the relevant information of the first information, the information received by the laser projection equipment, the scheduled program playing reminding information, the scheduled program split screen playing information, the picture searching identification result and the relevant information of the target object in the picture searching identification result.
4. The display system of claim 1, wherein the mobile terminal is further configured to:
determining a first operation, wherein the first operation is used for selecting a target object in the recognition result;
determining association information of the target object in response to the first operation;
and displaying the associated information of the target object.
5. The display system according to claim 1 or 2, wherein the mobile terminal is further configured to:
generating a second instruction under the condition that the laser projection equipment displays the first information and the mobile terminal displays the second information; the second instruction is used for instructing the laser projection equipment to send the first information to the mobile terminal and instructing the laser projection equipment to display the second information;
sending the second instruction to the laser projection equipment;
and receiving and displaying the first information.
6. The display system of claim 5, wherein the laser projection device is further configured to:
receiving a second instruction from the mobile terminal;
and responding to the second instruction, displaying the second information, and sending the first information to the mobile terminal.
7. The display system according to claim 1 or 2, wherein the mobile terminal is further configured to:
generating a third instruction; the third instruction is used for indicating the laser projection equipment to determine second information and sending the second information to the mobile terminal;
sending the third instruction to the laser projection equipment;
the laser projection device configured to: receiving the third instruction;
determining the second information in response to the third instruction.
8. A display method applied to the laser projection device of any one of claims 1 to 7, the method comprising:
determining first information and second information according to the acquired multimedia information; the first information is information used for displaying by the laser projection equipment in the multimedia information; the second information is information used for displaying by the mobile terminal in the multimedia information;
displaying the first information;
sending the second information to the mobile terminal;
the second information comprises a graph search identification result; and the image search identification result is the identification result of the image currently displayed by the laser projection equipment after the laser projection equipment receives the image search instruction.
9. The method of claim 8, wherein after the displaying the first information and sending the second information to the mobile terminal, the method further comprises:
receiving a second instruction from the mobile terminal equipment; the second instruction is used for instructing the laser projection equipment to send the first information to the mobile terminal and instructing the laser projection equipment to display the second information;
and responding to the second instruction, displaying the second information, and sending the first information to the mobile terminal.
10. A display method applied to the mobile terminal according to any one of claims 1 to 7, the method comprising:
receiving and displaying second information from the laser projection equipment; the second information comprises a graph search identification result; the image search identification result is the identification result of the image currently displayed by the laser projection equipment after the laser projection equipment receives the image search instruction;
determining a first operation, wherein the first operation is used for selecting a target object in the recognition result;
determining association information of the target object in response to the first operation;
and displaying the associated information of the target object.
11. The method of claim 10, further comprising:
generating a second instruction under the condition that the laser projection equipment displays the first information and the mobile terminal displays the second information; the second instruction is used for instructing the laser projection equipment to send the first information to the mobile terminal and instructing the laser projection equipment to display the second information;
sending the second instruction to the laser projection equipment;
receiving and displaying the first information from the laser projection device.
CN202011218309.XA 2019-11-04 2020-11-04 Display system and display method Active CN112269554B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN2019110673551 2019-11-04
CN201911067355 2019-11-04
CN201911067968 2019-11-04
CN2019110673725 2019-11-04
CN2019110679685 2019-11-04
CN201911067372 2019-11-04

Publications (2)

Publication Number Publication Date
CN112269554A CN112269554A (en) 2021-01-26
CN112269554B true CN112269554B (en) 2023-03-14

Family

ID=74344686

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011218309.XA Active CN112269554B (en) 2019-11-04 2020-11-04 Display system and display method

Country Status (1)

Country Link
CN (1) CN112269554B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114979676A (en) * 2021-02-20 2022-08-30 花瓣云科技有限公司 Live broadcast method, device and system
CN113285867B (en) * 2021-04-28 2023-08-22 青岛海尔科技有限公司 Method, system, device and equipment for message reminding

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729657A (en) * 2008-10-15 2010-06-09 Lg电子株式会社 Image projection in mobile communication terminal
CN102724579A (en) * 2011-03-29 2012-10-10 深圳市同洲软件有限公司 Display method, apparatus, and system for electronic program guide information
CN106101457A (en) * 2016-08-23 2016-11-09 努比亚技术有限公司 A kind of information screen apparatus and method
CN106604241A (en) * 2015-10-19 2017-04-26 中兴通讯股份有限公司 Inter-equipment information transmission method and system, and source terminal

Also Published As

Publication number Publication date
CN112269554A (en) 2021-01-26

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant