CN113676710B - Intelligent display device and application management method

Intelligent display device and application management method

Info

Publication number
CN113676710B
Authority
CN
China
Prior art keywords
type
display screen
displayed
interface
interfaces
Prior art date
Legal status
Active
Application number
CN202110528461.6A
Other languages
Chinese (zh)
Other versions
CN113676710A (en)
Inventor
王光强
李珑
Current Assignee
Juhaokan Technology Co Ltd
Original Assignee
Juhaokan Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Juhaokan Technology Co Ltd
Publication of CN113676710A
Application granted
Publication of CN113676710B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H04N9/3147 Multi-projection systems
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 LIGHTING
    • F21V FUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V33/00 Structural combinations of lighting devices with other articles, not otherwise provided for
    • F21V33/0004 Personal or domestic articles
    • F21V33/0052 Audio or video equipment, e.g. televisions, telephones, cameras or computers; Remote control devices therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The present application provides an intelligent display device and an application management method. The intelligent display device comprises a first imaging mechanism, a second imaging mechanism, and a controller. The first imaging mechanism is configured to form a first display image on a first plane under control of the controller; the second imaging mechanism is configured to form a second display image on a second plane under control of the controller; and the controller, in communication with the first imaging mechanism and the second imaging mechanism respectively, is configured to: receive a process management instruction input by a user; and, in response to the process management instruction, display a process management interface on the first display image according to the types of first-type processes and second-type processes, where a first-type process is a process displayed only on the first display image, a second-type process is a process displayed only on the second display image, and the first-type processes and the second-type processes are arranged alternately in the process management interface. The present application thereby improves the process management experience.

Description

Intelligent display device and application management method
The present application claims priority to Chinese Patent Application No. 202010408081.4, entitled "An Intelligent Projection Device," filed with the Chinese Patent Office on May 15, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of projection, and in particular to an intelligent display device and an application management method.
Background
A projection device is a display device that projects an image or video onto an object for display. Compared with a display device that shows the image or video directly on its own screen, a projection device offers a large projection interface, flexible installation, and better eye protection, and is therefore becoming increasingly popular with users.
In the related art, an intelligent projection device may be provided with two projection mechanisms capable of projecting two display images. A plurality of applications may be installed on the device, and after being started these applications may be displayed on different display images. When a user has started several applications, there is a need to manage their processes. In existing display devices, the started applications are simply tiled on a process management interface. For a device such as an intelligent projection device, however, directly tiling the application processes of multiple display images makes it hard for the user to find a particular process on the process management interface, resulting in a poor user experience.
Disclosure of Invention
To solve the technical problem of a poor process management experience, the present application provides an intelligent display device and an application management method.
In a first aspect, the present application provides an intelligent display device, comprising:
a first imaging mechanism, configured to form a first display image on a first plane under control of the controller;
a second imaging mechanism, configured to form a second display image on a second plane under control of the controller; and
a controller, in communication with the first imaging mechanism and the second imaging mechanism respectively, the controller being configured to:
receive a process management instruction input by a user; and
in response to the process management instruction, display a process management interface on the first display image according to the types of first-type processes and second-type processes, wherein a first-type process is a process displayed only on the first display image, a second-type process is a process displayed only on the second display image, and the first-type processes and the second-type processes are arranged alternately in the process management interface.
In a second aspect, the present application provides an intelligent display device, comprising:
a first imaging mechanism, configured to form a first display image on a first plane under control of the controller;
a second imaging mechanism, configured to form a second display image on a second plane under control of the controller; and
a controller, in communication with the first imaging mechanism and the second imaging mechanism respectively, the controller being configured to:
receive a process management instruction input by a user; and
in response to the process management instruction, display a process management interface on the first display image according to the types of first-type processes and second-type processes, wherein a first-type process is a process displayed only on the first display image, a second-type process is a process displayed only on the second display image, and, in the process management interface, the first-type processes are arranged as a separate group and the second-type processes are arranged as a separate group at a preset distance from the first-type processes.
In a third aspect, the present application provides an application management method for an intelligent display device, where the method includes:
receiving a process management instruction input by a user;
and in response to the process management instruction, displaying a process management interface on the first display image according to the types of first-type processes and second-type processes, wherein a first-type process is a process displayed only on the first display image, a second-type process is a process displayed only on the second display image, and the first-type processes and the second-type processes are arranged alternately in the process management interface.
The intelligent display device and the application management method provided by the present application have the following beneficial effects:
Through the process management interface, the application processes running on the first display image, the application processes running on the second display image, and the application processes running on both the first display image and the second display image are displayed alternately, or displayed in separate groups according to their display positions. The application processes are thus presented in an orderly manner, which makes it convenient for the user to view the processes on each display image and to manage them, thereby improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic structural diagram of a smart desk lamp according to some embodiments of the present application;
FIG. 2 is a schematic diagram of virtual display screen forming positions according to some embodiments of the present application;
FIG. 3 is another schematic structural diagram of a smart desk lamp according to some embodiments of the present application;
FIG. 4 is a schematic diagram of a second start page according to some embodiments of the present application;
FIG. 5 is a schematic diagram of a first start page according to some embodiments of the present application;
FIG. 6 is a schematic diagram of a display control interface D0 corresponding to a two-screen control D according to some embodiments of the present application;
FIG. 7 is a timing diagram illustrating process management in some embodiments of the present application;
FIG. 8 is a schematic diagram of process ordering in some embodiments of the present application;
FIG. 9 is a schematic diagram of process ordering in some embodiments of the present application;
FIG. 10 is a schematic diagram of process ordering in some embodiments of the present application;
FIG. 11 is a schematic diagram of process ordering in some embodiments of the present application;
FIG. 12 is a diagram of a process management page in some embodiments of the present application;
FIG. 13 is a diagram of a process management page in some embodiments of the present application;
FIG. 14 is a diagram of a process management page in some embodiments of the present application.
Detailed Description
For purposes of clarity and ease of implementation, exemplary implementations of the present application are described clearly and completely below with reference to the accompanying drawings in which those implementations are illustrated. It should be apparent that the described implementations are only some, not all, of the examples of the present application.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," "second," "third," and the like in the description, in the claims, and in the above drawings are used to distinguish between similar objects or entities and are not necessarily intended to limit a particular order or sequence, unless otherwise indicated. It is to be understood that terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software capable of performing the function associated with that element.
A desk lamp is a lighting tool that assists people in reading, studying, and working, and is common household equipment. As technology advances, household equipment is developing in an intelligent direction, and in this wave the functions of the desk lamp are becoming richer and richer. In some embodiments, the desk lamp may be provided with a projection mechanism and can be connected with a display device to realize the projection function of a projector; such a lamp may be referred to as an intelligent desk lamp.
However, in conventional projection technology the content of the display device is projected directly: if the content displayed by the display device contains superimposed elements, the projected image shows the superimposed content as well. For example, in a video chat scenario the chat window is usually superimposed over the original interface, which blocks part of the original content and affects the user's viewing experience.
To solve this technical problem, embodiments of the present application turn the desk lamp into an intelligent display device by arranging a plurality of projection mechanisms on it. A plurality of display images can then be obtained by projection, and the interfaces of an application can be displayed separately on different projected images so that the interfaces do not block one another.
Fig. 1 is a schematic structural diagram of an intelligent desk lamp according to some embodiments of the present application. As shown in fig. 1, the intelligent desk lamp includes at least two imaging mechanisms, a controller 200, and a camera 300. Each imaging mechanism may be a projection mechanism. The controller 200 is connected to the at least two projection mechanisms and to the camera 300, respectively, so that the controller 200 can control the working states of the at least two projection mechanisms and acquire the content captured by the camera 300.
In some embodiments, the intelligent desk lamp further comprises a base, a support, and an illumination bulb. The illumination bulb, the projection mechanisms, and the camera may be arranged on the support, the support may be arranged on the base, and the controller 200 may be arranged inside the base.
In some embodiments, the controller 200 in the intelligent desk lamp has a network communication function, so that the current intelligent desk lamp can communicate with other intelligent desk lamps, intelligent terminals (such as mobile phones) or servers (such as network platforms), thereby obtaining projection contents.
In some embodiments, an operating system may be installed on the controller 200 of the intelligent desk lamp, so that projection can be performed without connecting to a display device. An intelligent desk lamp with an operating system may also have a network communication function and can communicate with a server and other devices to implement network functions, such as upgrading the operating system, installing applications, and interacting with other intelligent desk lamps.

Referring to fig. 1, the at least two projection mechanisms include at least a first imaging mechanism 110 and a second imaging mechanism 120. The first imaging mechanism 110, which may also be referred to as a desktop projection mechanism, projects to form a first virtual display screen VS1; the second imaging mechanism 120, which may also be referred to as a wall projection mechanism, projects to form a second virtual display screen VS2. The first virtual display screen VS1 and the second virtual display screen VS2 are formed at different positions.
For example, fig. 2 is a schematic diagram of the positions at which the virtual display screens are formed in some embodiments of the present application. As shown in fig. 2, the first virtual display screen VS1 formed by projection of the first imaging mechanism 110 may be formed on the desktop of the desk on which the smart desk lamp is placed; the desktop may be a horizontal plane, and the interface on the first virtual display screen VS1 may be referred to as the first display image. The second virtual display screen VS2 formed by projection of the second imaging mechanism 120 may be formed on the wall against which the desk rests; the wall may be a vertical plane, and the interface on the second virtual display screen VS2 may be referred to as the second display image. It will be appreciated that, in practical applications, the forming positions of the virtual display screens may be adjusted according to actual needs; the desktop may be an approximately horizontal plane, and the wall may be an approximately vertical plane.
It can be understood that the specific display content of the first virtual display screen VS1 may differ from that of the second virtual display screen VS2, so that the two virtual display screens cooperate to comprehensively present content of large volume and high complexity.
After the at least two projection mechanisms respectively project to form at least two virtual display screens, the camera 300 is configured to collect an operation gesture on at least one virtual display screen, and send the operation gesture to the controller 200, where the operation gesture may specifically be operation click information of a user on display content on the virtual display screen.
For example, the camera 300 may collect only the operation gesture on the first virtual display screen VS1, may collect only the operation gesture on the second virtual display screen VS2, or may collect the operation gestures on the first virtual display screen VS1 and the second virtual display screen VS2 simultaneously.
In addition, based on the number of virtual display screens required to acquire operation gestures, the number of cameras 300 may be set to be plural, that is, a single camera acquires operation gestures of a single virtual display screen.
In some embodiments, the camera 300 may be an infrared camera; by using infrared detection technology, the accuracy of the captured operation gestures can be ensured even in poor lighting conditions, such as at night or on cloudy days.
In some embodiments, in addition to collecting operation gestures, the camera 300 may also capture user images, enabling functions such as video calls and photographing.
After the at least two projection mechanisms respectively project to form at least two virtual display screens, the controller 200 is configured to control the projection contents of the at least two projection mechanisms respectively on the at least two virtual display screens, and after receiving the operation gesture sent by the camera 300, adjust the projection contents of the at least two projection mechanisms based on the operation gesture on the at least one virtual display screen.
For example, the controller 200 may adjust only the projected content of the first imaging mechanism 110 on the first virtual display screen VS1 based on the operation gesture, may adjust only the projected content of the second imaging mechanism 120 on the second virtual display screen VS2 based on the operation gesture, and may adjust both the projected content of the first imaging mechanism 110 on the first virtual display screen VS1 and the projected content of the second imaging mechanism 120 on the second virtual display screen VS2 based on the operation gesture.
It will be appreciated that two projection mechanisms are only one exemplary way of implementing multi-screen projection on the intelligent desk lamp; the at least two projection mechanisms may also number three or more, and the number of projection mechanisms of the intelligent desk lamp is not specifically limited in the present application. For convenience of explanation, the embodiments of the present application describe the technical solution using two projection mechanisms as an example.
In some embodiments, the number of the controllers 200 may be plural, specifically, may be the same as the number of the projection mechanisms, so that a single controller may be configured to control the projection content of a single projection mechanism, and communication connections exist between the respective controllers.
For example, for a case where at least two projection mechanisms include at least a first imaging mechanism 110 and a second imaging mechanism 120, the controller 200 may specifically include a first controller that controls the projected content of the first imaging mechanism 110 and a second controller that controls the projected content of the second imaging mechanism 120, where the first controller and the second controller have a communication connection.
In some embodiments, the plurality of controllers may be centrally located, i.e., arranged at the same designated location in the intelligent desk lamp; alternatively, the controllers may be separately provided, i.e., each controller may be arranged with its corresponding projection mechanism. The present application does not limit the positions of the plurality of controllers.
Some embodiments thus provide an intelligent desk lamp that includes at least two projection mechanisms, i.e., a multi-screen-projection intelligent desk lamp. The virtual display screens formed by the respective projection mechanisms are located at different positions, so that a plurality of virtual display screens can be formed at different positions and display content cooperatively, making it possible to comprehensively present content of large volume and high complexity. Meanwhile, the operation gestures on the virtual display screens are captured by the camera and the projected content is adjusted according to these gestures, which further enhances interactivity between different users.
Fig. 3 is another schematic structural diagram of an intelligent desk lamp according to some embodiments of the present application. As shown in fig. 3, the first imaging mechanism 110 includes a first light source 112, a first imaging unit 114, and a first lens 116. The first light source 112 is configured to emit light, the first imaging unit 114 is configured to form a pattern based on the light emitted by the first light source 112, and the first light source 112 and the first imaging unit 114 cooperate to form a first projection pattern; the first lens 116 is configured to enlarge the first projection pattern, so that the first light source 112, the first imaging unit 114, and the first lens 116 cooperate to display the corresponding display content on the first virtual display screen VS1 corresponding to the first imaging mechanism 110. In some embodiments, the first light source 112 includes at least one of a tri-color light source, a white light source, and a blue light wheel light source. The tri-color light source and the blue light wheel light source are used to emit light of different colors, so that color content can be displayed on the first virtual display screen VS1; the white light source is used to emit white light to realize the basic lighting function of the desk lamp.
In some embodiments, the first light source 112 may include only a white light source, such that a basic lighting function may be achieved. The first light source 112 may include only a three-color light source or only a blue-light wheel light source so that color contents can be displayed on the first virtual display screen VS1 when projection is required. The first light source 112 may include a white light source and a tri-color light source at the same time, or include a white light source and a blue light wheel light source at the same time, or include a white light source, a tri-color light source and a blue light wheel light source at the same time, so that color content may be displayed on the first virtual display screen VS1 while implementing a basic lighting function.
Referring to fig. 3, the second imaging mechanism 120 includes: a second light source 122, a second imaging unit 124, and a second lens 126; wherein the second light source 122 is configured to emit light, the second imaging unit 124 is configured to form a pattern based on the light emitted by the second light source 122, and the second light source 122 and the second imaging unit 124 are configured to cooperatively form a second projection pattern; the second lens 126 is configured to enlarge the second projection pattern, so that the second light source 122, the second imaging unit 124, and the second lens 126 cooperate to display corresponding display contents on the second virtual display screen VS2 corresponding to the second imaging mechanism 120.
In some embodiments, the second light source 122 includes at least one of a tri-color light source, a white light source, and a blue light wheel light source. The three-color light source and the blue light wheel light source are used for emitting light with different colors, so that color content can be displayed on the second virtual display screen VS 2. The white light source is used for emitting white light so as to realize the basic lighting function of the desk lamp.
In some embodiments, the second light source 122 may include only a white light source, such that a basic lighting function may be achieved. The second light source 122 may include only a three-color light source or only a blue-light wheel light source so that color contents can be displayed on the second virtual display screen VS2 when projection is required. The second light source 122 may include a white light source and a tri-color light source at the same time, or include a white light source and a blue light wheel light source at the same time, or include a white light source, a tri-color light source and a blue light wheel light source at the same time, so that the color content may be displayed on the second virtual display screen VS2 while the basic lighting function is implemented.
In some embodiments, the lens in the projection mechanism is a focus adjustable lens, and the controller 200 can adjust the size of the projected image by adjusting the focal length of the lens.
In some embodiments, the first light source 112 and the second light source 122 may be different light sources to provide light beams to different imaging units, respectively, or the same light source may provide light beams to different imaging units through light splitting.
In one embodiment, the smart desk lamp may include one or more of the following components: a storage component, a power component, an audio component and a communication component.
The storage component is configured to store various types of data to support operation at the intelligent desk lamp. Examples of such data include student questions, examination papers, electronic textbooks, question analysis and explanation, etc. for projection display on the smart desk lamp, and the types of data include documents, pictures, audio, video, etc. The memory component may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply assembly provides power for various components of the intelligent desk lamp. The power components may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the intelligent desk lamp.
The audio component is configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive external audio signals when the smart desk lamp is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may be further stored in a storage component or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
The communication component is configured to facilitate wired or wireless communication between the intelligent desk lamp and other devices. The intelligent desk lamp can access a wireless network based on a communication standard, such as WiFi, 4G or 5G, or the like, or a combination thereof. In one exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further comprises a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In one embodiment, the principle of the camera 300 capturing operation gestures is explained.
The actual imaging interface may be a virtual display screen, which in some embodiments may be a desktop, a wall, a dedicated projection screen, or another surface that presents the projected image. The user's operation is identified either from the images captured by the camera or from the position information transmitted by a position-sensing device (such as the sensing pen described below).
Several exemplary ways of acquiring these operations are described below:
(I) Motion track
After the controller 200 controls the projection mechanism to project on the virtual display screen, the camera 300 captures an image of the user's finger on the virtual display screen in real time and transmits the image to the controller 200. The controller 200 recognizes the user's fingertip in the image through the fingertip tracking technology, so that the operation track of the user on the virtual display screen can be obtained based on the movement track of the fingertip.
In some embodiments, if the image captured by the camera 300 contains only a single finger, the user's operation track is determined based on the fingertip of that finger; if the image contains multiple fingers, the operation track is determined based on the fingertip of a specific finger, for example the index finger, or the tracks of multiple fingertips are determined.
(II) Click operation
The camera 300 of the intelligent desk lamp is arranged above the user's finger. When the user presses a finger down to click, the fingertip image changes to a certain extent, and the controller 200 can identify whether the user has performed a click operation according to the change in the fingertip image.
For example, with the position of the camera 300 fixed, when the user presses a finger down to click, the distance between the fingertip and the camera 300 changes; in the images obtained by the camera 300, the fingertip pattern before the press is larger than the fingertip pattern after the press, so a change in the size of the fingertip pattern can be taken to mean that the user has performed a press-down click operation.
As another example, some users bend the fingertip downward when clicking, so that the fingertip pattern in the image becomes deformed or incomplete; when the fingertip pattern appears deformed or incomplete, the user can likewise be considered to have performed a press-down click operation.
It can be appreciated that when the fingertip image has just changed, the user can be considered to be in a fingertip-down state; after the fingertip image is restored, the user can be considered to be in a fingertip-up state. Each change-and-restore cycle of the fingertip image can therefore be counted as one valid click operation.
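As an illustration only, the press/lift detection described above can be sketched as a comparison of the apparent fingertip size between frames. The function names, the area-ratio threshold, and the use of a blob area as the "fingertip pattern size" are assumptions made for this sketch and are not specified by the present disclosure.

    # Minimal sketch (assumed names and threshold, Python): classify the fingertip
    # state by comparing the apparent fingertip area with a baseline value.
    # A noticeably smaller fingertip pattern is read as "down"; returning to the
    # baseline is read as "up", and each down-then-up cycle counts as one click.

    BASELINE_RATIO = 0.85  # assumed: area below 85% of baseline means "down"

    def fingertip_state(baseline_area, current_area):
        """Return 'down' or 'up' based on the fingertip pattern size."""
        return "down" if current_area < BASELINE_RATIO * baseline_area else "up"

    def count_clicks(areas):
        """Count one valid click per down-then-up cycle of the fingertip image."""
        baseline = areas[0]
        clicks, pressed = 0, False
        for area in areas[1:]:
            if fingertip_state(baseline, area) == "down":
                pressed = True
            elif pressed:  # fingertip image restored after a press
                clicks += 1
                pressed = False
        return clicks

    # Example: one press-and-lift cycle in the area sequence -> one click
    print(count_clicks([100.0, 99.0, 80.0, 78.0, 98.0, 100.0]))  # prints 1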
(III) Single-click operation
When the controller 200 confirms that the user is in a state of fingertip depression, the position coordinates of the position Point1 of the state and a time stamp are recorded.
When confirming that the user is in a state of lifting the fingertip, the position coordinates of the position Point2 of the state and a time stamp are recorded.
If the distance between the position coordinates of the position Point1 and the position coordinates of the position Point2 is smaller than the preset threshold value, and the time difference between the time stamp of the position Point1 and the time stamp of the position Point2 is smaller than the preset threshold value, the user is considered to perform the clicking operation on the position Point1 (same as the position Point 2).
(IV) Double-click operation
When the controller 200 confirms that the user performs the first valid click operation, the position coordinates of the position Point3 of the click operation and the time stamp are recorded.
When the user is confirmed to perform the second effective clicking operation, the position coordinates of the position Point4 of the clicking operation and a time stamp are recorded.
If the distance between the position coordinates of the position Point3 and the position coordinates of the position Point4 is smaller than the preset threshold value and the time difference between the time stamp of the position Point3 and the time stamp of the position Point4 is smaller than the preset threshold value, the clicking operation performed by the user at the position points Point3 and Point4 is considered to form an effective double-click operation.
It will be appreciated that the principle of recognition of the multiple click operation is similar to that of the double click operation, and will not be described in detail herein.
(V) Long-press operation
When the controller 200 confirms that the user is in a state of fingertip depression, the position coordinates of the position Point5 of the state and a time stamp are recorded.
When confirming that the user is in a state of lifting the fingertip, the position coordinates of the position Point6 of the state and a time stamp are recorded.
If the distance between the position coordinates of the position Point5 and the position coordinates of the position Point6 is smaller than the preset threshold value, and the time difference between the time stamp of the position Point5 and the time stamp of the position Point6 is larger than the preset threshold value, the user is considered to perform long-press operation on the position Point5 (same as the position Point 6).
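Purely as an illustration of the single-click and long-press judgments above, the sketch below classifies a recorded press-and-lift pair using the distance and time thresholds described; the specific threshold values and function names are assumptions and are not given by the present disclosure. A double-click can be recognized analogously by checking that two consecutive valid clicks (Point3 and Point4 above) also satisfy the distance and time thresholds.

    # Minimal sketch (assumed names/thresholds, Python): classify a press point
    # and a lift point, each with coordinates and a timestamp, into a single
    # click or a long press following the rules described above.
    import math

    DIST_THRESHOLD = 20.0   # assumed: same-position threshold (e.g. pixels)
    TIME_THRESHOLD = 0.5    # assumed: click/long-press boundary in seconds

    def classify(press_xy, press_t, lift_xy, lift_t):
        """Return 'click', 'long_press', or 'other' for a press/lift pair."""
        dist = math.hypot(lift_xy[0] - press_xy[0], lift_xy[1] - press_xy[1])
        dt = lift_t - press_t
        if dist < DIST_THRESHOLD:
            return "click" if dt < TIME_THRESHOLD else "long_press"
        return "other"  # e.g. a sliding operation, handled separately

    print(classify((100, 200), 0.0, (102, 201), 0.2))  # click
    print(classify((100, 200), 0.0, (101, 199), 1.5))  # long_press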
(VI) Sliding operation
When the controller 200 confirms that the user is in a state of fingertip depression, the position coordinates of the position Point7 of the state and a time stamp are recorded.
When confirming that the user is in a state of lifting the fingertip, the position coordinates of the position Point8 of the state and a time stamp are recorded.
If the distance between the position coordinates of the position Point7 and the position coordinates of the position Point8 is greater than a preset threshold value and the time difference between the time stamp of the position Point7 and the time stamp of the position Point8 is greater than the preset threshold value, the user is considered to have performed a sliding operation between the position points Point7 to Point 8.
It will be appreciated that the sliding operation may be a sideways sliding, such as a left or right sliding, a longitudinal sliding, such as an upward or downward sliding, or an oblique sliding, such as an upward or downward and left sliding, etc.
In some embodiments, the sliding distance and sliding direction (positive X-axis direction to the right and positive Y-axis direction to the top in the default position coordinate system) may be determined based on the position coordinates of position points Point7 and Point 8.
For example, the sliding distance may be calculated by the following formula:
dis = √((x8 - x7)² + (y8 - y7)²)
where dis is the sliding distance, (x7, y7) are the position coordinates of position Point7, and (x8, y8) are the position coordinates of position Point8.
When x7 = x8, or the difference between x7 and x8 is smaller than a preset threshold: if y7 > y8, the sliding direction is downward; if y7 < y8, the sliding direction is upward.
When y7 = y8, or the difference between y7 and y8 is smaller than a preset threshold: if x7 > x8, the sliding direction is to the left; if x7 < x8, the sliding direction is to the right.
When x7 > x8: if y7 > y8, the sliding direction is downward and to the left; if y7 < y8, the sliding direction is upward and to the left.
When x7 < x8: if y7 > y8, the sliding direction is downward and to the right; if y7 < y8, the sliding direction is upward and to the right.
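As an illustration of the distance and direction rules above, the sketch below computes both from the press point (x7, y7) and the lift point (x8, y8); the axis-alignment threshold and the function names are assumptions made for this sketch.

    # Minimal sketch (assumed names/threshold, Python): sliding distance and
    # direction, with +X to the right and +Y upward as described above.
    import math

    AXIS_THRESHOLD = 5.0  # assumed: below this, movement along an axis is ignored

    def slide_distance(x7, y7, x8, y8):
        return math.hypot(x8 - x7, y8 - y7)

    def slide_direction(x7, y7, x8, y8):
        dx, dy = x8 - x7, y8 - y7
        horiz = "" if abs(dx) < AXIS_THRESHOLD else ("right" if dx > 0 else "left")
        vert = "" if abs(dy) < AXIS_THRESHOLD else ("up" if dy > 0 else "down")
        return (vert + "-" + horiz).strip("-") or "none"

    print(slide_distance(0, 0, 30, 40))         # 50.0
    print(slide_direction(100, 100, 60, 60))    # down-left
    print(slide_direction(100, 100, 100, 160))  # up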
In one embodiment, the user's operations on the virtual display screen may also be simulated through other peripheral devices, for example a sensing pen.
In some embodiments, the tip of the sensing pen is provided with a position sensor that sends the position of the pen tip to the controller 200 of the intelligent desk lamp in real time, so that the intelligent desk lamp obtains the user's operation track on the virtual display screen from the reported position changes.
In addition, the tip of the sensing pen is provided with a press-sensing structure (such as a pressure sensor). When the user needs to perform a click operation, the user can touch the desktop with the sensing pen, so that the press-sensing structure obtains a press signal and sends it to the controller 200 of the intelligent desk lamp; the controller 200 can then determine the user's click position based on the current position of the pen and the press signal.
It will be appreciated that the principle of other operations (e.g. double click, long press, etc.) performed by the user through the sensing pen is the same as that performed by the fingertip, and will not be described in detail here.
For ease of understanding, in the following embodiments the smart desk lamp includes a single controller 200, two projection mechanisms (the first imaging mechanism 110 and the second imaging mechanism 120), and a single camera 300; the camera 300 collects operation gestures only on the first virtual display screen VS1; the first virtual display screen VS1 formed by projection of the first imaging mechanism 110 is formed on the desktop of the desk on which the smart desk lamp is placed; and the second virtual display screen VS2 formed by projection of the second imaging mechanism 120 is formed on the wall against which the desk rests.
In one embodiment, an application management method of the intelligent desk lamp is explained.
In some embodiments, a power-on key is provided on the base of the intelligent desk lamp. The power-on key may be a physical press key or a touch key. For a physical press key, the key can be considered to be in an active state when the user presses it; for a touch key, the key can be considered to be in an active state when a body part of the user (such as a finger) is placed on its surface.
In some embodiments, the power-on key being in an active state refers to a state in which the power-on key is pressed.
A traditional desk lamp is also generally provided with a power-on key: after the user presses or touches it, the light source of the lamp is energized and emits light, realizing the lighting function. Compared with a traditional desk lamp, the intelligent desk lamp additionally provides a plurality of virtual display screens; after it is powered on, it can realize the lighting function while the virtual display screens display preset projection images, thereby also realizing a display function.
In some embodiments, the intelligent desk lamp may be provided with a projection key in addition to the power-on key. When the lamp is powered on, the virtual display screens may initially not display any projection image, so that the lamp can be used as an ordinary desk lamp after power-on. When the user needs projection display, pressing or touching the projection key makes the virtual display screens display the projection images; pressing or touching the projection key again closes the projection display, thereby saving energy.
In some embodiments, the functions of the projection key may also be integrated into the power-on key; for example, the power-on key may be configured so that a long press turns the lamp on or off, while a click turns the projection on or off.
In some embodiments, after the user triggers the power-on key, the controller obtains the trigger signal of the key, generates a first start page and a second start page according to the trigger signal, controls the desktop projection mechanism to project the first start page and the wall projection mechanism to project the second start page, and starts the camera, so that the user's operations on the first start page and/or the second start page are monitored through the camera.
Referring to fig. 4, which is a schematic diagram of a second start page according to some embodiments, the second start page may be provided with no controls and may only display information that the user does not need to focus on for long periods, such as weather information and time information. Of course, in some embodiments the second start page may also be provided with a small number of controls; for example, the "weather fine" item in fig. 4 may be an interface control that the user clicks to access a weather details interface.
Referring to fig. 5, which is a schematic diagram of a first start page according to some embodiments, the first start page may include an educational interface. As shown in fig. 5, the educational interface may be provided with a plurality of controls, such as "online teaching system", "teaching channel", "problem exercise", "simulation examination", and "job correction", each occupying a display position; after the user clicks a control, the interface corresponding to that control is entered. A two-screen control D may also be disposed on the first start page; in response to being triggered, the control D calls up, on the current interface, a control interface for operating the wall projection image.
In some embodiments, it may be inconvenient for the user to operate controls on the second start page directly. The user may instead click the control D on the first start page shown in fig. 5 to call up a control interface for the second start page on the first virtual display screen, and then operate the second start page through that control interface.
It should be noted that, referring to fig. 5, in some embodiments four control keys are provided below the main interface on the first virtual display screen VS1: a return key, a home key, a process key, and a minimize key. The return key is used to return to the previous page, the home key is used to return directly to the corresponding Launcher interface, the process key is used to display all current processes for process management, and the minimize key is used to minimize the application currently running in the foreground.
In one embodiment, the screen display control of the second virtual display screen VS2 is explained.
Referring to fig. 5, a two-screen control D is set on the display page of the first virtual display screen VS1, and when the user clicks the two-screen control D, the first virtual display screen VS1 displays a display control interface of the second virtual display screen VS2 above the original interface.
Fig. 6 is a schematic diagram of a display control interface D0 corresponding to the two-screen control D, where, as shown in fig. 6, the display control interface D0 includes a display area D1, a return one-screen control D2, a close two-screen control D3, a touch area D4, and an exit control D5.
The display area D1 is used to display the process currently running on the second virtual display screen VS2.
The return-to-one-screen control D2 is used to switch the content displayed on the second virtual display screen VS2 to the first virtual display screen VS1 for display. For example, if the display content of a process is currently shown on the second virtual display screen VS2 and the user clicks the return-to-one-screen control D2, the process is switched to the first virtual display screen VS1 for display; at this time, the second virtual display screen VS2 may display a counter interface or a time interface.
The close-two-screen control D3 is used to switch the content displayed on the second virtual display screen VS2 to the first virtual display screen VS1 for display and to turn off the second virtual display screen VS2. For example, if the display content of a process is currently shown on the second virtual display screen VS2 and the user clicks the close-two-screen control D3, the process is switched to the first virtual display screen VS1 for display, and the second virtual display screen VS2 enters the screen-off state, i.e., it no longer displays any content.
The touch area D4 is used to perform operation control of the second virtual display screen VS2 (its effect is similar to that of a laptop touchpad). Based on the user's operation in the touch area D4 captured by the camera, the operation is mapped to the corresponding position on the second virtual display screen VS2 according to a preset mapping between positions in the touch area D4 and positions on the second virtual display screen VS2, and the control on which the operation is to be executed is then determined from the positions of the controls on the second virtual display screen VS2; a coordinate-mapping sketch is given after the description of the exit control D5 below. For example, the user may control a screen pointer on the second virtual display screen VS2 through the touch area D4 to perform the corresponding operation.
The exit control D5 is used to collapse the display control interface D0. For example, the user may collapse the display control interface D0 by clicking the exit control D5, and at this time, an icon of the two-screen control D is displayed on the first virtual display screen VS 1.
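As an illustration of the position mapping described for the touch area D4, the sketch below linearly maps a point inside D4 to a point on VS2 and then looks up which control contains it. The rectangle sizes, names, and the assumption of a simple proportional mapping are illustrative only and are not specified by the present disclosure.

    # Minimal sketch (assumed names/values, Python): map a touch point in the
    # touch area D4 to the corresponding position on the second virtual display
    # screen VS2, then pick the VS2 control whose bounds contain that position.
    class Rect:
        def __init__(self, x, y, w, h):
            self.x, self.y, self.w, self.h = x, y, w, h

        def contains(self, px, py):
            return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def map_touch_to_vs2(touch_xy, d4, vs2):
        """Proportionally map a point in D4 to the VS2 coordinate space."""
        u = (touch_xy[0] - d4.x) / d4.w
        v = (touch_xy[1] - d4.y) / d4.h
        return (vs2.x + u * vs2.w, vs2.y + v * vs2.h)

    def hit_control(point, controls):
        """Return the name of the VS2 control whose bounds contain the point."""
        for name, bounds in controls.items():
            if bounds.contains(*point):
                return name
        return None

    d4 = Rect(0, 0, 200, 120)      # assumed touch-area size
    vs2 = Rect(0, 0, 1920, 1080)   # assumed VS2 resolution
    controls = {"weather": Rect(100, 100, 400, 200)}
    p = map_touch_to_vs2((30, 20), d4, vs2)
    print(p, hit_control(p, controls))  # (288.0, 180.0) weather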
In one embodiment, the process control management of the first virtual display screen VS1 and the second virtual display screen VS2 is explained.
In some embodiments, after an application on the smart desk lamp is started, it generates a corresponding process on the smart desk lamp; thus, managing the application may include managing the process of the application.
In some embodiments, the applications on the intelligent desk lamp fall into three types. The first type is applications displayed only on the first virtual display screen VS1, which may include interactive applications developed by the maker of the intelligent desk lamp; the second type is applications displayed only on the second virtual display screen VS2, which may include third-party presentation applications; and the third type is applications displayed on both the first virtual display screen VS1 and the second virtual display screen VS2, which may include comprehensive applications developed by the maker of the intelligent desk lamp. The first and second types may be referred to as single-screen applications, and the third type as dual-screen applications. An interactive application is one that requires extensive user operation, such as a game application that the user must control in real time. A presentation application is one that requires no user operation, or only a small amount, such as a weather application that at most needs a geographic location to be entered. A comprehensive application is one that requires user operation in some scenarios and not in others, such as a notepad application: the user interacts with the application when entering an event, and after the event is entered the application only needs to display it.
In some embodiments, in order for the intelligent desk lamp to identify on which virtual display screen an application needs to be displayed, a screen identifier may be carried in the installation package of the application. The screen identifier may be a screen parameter: if the parameter value is VS1, the screen identifier is a first identifier, indicating that the application is of the type displayed on the first virtual display screen VS1, and a process generated after the application starts may be referred to as a first-type process; if the parameter value is VS2, the screen identifier is a second identifier, indicating that the application is of the type displayed on the second virtual display screen VS2, and a process generated after the application starts may be referred to as a second-type process; and if the parameter value is (VS1, VS2), the screen identifier is a third identifier, indicating that the application is of the type displayed on both the first virtual display screen VS1 and the second virtual display screen VS2. It should be noted that an application whose screen parameter takes the third value may, at a given moment, display interfaces on both display screens or on only one of them. After these applications are installed on the smart desk lamp, the screen identifiers may be stored in their installation information. In some embodiments, the installation package of an application may not carry a screen identifier, in which case the smart desk lamp may by default display the application on the first virtual display screen VS1 or the second virtual display screen VS2; after such applications are installed, the smart desk lamp may add a screen identifier to their installation information.
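Purely as an illustration of the screen identifier described above, the sketch below parses an assumed screen parameter from installation information into an application type; the field name, string values, and enum names are assumptions made for this sketch and are not specified by the present disclosure.

    # Minimal sketch (assumed field/value names, Python): derive the application
    # type from a "screen" parameter stored in the installation information.
    from enum import Enum

    class AppType(Enum):
        FIRST_SCREEN_ONLY = "VS1"     # first-type process after launch
        SECOND_SCREEN_ONLY = "VS2"    # second-type process after launch
        DUAL_SCREEN = "VS1,VS2"       # may show on one or both screens

    def screen_identifier(install_info):
        """Read the screen parameter; default to VS1 if the package carries none (assumed)."""
        value = install_info.get("screen", "VS1")
        normalized = value.replace("(", "").replace(")", "").replace(" ", "")
        return AppType(normalized)

    print(screen_identifier({"screen": "VS2"}))         # AppType.SECOND_SCREEN_ONLY
    print(screen_identifier({"screen": "(VS1, VS2)"}))  # AppType.DUAL_SCREEN
    print(screen_identifier({}))                        # AppType.FIRST_SCREEN_ONLY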
In some embodiments, a process management timing diagram for the applications installed on the smart desk lamp is shown in fig. 7. In fig. 7, the application management stacks, the application process management module, and the desktop projection control module may be software modules executed on the controller of the smart desk lamp.
In some embodiments, after the intelligent desk lamp is turned on, its controller may create three application management stacks, which respectively store the screen usage information of applications displayed only on the first virtual display screen VS1, of applications displayed only on the second virtual display screen VS2, and of applications that have display images on both the first virtual display screen VS1 and the second virtual display screen VS2. When an application is started, its screen usage information may be identical to the screen identifier in its installation information; if the user later switches the display screen on which the application is shown, the screen usage information changes with the user's operation and then differs from the screen identifier in the installation information.
In some embodiments, the controller of the intelligent desk lamp may not create the application management stacks immediately after startup. Instead, the first application management stack is created after the user starts an application displayed only on the first virtual display screen VS1; the second application management stack is created after the user starts an application displayed only on the second virtual display screen VS2; and the third application management stack is created after the user starts an application that has display images on both the first virtual display screen VS1 and the second virtual display screen VS2.
Taking the example that the intelligent desk lamp automatically creates three application management stacks after being started, in some embodiments, a user can click an application program from a first starting page or a second starting page to generate an application starting instruction of the application program.
In some embodiments, the controller of the intelligent desk lamp starts the application program in response to the application launch instruction, and the application program generates at least one launch interface upon launch. If the application program is a dual-screen application, two launch interfaces are generated, and the application program may set a screen identifier for each of the two launch interfaces, so that the controller can read the screen identifier corresponding to each launch interface and project the two launch interfaces onto the corresponding virtual display screens according to the screen identifiers. If the application program is a single-screen application, one launch interface is generated, and the application program may set, for the launch interface, a screen identifier that is the same as the screen identifier of the application program, so that the controller can read the screen identifier corresponding to the launch interface and project the launch interface onto the corresponding virtual display screen according to that identifier.
In some embodiments, the controller of the intelligent desk lamp acquires the screen identifier of the application program from the installation information of the application program in response to the application launch instruction. If the value of the screen identifier is VS1, the screen usage information of the application program is added to the first application management stack; if the value is VS2, the screen usage information is added to the second application management stack; if the value is (VS1, VS2), the screen usage information is added to the third application management stack.
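A minimal sketch of this routing step follows, assuming the identifier values from the preceding paragraphs; the function routeToStack and the string encoding of the identifiers are illustrative assumptions, not the disclosed implementation.

```kotlin
// Minimal sketch, with assumed names, of routing an application's screen usage
// information into the first, second, or third application management stack
// according to the screen identifier read from its installation information.
fun routeToStack(
    appName: String,
    screenId: String,                   // "VS1", "VS2", or "VS1,VS2"
    firstStack: ArrayDeque<String>,
    secondStack: ArrayDeque<String>,
    thirdStack: ArrayDeque<String>
) {
    when (screenId) {
        "VS1" -> firstStack.addLast(appName)
        "VS2" -> secondStack.addLast(appName)
        "VS1,VS2" -> thirdStack.addLast(appName)
        else -> firstStack.addLast(appName) // default screen when no identifier is carried
    }
}

fun main() {
    val first = ArrayDeque<String>()
    val second = ArrayDeque<String>()
    val third = ArrayDeque<String>()
    routeToStack("notes", "VS1", first, second, third)
    routeToStack("video", "VS1,VS2", first, second, third)
    println("first=$first second=$second third=$third")
}
```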
In some implementations, after an application is started, a user can return to a display interface where an icon of the application is located, and when the user clicks the application again on the display interface, an application entry instruction is input to the intelligent desk lamp, and the intelligent desk lamp redisplays the interface of the application according to the application entry instruction.
In some embodiments, after launching one application, the user may continue to launch other applications. After each application is started, the screen usage information of that application is stored in the corresponding application management stack. Within an application management stack, the screen usage information of each application is ordered according to the order in which the user opened the applications: the application opened first is at the bottom of the stack and the application opened last is at the top. As the user switches between applications, the order of the screen usage information in the application management stack is adjusted automatically with the user operations.
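The Kotlin sketch below illustrates this most-recently-used ordering under assumed names (AppStack, onLaunched, onSwitchedTo); it is a sketch of the described behaviour, not the disclosed code.

```kotlin
// Illustrative sketch (assumed names): keeping the screen usage information in an
// application management stack ordered by launch time, with the most recently
// used application moved to the top whenever the user switches back to it.
class AppStack {
    private val entries = ArrayDeque<String>() // index 0 = stack bottom, last index = stack top

    fun onLaunched(app: String) = entries.addLast(app)   // newly opened app goes on top

    fun onSwitchedTo(app: String) {                      // user switched back to an app
        if (entries.remove(app)) entries.addLast(app)    // move it to the top
    }

    fun topToBottom(): List<String> = entries.reversed()
}

fun main() {
    val stack = AppStack()
    listOf("clock", "notes", "browser").forEach(stack::onLaunched)
    stack.onSwitchedTo("clock")
    println(stack.topToBottom()) // [clock, browser, notes]
}
```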
In some embodiments, if the user opens many applications, the memory consumption of the intelligent desk lamp increases, and the user may perform process management on the intelligent desk lamp, for example, close some applications to release the memory resources they occupy.
In some embodiments, a clickable process control is provided on the first virtual display screen VS1 for managing the started applications.
In some embodiments, the user may also slide up from the bottom of the first virtual display screen VS1 to trigger a process control to manage the started application.
In some embodiments, after the user clicks the process control, the application process management module responds to the trigger signal of the process control and controls the intelligent desk lamp to display the process management page, on which the user can perform application management.
In some embodiments, in response to the trigger signal of the process control, the application process management module may read the screen usage information of the currently started applications stored in each application management stack and determine the processes running in the memory of the intelligent desk lamp. After the screen usage information of the currently started applications is read, it can be processed to generate a process management interface.
In order to help the user understand which applications are currently running on the first virtual display screen VS1 and which are running on the second virtual display screen VS2, the application windows of all the started applications may be displayed in an organized manner.
In some embodiments, the application processes may be presented in an organized manner by alternately ordering application processes with different screen usage information.
In some embodiments, the application process management module may calculate, according to the screen usage information of the started applications, the number of applications corresponding to each application management stack, obtaining the number M of one-screen applications, the number N of two-screen applications, and the number Q of one/two-screen applications, where a one-screen application is an application whose current display screen is the first virtual display screen VS1, a two-screen application is an application whose current display screen is the second virtual display screen VS2, and a one/two-screen application is an application whose current display screens are the first virtual display screen VS1 and the second virtual display screen VS2.
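For illustration, a minimal Kotlin sketch of computing the counts M, N and Q is given below; the Counts class, countByUsage function and the string encoding of screen usage are assumptions rather than the disclosed implementation.

```kotlin
// Minimal sketch, under assumed names, of computing the counts used by the
// alternating ordering: M one-screen apps (current screen VS1), N two-screen
// apps (current screen VS2), Q one/two-screen apps (current screens VS1 and VS2).
data class Counts(val m: Int, val n: Int, val q: Int)

fun countByUsage(usage: Map<String, String>): Counts {
    val m = usage.values.count { it == "VS1" }
    val n = usage.values.count { it == "VS2" }
    val q = usage.values.count { it == "VS1,VS2" }
    return Counts(m, n, q)
}

fun main() {
    val usage = mapOf("notes" to "VS1", "player" to "VS2", "draw" to "VS1,VS2", "cam" to "VS1")
    println(countByUsage(usage)) // Counts(m=2, n=1, q=1)
}
```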
In some embodiments, the method of alternately ordering different application processes may be as follows: the first type of process, the second type of process, and the third type of process have the same priority; the processes are alternately ordered in the sequence first type, second type, third type, and when one type is exhausted, the remaining types continue to be alternately ordered.
According to this ordering, if only the first type of process and the second type of process are running, then when M is greater than or equal to N and N is greater than zero, the first N first-type processes and the N second-type processes are alternately displayed on the first 2N sequence positions of the process management page, and the remaining M-N first-type processes are arranged in order after the first 2N sequence positions. When M is smaller than N, the M first-type processes and the first M second-type processes are alternately displayed on the first 2M sequence positions of the process management page, and the remaining N-M second-type processes are arranged in order after the first 2M sequence positions.
According to this ordering, if the running processes include a first type of process, a second type of process, and a third type of process, then when M is greater than or equal to N, N is greater than or equal to Q, and Q is greater than zero, the first Q first-type processes, the first Q second-type processes, and the Q third-type processes are alternately displayed on the first 3Q sequence positions of the process management page, the next N-Q first-type processes and N-Q second-type processes are alternately displayed on the 2(N-Q) sequence positions after the first 3Q sequence positions, and the remaining M-N first-type processes are displayed on the subsequent sequence positions. When M is smaller than N and greater than Q, the first Q first-type processes, the first Q second-type processes, and the Q third-type processes are alternately displayed on the first 3Q sequence positions, the next M-Q first-type processes and M-Q second-type processes are alternately displayed on the 2(M-Q) sequence positions after the first 3Q sequence positions, and the remaining N-M second-type processes are displayed on the subsequent sequence positions.
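The equal-priority alternation described above behaves like a round-robin interleave of the three process lists. The Kotlin sketch below is one possible rendering of that behaviour under assumed names (interleave); it is illustrative, not the patented algorithm itself.

```kotlin
// Sketch (assumed names) of the equal-priority alternation described above: take one
// interface from each process type in the order first, second, third; when a type runs
// out, keep alternating the remaining types; whatever is left is appended in order.
fun <T> interleave(vararg lists: List<T>): List<T> {
    val result = mutableListOf<T>()
    val iterators = lists.map { it.iterator() }
    var remaining = true
    while (remaining) {
        remaining = false
        for (iter in iterators) {
            if (iter.hasNext()) {
                result.add(iter.next())
                remaining = true
            }
        }
    }
    return result
}

fun main() {
    val firstType = listOf("A1", "A2", "A3")   // M = 3 one-screen processes
    val secondType = listOf("B1", "B2")        // N = 2 two-screen processes
    val thirdType = listOf("C1")               // Q = 1 one/two-screen process
    // First 3Q positions alternate all three types, the next 2(N-Q) positions
    // alternate the first and second types, the remaining M-N first-type processes follow.
    println(interleave(firstType, secondType, thirdType)) // [A1, B1, C1, A2, B2, A3]
}
```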
In some embodiments, the method of alternately ordering and displaying different application processes may also be as follows: the first type of process and the second type of process have the same priority, which is higher than that of the third type of process; the first-type and second-type processes are alternately ordered first, and the remaining processes are then alternately ordered with the third-type processes (or the second-type and first-type processes are alternated first, followed by the remaining processes alternating with the third-type processes).
According to this ordering, when M is less than N, several specific ordering approaches are as follows:
If M is smaller than N and the sum of M and Q is smaller than N, the application process management module may first alternately order the application processes of the two-screen applications and the application processes of the one-screen applications, and then alternately order the remaining application processes of the two-screen applications with the application processes of the one/two-screen applications. Because the sum of M and Q is smaller than N, some application processes of the two-screen applications cannot be alternated with application processes of different screen usage information; these application processes may be displayed directly in a superimposed manner, finally yielding a group of superimposed processes. For the ordering effect, see fig. 8; in fig. 8, the order of application processes with the same screen usage information is consistent with the order of the screen usage information within the corresponding application management stack.
If M is smaller than N and the sum of M and Q is equal to N, the application process management module may first alternately order the application processes of the two-screen applications and the application processes of the one-screen applications, and then alternately order the remaining application processes of the two-screen applications with the application processes of the one/two-screen applications, finally obtaining a group of alternately arranged windows. For the ordering effect, see fig. 9; in fig. 9, the order of application processes with the same screen usage information is consistent with the order of the screen usage information within the corresponding application management stack.
If M is smaller than N and the sum of M and Q is larger than N, the application process management module may first alternately order the application processes of the two-screen applications and the application processes of the one-screen applications, and then alternately order the remaining application processes of the two-screen applications with the application processes of the one/two-screen applications. Because the sum of M and Q is larger than N, some application processes of the one/two-screen applications cannot be alternated with application processes of different screen usage information; these application processes may be displayed directly in a superimposed manner, finally yielding a group of superimposed windows. For the ordering effect, see fig. 10; in fig. 10, the order of application processes with the same screen usage information is consistent with the order of the screen usage information within the corresponding application management stack.
If M is greater than N, the processes of the started applications may be ordered by a method similar to the ordering method used when M is smaller than N, distinguishing the cases where the sum of N and Q is smaller than M, equal to M, and greater than M.
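As a sketch only, the Kotlin function below renders the two-tier priority ordering for the M smaller than N case described above; the name orderWithPriority and the list parameters are assumptions, and the leftover handling simply appends the unpaired processes as one superimposed group.

```kotlin
// Sketch (assumed names) of the two-tier priority ordering for the M < N case
// described above: first alternate the two-screen processes with the one-screen
// processes, then alternate the remaining two-screen processes with the
// one/two-screen processes; whatever still has no partner is stacked at the end.
fun orderWithPriority(
    oneScreen: List<String>,      // M first-type processes
    twoScreen: List<String>,      // N second-type processes (M < N assumed here)
    oneTwoScreen: List<String>    // Q third-type processes
): List<String> {
    val result = mutableListOf<String>()
    val m = oneScreen.size
    // Alternate the first M two-screen processes with the M one-screen processes.
    for (i in 0 until m) { result.add(twoScreen[i]); result.add(oneScreen[i]) }
    // Alternate the remaining two-screen processes with the one/two-screen processes.
    val rest = twoScreen.drop(m)
    val pairs = minOf(rest.size, oneTwoScreen.size)
    for (i in 0 until pairs) { result.add(rest[i]); result.add(oneTwoScreen[i]) }
    // Leftover processes of whichever list is longer form one superimposed group.
    result.addAll(rest.drop(pairs))
    result.addAll(oneTwoScreen.drop(pairs))
    return result
}

fun main() {
    // M = 1, N = 3, Q = 1, so M + Q < N: one two-screen process ends up stacked last.
    println(orderWithPriority(listOf("A1"), listOf("B1", "B2", "B3"), listOf("C1")))
    // [B1, A1, B2, C1, B3]
}
```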
In some embodiments, after the controller sorts the first type of process, the second type of process and the third type of process according to the above-mentioned rule of alternating sorting, a process management page may be generated according to the sorted processes.
In some embodiments, in the process management page, the multiple processes are displayed in vertical superposition according to the above alternating order; the interface of a process ordered in front occludes a preset area of the interface of the process ordered behind it, where the preset area may be a lower area. Compared with tiling all the processes, this partially occluded superposition display can present more processes within a smaller area.
In some embodiments, application processes with different screen usage information may also be displayed in separate groups in an organized manner. After the application process management module obtains the screen usage information of the started applications from each application management stack, the application processes of the one-screen applications may be displayed in one superimposed group, the application processes of the two-screen applications in a second superimposed group, and the application processes of the one/two-screen applications in a third superimposed group, obtaining three groups of superimposed processes. The ordering effect can be seen in fig. 11; as shown in fig. 11, each type of process is arranged separately, and different types of processes are spaced a preset distance apart so that the user can easily distinguish them.
In fig. 11, the order of application processes of the same screen usage information coincides with the order of screen usage information within the respective application management stacks.
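To illustrate the grouped layout of fig. 11, the sketch below lays out each screen-usage group as its own partially overlapped column with a preset gap between groups; Window, layoutGroups and the numeric offsets are hypothetical choices, not values from the disclosure.

```kotlin
// Illustrative sketch (assumed names) of the grouped layout in fig. 11: each
// screen-usage group is stacked separately, with the interface in front hiding a
// preset (lower) area of the one behind it, and a preset gap between groups.
data class Window(val process: String, val x: Int, val y: Int)

fun layoutGroups(groups: List<List<String>>, overlapStep: Int = 40, groupGap: Int = 200): List<Window> {
    val windows = mutableListOf<Window>()
    var baseX = 0
    for (group in groups) {
        group.forEachIndexed { index, process ->
            // Later entries are drawn lower and behind, so only a strip of each stays visible.
            windows.add(Window(process, baseX, index * overlapStep))
        }
        baseX += groupGap // keep the groups a preset distance apart so they read as separate columns
    }
    return windows
}

fun main() {
    val oneScreen = listOf("notes", "camera")
    val twoScreen = listOf("player")
    val oneTwoScreen = listOf("draw")
    layoutGroups(listOf(oneScreen, twoScreen, oneTwoScreen)).forEach(::println)
}
```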
In some embodiments, after the application process management module obtains the order of all application processes of the started applications, a process management interface provided with the screen identifier VS1 may be generated according to the above order, and the process management interface is then sent to the desktop application control module according to the screen identifier VS1, so that the desktop application control module displays the process management interface on the first virtual display screen VS1.
For example, referring to fig. 12, through the process management page the user can manage the currently running processes, for example close a process or switch the display screen on which it is shown.
As shown in fig. 12, a plurality of processes currently running are displayed in a layered manner, and a user can select a process to be managed by scrolling up and down the processes. For example, the currently managed process is process one, and by scrolling down, the currently managed process can be switched to process two.
In some embodiments, after the current management process is switched, if the user clicks to select the current management process, the content of the current management process is directly displayed on the corresponding virtual display screen.
In addition, a tag A is provided in the corresponding area of each process, and tag A is used to identify on which virtual display screen the process is currently displayed. Tag A may be determined according to the screen identifier: if the screen identifier is VS1, tag A may display "one screen"; if the screen identifier is VS2, tag A may display "two screens"; and if the screen identifier is (VS1, VS2), tag A may display "one screen+two screens". After a process management instruction is received, the running processes and the positions at which they were presented before the instruction was received (the first virtual display screen VS1 and/or the second virtual display screen VS2) are acquired, and the tag sub-control of each process control displays content corresponding to that position, so that processes presented at different positions show different tag content. For example, the label A1 corresponding to process one in fig. 12 is "one screen", indicating that process one is displayed on the first virtual display screen VS1; the label A2 corresponding to process two is "two screens", indicating that process two is displayed on the second virtual display screen VS2; and the label A3 corresponding to process three is "one screen+two screens", indicating that process three is displayed on the first virtual display screen VS1 and the second virtual display screen VS2 at the same time. At this time, if the user clicks to select process one, the content corresponding to process one is directly displayed on the first virtual display screen VS1; if the user clicks to select process two, the content corresponding to process two is directly displayed on the second virtual display screen VS2; and if the user clicks to select process three, the content corresponding to process three is directly displayed on the first virtual display screen VS1 and the second virtual display screen VS2.
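A minimal, assumption-labelled sketch of deriving the tag content from the screen identifier follows; tagFor and the string values are illustrative only.

```kotlin
// Minimal sketch (assumed names) of deriving the tag A content shown on each
// process control from the process's screen identifier, as in fig. 12.
fun tagFor(screenId: String): String = when (screenId) {
    "VS1" -> "one screen"                   // process is displayed on the first virtual display screen
    "VS2" -> "two screens"                  // process is displayed on the second virtual display screen
    "VS1,VS2" -> "one screen+two screens"   // process is displayed on both screens
    else -> "unknown"
}

fun main() {
    println(tagFor("VS1"))      // one screen              (label A1 for process one)
    println(tagFor("VS2"))      // two screens             (label A2 for process two)
    println(tagFor("VS1,VS2"))  // one screen+two screens  (label A3 for process three)
}
```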
It will be appreciated that the currently running processes may also be displayed in other forms, for example, tiled in the form of multiple widgets, etc., and the display forms of the processes are not limited in this application.
In addition, the present application does not limit the expression form of the tag content, and the tag content may be any combination of numbers, letters, and characters, so long as the user can clearly and intuitively know on which virtual display screen the process is displayed according to the tag content. For example, in fig. 12, the tag content "one screen" may be "1", the tag content "two screens" may be "2", or the like.
In some embodiments, referring to fig. 12, after opening the process management page, the user may perform process shutdown management on the currently running process, in addition to seeing on which virtual display screen the process is displayed. In some embodiments, a user may shut down a running process through process shutdown control B.
For example, the user may close the first process by clicking the first process closing control B1, close the second process by clicking the second process closing control B2, and close the third process by clicking the third process closing control B3. Thus, the user can pertinently close the process which needs to be closed.
In some embodiments, referring to fig. 12, the process management page is further provided with a one-button closing control B0, and when the number of processes that need to be closed currently is large, the user can close all the processes that run currently by clicking the one-button closing control B0, without clicking the process closing control corresponding to each process one by one, so that the process closing efficiency can be improved.
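For illustration of the two closing paths above, the sketch below models a per-process close (controls B1, B2, B3) and a one-button close-all (control B0); ProcessManager and its method names are assumptions.

```kotlin
// Sketch (assumed names) of the close controls: control B1/B2/B3 closes one
// running process, control B0 closes all running processes in one tap.
class ProcessManager(initial: List<String>) {
    private val running = initial.toMutableList()

    fun closeOne(process: String) {        // per-process close control B1, B2, B3, ...
        running.remove(process)
    }

    fun closeAll() {                       // one-button close control B0
        running.clear()
    }

    fun runningProcesses(): List<String> = running.toList()
}

fun main() {
    val manager = ProcessManager(listOf("process one", "process two", "process three"))
    manager.closeOne("process two")
    println(manager.runningProcesses())    // [process one, process three]
    manager.closeAll()
    println(manager.runningProcesses())    // []
}
```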
After the user completes the process closing operation, the user can return to the page displayed before process management by clicking the return key. In some embodiments, after the process closing operation is completed, if the user clicks a blank area in the process management interface, the display returns to the educational interface of the home interface.
In some embodiments, after the user opens the process management page, in addition to performing process closing management on the currently running processes, the user may switch the virtual display screen corresponding to a currently running process. In some embodiments, the user may switch the display screen of a process through the application display screen switching control C.
In some embodiments, the application display screen switching control C may include a control C1 and a control C2. The control C1 may be triggered when a user operation of sliding to the left is received in the display area of the topmost process; after triggering, an instruction for switching the application display screen may be generated, and the topmost process may be switched to the second virtual display screen VS2 in response to the instruction. The control C2 may be triggered when a user operation of sliding to the right is received in the display area of the topmost process, and the topmost process may be switched to the first virtual display screen VS1 in response to the triggering. As shown in fig. 12, the control C1 and the control C2 may be presented on the process management page at the same time. As shown in fig. 13, only the control C1 may be displayed when the topmost application is a one-screen process. As shown in fig. 14, only the control C2 may be displayed when the topmost application is a two-screen process.
In some embodiments, when the currently managed process is process one, the label A1 corresponding to process one is "one screen", that is, process one is displayed on the first virtual display screen VS1 before the process management instruction is received. At this time, the user may execute the left-sliding operation corresponding to the control C1 to switch process one to the second virtual display screen VS2 for display; the first virtual display screen VS1 then no longer displays the content of process one, that is, process one is displayed on a different screen.
In some embodiments, when the currently managed process is process one, the user may execute the right-sliding operation corresponding to the control C2, and process one is still displayed on the first virtual display screen VS1.
In some embodiments, when the currently managed process is process two, the label A2 corresponding to process two is "two screens", that is, process two is displayed on the second virtual display screen VS2 before the process management instruction is received. At this time, the user may execute the right-sliding operation corresponding to the control C2 to switch process two to the first virtual display screen VS1 for display; the second virtual display screen VS2 then no longer displays the content of process two, that is, process two is displayed on a different screen.
In some embodiments, when the currently managed process is process two, the user may execute the left-sliding operation corresponding to the control C1, and process two is still displayed on the second virtual display screen VS2.
In some embodiments, when the currently managed process is process three, the label A3 corresponding to process three is "one screen+two screens", that is, process three is displayed on the first virtual display screen VS1 and the second virtual display screen VS2 before the process management instruction is received. At this time, the user may execute the left-sliding operation corresponding to the control C1 to switch process three to the second virtual display screen VS2 for display; the first virtual display screen VS1 then no longer displays the content of process three, that is, process three is switched from dual-screen display to single-screen (second virtual display screen VS2) display. At this time, the first virtual display screen VS1 may display a Launcher interface, for example an educational interface.
In some embodiments, the user may also execute the right-sliding operation corresponding to the control C2 to switch process three to the first virtual display screen VS1 for display; the second virtual display screen VS2 then no longer displays the content of process three, that is, process three is switched from dual-screen display to single-screen (first virtual display screen VS1) display. At this time, the second virtual display screen VS2 may display a Launcher interface or a time interface.
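The Kotlin sketch below summarizes the switching behaviour described for processes one, two and three: a left slide (control C1) ends with the process on VS2 only, and a right slide (control C2) ends with it on VS1 only. The function switchScreen and the string encoding are assumptions.

```kotlin
// Sketch (assumed names) of the display screen switch triggered by control C1
// (left slide) and control C2 (right slide) on the topmost managed process.
fun switchScreen(currentUsage: String, slideLeft: Boolean): String {
    val target = if (slideLeft) "VS2" else "VS1"
    // Sliding toward the screen the process already occupies leaves it unchanged.
    return if (currentUsage == target) currentUsage else target
}

fun main() {
    println(switchScreen("VS1", slideLeft = true))      // process one: VS1 -> VS2
    println(switchScreen("VS1", slideLeft = false))     // process one: stays on VS1
    println(switchScreen("VS2", slideLeft = false))     // process two: VS2 -> VS1
    println(switchScreen("VS1,VS2", slideLeft = false)) // process three: dual screen -> VS1 only
}
```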
In some embodiments, controls C1 and C2 are conditionally hidden/displayed, i.e., controls C1 and C2 are not always in a displayed state.
In some embodiments, control C1 is displayed when the currently managed process is currently displayed on first virtual display VS1 and can switch to display on second virtual display VS2 (e.g., process one in fig. 12), at which time control C2 is hidden.
When the currently managed process is currently displayed on the second virtual display screen VS2 and can be switched to the first virtual display screen VS1 for display (for example, the second process in fig. 12), the control C2 is displayed, and the control C1 is hidden.
When the currently managed process is currently displayed on the first virtual display screen VS1 and the second virtual display screen VS2, and can be individually switched to the first virtual display screen VS1 or the second virtual display screen VS2 for display (for example, the process three in fig. 12), the control C1 and the control C2 are displayed simultaneously.
When the currently managed process is currently displayed on the first virtual display screen VS1 and the second virtual display screen VS2, and cannot be individually switched to the first virtual display screen VS1 or the second virtual display screen VS2 for display (for example, the process three in fig. 12), the control C1 and the control C2 are hidden.
Therefore, by conditionally hiding and displaying the controls C1 and C2, interference with the user during display switching can be avoided.
In some embodiments, the user may also set the controls C1 and C2 to be always displayed or always hidden through the settings.
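The sketch below condenses the visibility rules above, including the user setting that forces both controls to be always shown or hidden; Visibility, controlVisibility and the switchable flag are illustrative assumptions.

```kotlin
// Sketch (assumed names) of the conditional hiding/showing of controls C1 and C2
// for the currently managed process, with an optional user setting that forces
// both controls to be always shown or always hidden.
enum class Visibility { SHOW_C1, SHOW_C2, SHOW_BOTH, HIDE_BOTH }

fun controlVisibility(
    onVs1: Boolean,             // process currently displayed on the first virtual display screen
    onVs2: Boolean,             // process currently displayed on the second virtual display screen
    switchable: Boolean,        // whether a dual-screen process may be switched to a single screen
    forced: Visibility? = null  // user setting: always displayed / always hidden
): Visibility = forced ?: when {
    onVs1 && onVs2 -> if (switchable) Visibility.SHOW_BOTH else Visibility.HIDE_BOTH
    onVs1 -> Visibility.SHOW_C1   // can switch to VS2, so only C1 is shown
    onVs2 -> Visibility.SHOW_C2   // can switch to VS1, so only C2 is shown
    else -> Visibility.HIDE_BOTH
}

fun main() {
    println(controlVisibility(onVs1 = true, onVs2 = false, switchable = true))  // SHOW_C1
    println(controlVisibility(onVs1 = false, onVs2 = true, switchable = true))  // SHOW_C2
    println(controlVisibility(onVs1 = true, onVs2 = true, switchable = true))   // SHOW_BOTH
    println(controlVisibility(onVs1 = true, onVs2 = true, switchable = false))  // HIDE_BOTH
}
```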
In one embodiment, a plurality of intelligent table lamps can form a communication system through network connection, a plurality of users corresponding to the plurality of intelligent table lamps can perform information interaction through the communication system, and the plurality of users can be users with different identity types. For example, the plurality of users may include a first number of first identity users, a second number of second identity users, and so on.
As can be seen from the foregoing embodiments, in the embodiments of the present application, the application processes running on the first display screen, the application processes running on the second display screen, and the application processes running on both the first display screen and the second display screen are alternately displayed in a superimposed manner on the process management interface, or displayed in superimposed groups according to display position. This achieves an organized display of the application processes, helps the user view the processes on each display screen, facilitates process management, and improves the user experience.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications and replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (9)

1. An intelligent display device, comprising:
the first projection mechanism is used for projecting on a first plane according to the control of the controller to form a first display picture, and the first plane is a desktop;
a second projection mechanism for forming a second display screen on a second plane according to the control of the controller;
a controller in communication with the first projection mechanism, the second projection mechanism, respectively, the controller configured to:
receiving a process management instruction input by a user;
responding to the process management instruction, and displaying a process management interface on the first display screen according to the types of a first type process and a second type process, wherein the first type process is a process which is only displayed on the first display screen and cannot be displayed on the second display screen, and the second type process is a process which is only displayed on the second display screen and cannot be displayed on the first display screen; in the process management interface, the interfaces of the first type of process and the interfaces of the second type of process are alternately arranged so that a user can conveniently check the processes on each display picture, and the interfaces arranged in the front process form shielding to the preset area arranged on the interface of the rear process in the two adjacent processes, and the interfaces arranged in the front process do not form shielding to the non-preset area arranged on the interface of the rear process;
And the non-preset area on the interface of each process is displayed with a label and a process closing control, wherein the label is used for indicating that the process is the first type process or the second type process, and the process closing control is used for closing the running process.
2. The smart display device according to claim 1, wherein displaying a process management page on the first display screen according to types of a first type of process and a second type of process includes:
determining M first-class processes and N second-class processes running in a background, wherein M is greater than or equal to N, and N is greater than zero;
and alternately displaying the interfaces of the first N first-type processes and the interfaces of the N second-type processes on the first 2N sequence positions of the process management page, wherein the interfaces of the remaining M-N first-type processes are arranged in order after the first 2N sequence positions.
3. An intelligent display device, comprising:
the first projection mechanism is used for projecting on a first plane according to the control of the controller to form a first display picture, and the first plane is a desktop;
a second projection mechanism for forming a second display screen on a second plane according to the control of the controller;
A controller in communication with the first projection mechanism, the second projection mechanism, respectively, the controller configured to:
receiving a process management instruction input by a user;
displaying a process management interface on the first display screen according to the types of a first type of process, a second type of process and a third type of process, wherein the first type of process is a process which is only displayed on the first display screen and cannot be displayed on the second display screen, the second type of process is a process which is only displayed on the second display screen and cannot be displayed on the first display screen, and the third type of process is a process which is simultaneously displayed on the first display screen and the second display screen;
in the process management interface, the interfaces of the first type of process, the interfaces of the second type of process and the interfaces of the third type of process are alternately arranged so that a user can conveniently view the processes on each display picture, and the interfaces of the front process in two adjacent processes form shielding to a preset area arranged on the interface of the rear process, and the interfaces of the front process do not form shielding to a non-preset area arranged on the interface of the rear process;
And the non-preset area on the interface of each process is displayed with a label and a process closing control, wherein the label is used for indicating that the process is the first type process or the second type process, and the process closing control is used for closing the running process.
4. The intelligent display device of claim 3, wherein displaying a process management page on the first display screen according to types of the first type of process, the second type of process, and the third type of process comprises:
determining M first-class processes, N second-class processes and Q third-class processes running in the background, wherein M is greater than or equal to N, N is greater than or equal to Q, and Q is greater than zero;
alternately displaying the interfaces of the first Q first-type processes, the interfaces of the first Q second-type processes and the interfaces of the Q third-type processes on the first 3Q sequence positions of the process management page, alternately displaying the interfaces of the next N-Q first-type processes and the interfaces of the N-Q second-type processes on the 2(N-Q) sequence positions after the first 3Q sequence positions, and displaying the interfaces of the remaining M-N first-type processes on the subsequent sequence positions.
5. The intelligent display device according to claim 4, wherein screen identifiers are displayed on the interfaces of each process in the process management page, the screen identifiers on the interfaces of the first type of process are first identifiers, the screen identifiers on the interfaces of the second type of process are second identifiers, the screen identifiers on the interfaces of the third type of process are third identifiers, and the controller is configured to determine the display screen of each process according to the screen identifiers.
6. The smart display device of claim 5, wherein the process management page is provided with an application display screen switching control, the controller further configured to:
receiving a trigger instruction of the application display screen switching control, wherein the trigger instruction is used for switching a display area of a process positioned at the topmost layer from a first virtual display screen to a second virtual display screen or from the second virtual display screen to the first virtual display screen;
and responding to a trigger instruction of the application display screen switching control, and changing a screen identifier arranged on an interface of the current process at the forefront in the process management page from the current first identifier to the second identifier or from the current second identifier to the first identifier.
7. An intelligent display device, comprising:
the first projection mechanism is used for forming a first display picture on a first plane according to the control of the controller, and the first plane is a desktop;
a second projection mechanism for forming a second display screen on a second plane according to the control of the controller;
a controller in communication with the first projection mechanism, the second projection mechanism, respectively, the controller configured to:
Receiving a process management instruction input by a user;
responding to the process management instruction, and displaying a process management interface on the first display screen according to the types of a first type process and a second type process, wherein the first type process is a process which is only displayed on the first display screen and cannot be displayed on the second display screen, and the second type process is a process which is only displayed on the second display screen and cannot be displayed on the first display screen;
the process management interface comprises a first area and a second area, wherein the first area is used for displaying a first type of process, the second area is used for displaying a second type of process, and the first area and the second area are spaced by a preset distance so that the first type of process and the second type of process can be simultaneously and sectionally displayed in the process management interface;
the plurality of first type processes are independently arranged in a column in the first area, the interface of a first type process arranged in front forms shielding on a preset area on the interface of the process arranged behind it, and the interface of the first type process arranged in front does not form shielding on a non-preset area on the interface of the first type process arranged behind it;
the plurality of second type processes are independently arranged in a column in the second area, the interface of a second type process arranged in front forms shielding on a preset area on the interface of the process arranged behind it, and the interface of the second type process arranged in front does not form shielding on a non-preset area on the interface of the second type process arranged behind it;
and the non-preset area on the interface of each process is displayed with a label and a process closing control, wherein the label is used for indicating that the process is the first type process or the second type process, and the process closing control is used for closing the running process.
8. The smart display device of claim 7, wherein the smart display device further operates a third process, the third process being a process that is displayed on the first display screen and the second display screen simultaneously;
displaying a process management interface on the first display screen according to the types of the first type of process and the second type of process, wherein the process management interface comprises: displaying a process management interface on the first display screen according to the types of the first type of process, the second type of process and the third type of process;
in the process management interface, the first type of process is arranged independently, and the second type of process is arranged independently at a preset distance from the first type of process, including: in the process management interface, the interfaces of the first type of process are arranged independently, the interfaces of the second type of process and the interfaces of the first type of process are arranged independently at intervals of a preset distance, and the interfaces of the third type of process and the interfaces of the second type of process are arranged independently at intervals of a preset distance.
9. An application management method for an intelligent display device, wherein the intelligent display device is provided with a first display screen and a second display screen, the application management method comprising:
receiving a process management instruction input by a user;
in response to the process management instruction, displaying a process management interface on the first display screen according to the types of a first process and a second process, wherein the first process is a process which is only displayed on the first display screen and cannot be displayed on the second display screen, the second process is a process which is only displayed on the second display screen and cannot be displayed on the first display screen, in the process management interface, the interfaces of the first process and the interfaces of the second process are alternately arranged, and in the adjacent two processes, the interfaces arranged in the front process form a shielding for a preset area arranged on the interface of the rear process, and the interface arranged in the front process does not form a shielding for a non-preset area arranged on the interface of the rear process;
and the non-preset area on the interface of each process is displayed with a label and a process closing control, wherein the label is used for indicating that the process is the first type process or the second type process, and the process closing control is used for closing the running process.
CN202110528461.6A 2020-05-14 2021-05-14 Intelligent display device and application management method Active CN113676710B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2020104080814 2020-05-14
CN202010408081 2020-05-14

Publications (2)

Publication Number Publication Date
CN113676710A CN113676710A (en) 2021-11-19
CN113676710B true CN113676710B (en) 2024-03-29

Family

ID=76929281

Family Applications (4)

Application Number Title Priority Date Filing Date
CN202110360544.9A Active CN113676709B (en) 2020-05-14 2021-04-02 Intelligent projection equipment and multi-screen display method
CN202110522907.4A Active CN113178105B (en) 2020-05-14 2021-05-13 Intelligent display device and exercise record acquisition method
CN202110529750.8A Active CN113194302B (en) 2020-05-14 2021-05-14 Intelligent display device and starting control method
CN202110528461.6A Active CN113676710B (en) 2020-05-14 2021-05-14 Intelligent display device and application management method

Family Applications Before (3)

Application Number Title Priority Date Filing Date
CN202110360544.9A Active CN113676709B (en) 2020-05-14 2021-04-02 Intelligent projection equipment and multi-screen display method
CN202110522907.4A Active CN113178105B (en) 2020-05-14 2021-05-13 Intelligent display device and exercise record acquisition method
CN202110529750.8A Active CN113194302B (en) 2020-05-14 2021-05-14 Intelligent display device and starting control method

Country Status (1)

Country Link
CN (4) CN113676709B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007184002A (en) * 2007-04-02 2007-07-19 Fujitsu Ltd Multiprocess management device and computer-readable recording medium
CN104461001A (en) * 2014-12-02 2015-03-25 联想(北京)有限公司 Information processing method and electronic equipment
CN106020796A (en) * 2016-05-09 2016-10-12 北京小米移动软件有限公司 Interface display method and device
WO2018119905A1 (en) * 2016-12-29 2018-07-05 深圳前海达闼云端智能科技有限公司 Control method and control device for multisystem mobile terminal, and electronic device
WO2018213451A1 (en) * 2017-05-16 2018-11-22 Apple Inc. Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects
CN110008011A (en) * 2019-02-28 2019-07-12 维沃移动通信有限公司 A kind of target switching method and terminal device
CN110062288A (en) * 2019-05-21 2019-07-26 广州视源电子科技股份有限公司 Application management method, device, user terminal, multimedia terminal and storage medium

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0959452A3 (en) * 1998-05-23 1999-12-22 Mannesmann VDO Aktiengesellschaft Method of displaying variable information
US7469381B2 (en) * 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US10503342B2 (en) * 2006-08-04 2019-12-10 Apple Inc. User interface spaces
EP2058733A3 (en) * 2007-11-09 2009-09-02 Avro Computing Inc. Multi-tier interface for management of operational structured data
CN102495711B (en) * 2011-11-15 2017-05-17 中兴通讯股份有限公司 Virtual multi-screen implementation method and device
JP2014170147A (en) * 2013-03-05 2014-09-18 Funai Electric Co Ltd Projector
US20140344765A1 (en) * 2013-05-17 2014-11-20 Barnesandnoble.Com Llc Touch Sensitive UI Pinch and Flick Techniques for Managing Active Applications
US9798355B2 (en) * 2014-12-02 2017-10-24 Lenovo (Beijing) Co., Ltd. Projection method and electronic device
KR102350382B1 (en) * 2015-07-16 2022-01-13 삼성전자 주식회사 Display apparatus and control method thereof
US10209851B2 (en) * 2015-09-18 2019-02-19 Google Llc Management of inactive windows
CN106647934A (en) * 2016-08-04 2017-05-10 刘明涛 Projection microcomputer
CN112363657A (en) * 2016-10-26 2021-02-12 海信视像科技股份有限公司 Gesture erasing method and device
CN106598161A (en) * 2017-01-03 2017-04-26 蒲婷 Modular portable optical computer
CN106782268B (en) * 2017-01-04 2020-07-24 京东方科技集团股份有限公司 Display system and driving method for display panel
CN107132981B (en) * 2017-03-27 2019-03-19 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture
CN107197223A (en) * 2017-06-15 2017-09-22 北京有初科技有限公司 The gestural control method of micro-projection device and projector equipment
JP2019082649A (en) * 2017-10-31 2019-05-30 アルプスアルパイン株式会社 Video display system
CN108111758A (en) * 2017-12-22 2018-06-01 努比亚技术有限公司 A kind of shooting preview method, equipment and computer readable storage medium
CN108334229B (en) * 2018-01-31 2021-12-14 广州视源电子科技股份有限公司 Method, device and equipment for adjusting writing track and readable storage medium
CN108513070B (en) * 2018-04-04 2020-09-04 维沃移动通信有限公司 Image processing method, mobile terminal and computer readable storage medium
CN108769506B (en) * 2018-04-16 2020-04-21 Oppo广东移动通信有限公司 Image acquisition method and device, mobile terminal and computer readable medium
CN108874341B (en) * 2018-06-13 2021-09-14 深圳市东向同人科技有限公司 Screen projection method and terminal equipment
CN108874342B (en) * 2018-06-13 2021-08-03 深圳市东向同人科技有限公司 Projection view switching method and terminal equipment
CN108920016B (en) * 2018-08-15 2021-12-28 京东方科技集团股份有限公司 Touch display device, touch display client and touch information processing device
CN110347305A (en) * 2019-05-30 2019-10-18 华为技术有限公司 A kind of VR multi-display method and electronic equipment
CN115629730A (en) * 2019-07-23 2023-01-20 华为技术有限公司 Display method and related device
CN110941383B (en) * 2019-10-11 2021-08-10 广州视源电子科技股份有限公司 Double-screen display method, device, equipment and storage medium
CN110908574B (en) * 2019-12-04 2020-12-29 深圳市超时空探索科技有限公司 Display adjusting method, device, terminal and storage medium

Also Published As

Publication number Publication date
CN113178105A (en) 2021-07-27
CN113676709A (en) 2021-11-19
CN113178105B (en) 2022-05-24
CN113676709B (en) 2023-10-27
CN113676710A (en) 2021-11-19
CN113194302B (en) 2022-06-21
CN113194302A (en) 2021-07-30

Similar Documents

Publication Publication Date Title
CN107333047B (en) Shooting method, mobile terminal and computer readable storage medium
CN103870233A (en) Display device, and method of controlling display device
CN103197778A (en) Display device, projector, display system, and method of switching device
CN103488388A (en) Display control apparatus and control method thereof
JP6255935B2 (en) Handwriting reproducing apparatus and program
CN112383802B (en) Focus switching method, projection display device and system
CN112399212A (en) Display device, file sharing method and server
CN102736726A (en) Stealth technology for keyboard and mouse
CN113485604B (en) Interactive terminal, interactive system, interactive method and computer readable storage medium
CN109388321B (en) Electronic whiteboard operation method and device
US20150261385A1 (en) Picture signal output apparatus, picture signal output method, program, and display system
JP2017182110A (en) Display system, display device, information processor, and information processing method
CN112068741A (en) Display device and display method for Bluetooth switch state of display device
CN111580903A (en) Real-time voting method, device, terminal equipment and storage medium
CN113676710B (en) Intelligent display device and application management method
CN112269553B (en) Display system, display method and computing device
CN102289283A (en) Status change of adaptive device
CN106155533B (en) A kind of information processing method and projection device
CN110333780A (en) Function triggering method, device, equipment and storage medium
JP6260210B2 (en) Display system, display method and program
JP6187172B2 (en) Receiving device, receiving system, and program
CN112399071B (en) Control method and device for camera motor and display equipment
CN101876870A (en) Display terminal and method for operating display frame
CN112073779B (en) Display device and fault-tolerant method for key transmission
WO2023142056A1 (en) Information display methods and display devices, electronic device, and computer-readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant