CN113282174A - Terminal information display method, device, terminal and storage medium - Google Patents

Terminal information display method, device, terminal and storage medium

Info

Publication number: CN113282174A (application CN202110584897.7A; granted as CN113282174B)
Authority: CN (China)
Prior art keywords: user, module, terminal, aerial imaging
Legal status: Granted, currently active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh); other versions: CN113282174B (English)
Inventor: 李贝 (Li Bei)
Current and original assignee: China United Network Communications Group Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by China United Network Communications Group Co Ltd; priority to CN202110584897.7A (the priority date is an assumption and is not a legal conclusion); publication of CN113282174A; application granted; publication of CN113282174B

Classifications

    All classifications fall under section G (Physics), class G06 (Computing; calculating or counting), subclass G06F (Electric digital data processing):
    • G06F21/31 User authentication (under G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals; G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity)
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting (under G06F18/21 Design or setup of recognition systems or techniques; G06F18/00 Pattern recognition)
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality (under G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer)
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures (under G06F3/01)

Abstract

The application provides a terminal information display method, a device, a terminal, and a storage medium. After the user passes authentication, an aerial imaging module is started to form an image in the air based on the picture displayed on the terminal's display screen; the module can also magnify the image in the air, and the imaging height is adjustable. If an air interaction request input by the user is then obtained, an air interaction module is started to interact with the user according to the request. By imaging in the air based on the picture displayed on the terminal screen, magnifying the image, and adjusting the imaging height, the terminal mitigates the problems caused by the small display screens of existing terminal devices, such as declining user eyesight and physical strain. In addition, the user does not need to touch the terminal display screen directly, which reduces wear on the screen, and the user can operate on the image while comfortably viewing the aerial image, meeting practical application needs.

Description

Terminal information display method, device, terminal and storage medium
Technical Field
The present application relates to the field of aerial imaging technologies, and in particular, to a method and an apparatus for displaying terminal information, a terminal, and a storage medium.
Background
With continuing economic and technological progress, more and more users rely on terminal devices, such as mobile phones, tablet computers (pads), point-and-read machines, and touch-screen computers, for work or study. Terminal devices have thus become an essential part of daily life.
Currently, when using a terminal device, a user typically operates directly on its display screen, for example to watch videos, edit files, or play games.
However, because the display screen of a terminal device is small, prolonged use can impair the user's eyesight, and the need to look down at the small screen for long periods easily causes physical problems such as cervical (neck) pain. In addition, the force and frequency with which users touch the display screen wear the screen to a certain extent and shorten the service life of the terminal device.
Disclosure of Invention
To solve these problems in the prior art, the present application provides a terminal information display method, a device, a terminal, and a storage medium.
In a first aspect, an embodiment of the present application provides a terminal information display method, which is applied to a terminal and includes the following steps:
acquiring an identity of a user, and authenticating the user according to the identity;
if the user authentication is passed, starting an aerial imaging module, wherein the aerial imaging module is used for carrying out aerial imaging based on a picture displayed in a display screen of the terminal;
generating a prompt whether to carry out air interaction on the display screen;
if an air interaction request input by the user according to the prompt is obtained, starting an air interaction module according to the air interaction request, wherein the air interaction module is used for recognizing one or more of the user's face, gestures, voice, and body motion and determining a corresponding interaction instruction according to the recognition result;
and interacting with the user according to the interaction instruction and the aerial imaging module.
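The claimed steps above can be sketched in code. This is a minimal, hypothetical illustration of the control flow only; the class and method names are assumptions, since the patent defines steps, not an API.

```python
# Hypothetical sketch of the first-aspect method flow; names are illustrative.

class Terminal:
    def __init__(self, legitimate_ids):
        # Pre-stored legitimate user identities used by the authentication step
        self.legitimate_ids = set(legitimate_ids)
        self.aerial_imaging_on = False
        self.air_interaction_on = False

    def display_info(self, identity, interaction_requested):
        # Step 1: authenticate the user according to the identity
        if identity not in self.legitimate_ids:
            return "authentication failed"
        # Step 2: start the aerial imaging module
        self.aerial_imaging_on = True
        # Step 3: generate a prompt on the display screen
        prompt = "Carry out air interaction?"
        # Step 4: start the air interaction module if the user requests it
        if interaction_requested:
            self.air_interaction_on = True
        return prompt
```

A passing identity starts aerial imaging and, on request, air interaction; any other identity short-circuits at step 1.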
In one possible implementation, after the starting the aerial imaging module, the method further includes:
judging whether the imaging height input by the user is received or not;
and if the imaging height input by the user is received, acquiring the current height of aerial imaging of the aerial imaging module, and controlling a terminal support to adjust the heights of the aerial imaging module and the display screen according to the imaging height input by the user and the current height of aerial imaging so as to adjust the height of aerial imaging.
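The height-adjustment logic above can be sketched as a comparison between the requested and current imaging heights. The units (centimeters) and the function signature are assumptions; the patent specifies neither.

```python
# Hypothetical sketch: signed offset the terminal stand should move so that
# the aerial image reaches the height the user input.

def stand_offset(requested_height_cm, current_height_cm):
    """Signed distance (cm) to move the stand; positive raises it."""
    if requested_height_cm is None:   # no imaging height received: no change
        return 0.0
    return float(requested_height_cm - current_height_cm)
```

The terminal would feed this offset to the stand controller to raise or lower both the aerial imaging module and the display screen together.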
In one possible implementation, the aerial imaging module includes a converging lens array, a slab lens array, and an aerial imaging array;
the convergent lens array is used for magnifying a picture displayed in the display screen into a virtual image, the flat lens array is used for magnifying the virtual image into a real image, and the aerial imaging array is used for reflecting the real image for at least two times to obtain an aerial real image corresponding to the picture displayed in the display screen.
In one possible implementation, the aerial imaging array includes any one of an afocal lens array, a stereoscopic mirror array that causes at least two reflections, and a combination of a retroreflective sheet and a half mirror.
In a possible implementation manner, before the starting the air interaction module according to the air interaction request, the method further includes:
performing one or more of face recognition training, gesture recognition training, voice recognition training and somatosensory recognition training on the air interaction module;
the starting an air interaction module according to the air interaction request comprises:
and starting the trained air interaction module according to the air interaction request.
In a possible implementation manner, the authenticating the user according to the identity includes:
judging whether the identity is one of the pre-stored legal user identities;
and if the identity is one of the pre-stored legal user identities, judging that the user authentication is passed.
In a possible implementation manner, a protective film is arranged on the display screen, and a protective shell is arranged on the terminal.
In a second aspect, an embodiment of the present application provides a terminal information display apparatus, where the apparatus is applied to a terminal, and the apparatus includes:
the authentication module is used for acquiring an identity of a user and authenticating the user according to the identity;
the first starting module is used for starting an aerial imaging module if the user authentication passes, and the aerial imaging module is used for carrying out aerial imaging based on a picture displayed in a display screen of the terminal;
the prompting module is used for generating a prompt for whether to carry out air interaction on the display screen;
the second starting module is used for starting the air interaction module according to an air interaction request if the air interaction request input by the user according to the prompt is obtained, and the air interaction module is used for recognizing one or more of the user's face, gestures, voice, and body motion and determining a corresponding interaction instruction according to the recognition result;
and the interaction module is used for interacting with the user according to the interaction instruction and the aerial imaging module.
In a possible implementation manner, the system further includes an adjusting module, configured to determine whether the imaging height input by the user is received after the aerial imaging module is started by the first starting module;
and if the imaging height input by the user is received, acquiring the current height of aerial imaging of the aerial imaging module, and controlling a terminal support to adjust the heights of the aerial imaging module and the display screen according to the imaging height input by the user and the current height of aerial imaging so as to adjust the height of aerial imaging.
In one possible implementation, the aerial imaging module includes a converging lens array, a slab lens array, and an aerial imaging array;
the convergent lens array is used for magnifying a picture displayed in the display screen into a virtual image, the flat lens array is used for magnifying the virtual image into a real image, and the aerial imaging array is used for reflecting the real image for at least two times to obtain an aerial real image corresponding to the picture displayed in the display screen.
In one possible implementation, the aerial imaging array includes any one of an afocal lens array, a stereoscopic mirror array that causes at least two reflections, and a combination of a retroreflective sheet and a half mirror.
In one possible implementation, the aerial imaging module comprises a light-transmitting laminated body with mutually attached transparent strips, and a convex lens laminated body;
the first starting module is specifically configured to:
and starting the light-transmitting laminated body, the transparent strips mutually attached and the convex lens laminated body to realize the transformation and amplification of the propagation direction of the reflected light in the aerial imaging module and form an aerial real image corresponding to the picture displayed in the display screen.
In a possible implementation manner, the second starting module is specifically configured to:
performing one or more of face recognition training, gesture recognition training, voice recognition training and somatosensory recognition training on the air interaction module;
and starting the trained air interaction module according to the air interaction request.
In a possible implementation manner, the authentication module is specifically configured to:
judging whether the identity is one of the pre-stored legal user identities;
and if the identity is one of the pre-stored legal user identities, judging that the user authentication is passed.
In a possible implementation manner, a protective film is arranged on the display screen, and a protective shell is arranged on the terminal.
In a third aspect, an embodiment of the present application provides a terminal, including:
an aerial imaging module; an air interaction module; a processor;
a memory; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program causes a server to execute the method according to the first aspect.
In a fifth aspect, the present application provides a computer program product comprising computer instructions which, when executed by a processor, perform the method of the first aspect.
According to the terminal information display method, device, terminal, and storage medium provided above, the terminal obtains the user's identity and authenticates the user with it. After authentication passes, the aerial imaging module is started to image in the air based on the picture displayed on the terminal's display screen, and a prompt asking whether to carry out air interaction is then shown on the display screen. If an air interaction request input by the user is obtained, the air interaction module is started according to that request and determines the corresponding interaction instructions for interacting with the user. Imaging in the air based on the displayed picture mitigates the problems, such as declining user eyesight and physical strain, caused by the small display screens of existing terminal devices. Moreover, because interaction takes place through the air interaction module and the aerial imaging module, the user does not need to touch the display screen directly, which reduces wear on the screen to a certain extent, while the user can operate on the image while comfortably viewing the aerial image, meeting practical application needs.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of a terminal used by a current user according to an embodiment of the present application;
fig. 2 is a schematic diagram of a terminal information display system according to an embodiment of the present application;
FIG. 3 is a schematic view of aerial imaging provided by an embodiment of the present application;
fig. 4 is a schematic flowchart of a terminal information display method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an aerial imaging module provided in an embodiment of the present application;
FIG. 6 is a schematic view of another aerial imaging system provided by embodiments of the present application;
fig. 7 is a schematic flowchart of another terminal information display method according to an embodiment of the present application;
FIG. 8 is a schematic view of yet another aerial imaging system provided by an embodiment of the present application;
fig. 9 is a schematic diagram of a terminal information display provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal information display device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of another terminal information display device according to an embodiment of the present application;
fig. 12A is a schematic diagram of a basic hardware architecture of a terminal provided in the present application;
fig. 12B is a schematic diagram of a basic hardware architecture of another terminal provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments derived by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
The terms "first," "second," "third," and "fourth," if any, in the description and claims of this application and the above-described figures are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
At present, with the popularization of terminals, their use is increasingly common. Because terminal screens are small and users, drawn by services such as games, spend long hours on them, the number of myopia cases grows day by day and skews toward ever younger users. Moreover, long-term terminal use with incorrect posture, for example lowering the head to look down at the screen, can produce a fatty hump at the base of the neck (a so-called "dowager's hump"), and cervical spondylosis is also on the rise. As shown in fig. 1, taking a mobile phone as the terminal, looking down at the phone for a long time causes cervical discomfort. In addition, the screen is the terminal component with the highest failure rate: the force and frequency with which users touch it wear the screen to a certain extent and eventually damage it.
To solve the above problems, an embodiment of the present application provides a terminal information display method that performs aerial imaging based on the picture displayed on the terminal display screen, mitigating the problems, such as declining user eyesight and physical strain, caused by the small display screens of existing terminal devices. Moreover, the embodiment realizes interaction with the user through the air interaction module and the aerial imaging module, so the user does not need to touch the terminal display screen directly, which reduces wear on the screen to a certain extent, while the user can operate on the image while comfortably viewing the aerial image, meeting application needs.
Optionally, the terminal information display method provided in the present application may be applied to the schematic structural diagram of the terminal information display system shown in fig. 2, and as shown in fig. 2, the system may include a processor 21, an aerial imaging module 22, and an aerial interaction module 23. The terminal information display system may be the terminal itself, or a chip or an integrated circuit that realizes the functions of the terminal.
It is to be understood that the illustrated structure of the embodiment of the present application does not form a specific limitation to the architecture of the terminal information display system. In other possible embodiments of the present application, the foregoing architecture may include more or less components than those shown in the drawings, or combine some components, or split some components, or arrange different components, which may be determined according to practical application scenarios, and is not limited herein. The components shown in fig. 2 may be implemented in hardware, software, or a combination of software and hardware.
In a specific implementation process, the processor 21 may start the aerial imaging module 22, and perform aerial imaging based on a picture displayed in a display screen of the terminal by using the aerial imaging module 22, so as to solve the problems of decreased vision, body damage and the like of a user caused by a small display screen of the existing terminal device. The aerial imaging module 22 may include a convergent lens array, a flat lens array, and an aerial imaging array, where the convergent lens array is configured to magnify a picture displayed in the display screen into a virtual image, the flat lens array is configured to magnify the virtual image into a real image, and the aerial imaging array is configured to obtain an aerial real image corresponding to the picture displayed in the display screen after reflecting the real image at least twice. For example, as shown in fig. 3, here, the processor 21 may start the aerial imaging module 22, and form an aerial real image corresponding to a screen displayed in the display screen of the terminal using the above-described convergent lens array, the flat lens array, and the aerial imaging array.
The processor 21 can also start the air interaction module 23, through which the user can interact with the terminal together with the aerial imaging module 22 without directly touching the terminal display screen, reducing wear on the screen to a certain extent. Furthermore, the air interaction module 23 can recognize one or more of the user's face, gestures, voice, and body motion, and determine a corresponding interaction instruction according to the recognition result, so that the processor 21 can interact with the user according to the interaction instruction and the aerial imaging module 22. The user can thus operate on the image while comfortably viewing the aerial image, meeting application needs.
The air interaction module 23 may employ depth-sensing technology, with a built-in color camera, infrared emitter, microphone array, and the like, so as to capture, detect, and track hands, fingers, and finger-like tools in real time, sense the user's position, actions, sounds, and so on, and recognize faces, gestures, voice, or body motion. Illustratively, face recognition may be implemented by the air interaction module 23 by establishing a portrait file, reading a captured portrait, and comparing the two. Gesture recognition may be implemented using a right-handed Cartesian coordinate system. Voice recognition may be implemented, after continuous training, by converting sound into an electrical signal, preprocessing it, extracting feature values, and comparing them with a preset speech model library.
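Once a face, gesture, voice, or motion has been recognized, the module maps the result to an interaction instruction. The sketch below illustrates one plausible dispatch table; the gesture names and instruction strings are assumptions, not values from the patent.

```python
# Hypothetical mapping from a recognition result to an interaction instruction,
# matching the described role of the air interaction module 23.

GESTURE_TO_INSTRUCTION = {
    "swipe_left":  "previous_page",
    "swipe_right": "next_page",
    "pinch_out":   "zoom_in",
    "pinch_in":    "zoom_out",
}

def interaction_instruction(recognition_result):
    # Unrecognized results map to a no-op instead of raising an error,
    # so stray motions near the aerial image do not trigger actions.
    return GESTURE_TO_INSTRUCTION.get(recognition_result, "no_op")
```

The processor 21 would then execute the returned instruction against the aerial image, for example re-rendering the magnified picture after a zoom.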
In this embodiment of the present application, the terminal may be a handheld device, a vehicle-mounted device, a wearable device, a computing device, and various forms of User Equipment (UE), and the like.
In addition, the system architecture and the service scenario described in the embodiment of the present application are for more clearly illustrating the technical solution of the embodiment of the present application, and do not constitute a limitation to the technical solution provided in the embodiment of the present application, and it can be known by a person skilled in the art that the technical solution provided in the embodiment of the present application is also applicable to similar technical problems along with the evolution of the network architecture and the appearance of a new service scenario.
The technical solutions of the present application are described below with several embodiments as examples, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 4 is a schematic flowchart of a terminal information display method provided in an embodiment of the present application. The execution subject of this embodiment may be the terminal in the embodiment shown in fig. 2, or further the processor in the terminal. As shown in fig. 4, the method may include:
s401: and acquiring the identity of the user, and authenticating the user according to the identity.
Here, the user may be understood as a user using the terminal. When the user uses the terminal, the processor firstly acquires the identity of the user, wherein the identity can be a face image and a fingerprint of the user, or an unlocking number and a pattern input by the user.
For example, after obtaining the user's identity, the processor may determine whether it is one of the pre-stored legal user identities. If so, the processor judges that the user authentication is passed; if not, the processor judges that the user authentication is not passed.
The pre-stored legal user identity may be stored in the terminal in advance for the user, and may include a face image and a fingerprint of the user, or an unlocking number and a pattern set by the user. If the identity is the same as one of the pre-stored legal user identities, namely the identity is one of the pre-stored legal user identities, the processor judges that the user authentication is passed, and otherwise, the processor judges that the user authentication is not passed.
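One plausible realization of this legitimacy check stores the legal identities as SHA-256 hashes rather than raw values and tests membership against the hashes. The hashing is an implementation assumption; the patent only requires that the identity match one of the pre-stored legal identities.

```python
# Hypothetical legitimacy check against pre-stored hashed identities.
import hashlib

def is_legitimate(identity: bytes, stored_hashes: set) -> bool:
    """True if the hash of the captured identity is among the stored hashes."""
    return hashlib.sha256(identity).hexdigest() in stored_hashes

# Pre-store the hash of an unlocking number the user set earlier.
stored = {hashlib.sha256(b"1234").hexdigest()}
```

Hashing means the terminal never needs to keep the raw unlocking number or biometric template in plain form.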
S402: and if the user authentication is passed, starting an aerial imaging module, wherein the aerial imaging module is used for carrying out aerial imaging on the basis of the picture displayed in the display screen of the terminal.
If the user authentication is passed, the processor may start an aerial imaging module to perform aerial imaging on a picture displayed in a display screen of the terminal.
The aerial imaging module can comprise a convergent lens array, a flat lens array and an aerial imaging array, the convergent lens array is used for magnifying a picture displayed in the display screen into a virtual image, the flat lens array is used for magnifying the virtual image into a real image, and the aerial imaging array is used for obtaining an aerial real image corresponding to the picture displayed in the display screen after reflecting the real image for at least two times.
The aerial imaging module not only can carry out aerial imaging on the basis of the picture displayed in the display screen of the terminal, but also has an aerial amplification imaging function, and the problems of user eyesight reduction, body damage and the like caused by the small display screen of the existing terminal equipment are solved.
Illustratively, as shown in fig. 5, the module comprises a light source; a converging lens array, which emits light to form an enlarged virtual image (first magnification) and includes an achromatic lens array for adjusting light of different wavelengths; a flat lens array, which emits light to form an enlarged real image 1 (second magnification); and an aerial imaging array, in which light is reflected at least twice in an optical imaging element to form an aerial real image 2 corresponding to the incident image.
For the converging lens array: the array combines a plurality of lenses to realize ultra-short-focus magnification and to correct chromatic and spherical aberration.
Here, the distance x between the centers of the flat lens and the converging lens is generally selected to satisfy a requirement relating x to d, the diameter of the converging lens, L, the side length of the shorter side of the flat lens, and F, the focal length of the converging lens (the formula itself is rendered only as an image, BDA0003087763740000091, in the source text and is not reproduced here). Keeping the distance between the flat lens and the converging lens within this requirement keeps the image formed in the air clear and complete.
In addition, because of the usage scenario, the focal length needs to be shortened to realize ultra-short-focus magnification while correcting chromatic and spherical aberration; ultra-short-throw projector technology can be used for this purpose. The lens array comprises a spherical ultra-short-focus magnifying lens group.
For a flat lens array: the array emits light as an enlarged real image (second magnification).
For aerial imaging arrays: an optical imaging element which bends the light of the incident image can be used, so that the light forms an aerial real image corresponding to the incident image after being reflected for at least 2 times in the optical imaging element. That is, light input from one surface side is reflected or refracted, and an image of the object is formed into a real image of an inverted convex-concave relationship in the midair of the other surface side.
Illustratively, the aerial imaging array may include any one of an afocal lens array, a stereoscopic mirror array that causes at least two reflections, and a combination of a retroreflective sheet and a half mirror.
For the afocal lens array: the incident angle and the emission angle have the same magnitude but opposite signs. The afocal lens array may be formed by pairing lens array elements of equal positive and negative focal lengths (sharing the same optical axis) about an intermediate plane. The paired lens array elements of equal focal length may be a bread-loaf lens array or a cylindrical lens array oriented in one direction.
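The angle relation stated above can be written compactly; with angles measured from the array normal (notation ours, not from the source):

```latex
\theta_{\mathrm{out}} = -\,\theta_{\mathrm{in}}, \qquad
|\theta_{\mathrm{out}}| = |\theta_{\mathrm{in}}|
```

so an incident ray and its emitted ray are mirror-symmetric about the array plane's normal, which is what allows the array to re-converge diverging rays into a real image in the air.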
For the stereoscopic mirror array that causes at least two reflections, several constructions are possible: the elements may be a plurality of solid square or rectangular optical elements through which light passes from one surface to the other, the light being emitted after two reflections on the mirrored inner walls of each subunit element; the elements may be dihedral corner reflectors, the light being emitted after two reflections on the mirrored outer walls of each subunit element; the transparent strips of two adjacent transparent laminate layers may be arranged orthogonally and in a staggered manner to change the light propagation direction; or the elements may be solid rectangular optical elements containing a plurality of orthogonally staggered hollow rectangular cells, likewise arranged in a staggered manner to change the light propagation direction. Each of the above yields a micromirror imaging structure with a larger number of smaller units for displaying an image.
The above stereoscopic mirror array serves as an aerial imaging element; for the light beams forming the image, the incident angle and the emission angle with respect to the normal of the plane have the same magnitude but opposite signs.
Here, a plurality of solid square or rectangular optical elements that allow light to pass from one surface to the other (or from the other surface back to the first) may be provided in the plane. The unit optical elements are arranged in a matrix along the U′-axis and V′-axis directions. In each unit optical element, mutually perpendicular inner wall surfaces are formed along the W-axis direction, and each inner wall surface is mirror-finished. The embodiment of the application defines a three-dimensional Cartesian coordinate system U′V′W, obtained by rotating the U and V axes by 45 degrees, with the U′V′ plane parallel to the UV plane.
Further, a plurality of dihedral corner mirrors may be provided in the plane. Alternatively, the transparent strips of two adjacent transparent laminate layers may be arranged orthogonally, i.e., in a staggered arrangement, to change the light propagation direction, forming a micromirror imaging structure with more, smaller units for displaying an image, thereby greatly improving resolution and reducing system energy consumption. Alternatively, a plurality of solid rectangular optical elements, each containing staggered hollow rectangular cells, may be provided in the plane.
For the combination of a retroreflective sheet and a half mirror: the incident angle of light entering the sheet surface equals the emission angle of light reflected by the reflective surface inside the sheet and emitted from the sheet surface. The retroreflective sheet may be a plurality of transparent solid spheres, in which light undergoes surface refraction, internal reflection, and surface refraction again; or a solid surface array enclosed by solid right-angle isosceles triangles and straight edges, in which light undergoes surface refraction, two internal reflections, and surface refraction.
The half mirror divides incident light into reflected light and transmitted light at a ratio of 1:1. Light from the source enters the retroreflective sheet after passing through the half mirror; after internal reflection, the emergent light meets the half mirror again, and the transmitted part forms a real image. The retroreflective sheet is designed so that, in principle, the incident angle of light entering the sheet surface equals the emission angle of light reflected by the reflective surface inside the sheet and emitted from the sheet surface.
For example, light from light source A is reflected by the half mirror, enters the retroreflective sheet, is reflected back toward the half mirror, and then passes through the half mirror to form a real image observed by the human eye.
The retroreflective sheet may be a plurality of transparent spheres, with the light undergoing surface refraction, internal reflection, and surface refraction again; or an optical system combining a triangular-pyramid or rectangular-pyramid base array, a half mirror, and a retroreflective sheet may be used.
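As a rough energy budget for the half-mirror configuration above (an illustrative calculation, not stated in the source): image light meets the half mirror twice, once on reflection toward the retroreflective sheet and once on transmission toward the viewer, so with an ideal 1:1 split (reflectance R = transmittance T = 1/2) and a lossless sheet, at most a quarter of the source intensity reaches the aerial real image:

```latex
I_{\mathrm{image}} = I_{\mathrm{source}} \cdot R \cdot T
                   = I_{\mathrm{source}} \cdot \tfrac{1}{2} \cdot \tfrac{1}{2}
                   = \tfrac{1}{4}\, I_{\mathrm{source}}
```

This is one reason a bright light source, or a grating film suppressing ambient stray light, matters for the visibility of the aerial image.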
In addition, as shown in fig. 6, the aerial imaging module can be pulled out and folded: after being pulled out of the inner layer of the terminal, it can be flipped over to lie above the display screen, thereby realizing aerial imaging of the picture on the terminal display screen.
S403: and generating a prompt for whether to carry out air interaction on the display screen.
In the process of using the terminal, different users have different requirements. For example, a user watching a video needs no interaction, while a user editing documents or playing games needs to interact frequently. To meet these different requirements, the processor generates a prompt on the terminal display screen asking whether to perform air interaction.
S404: and if the air interaction request input by the user according to the prompt is acquired, starting an air interaction module according to the air interaction request, wherein the air interaction module is used for identifying one or more of the face, the gesture, the voice and the body feeling of the user, and determining a corresponding interaction instruction according to an identification result.
S405: and interacting with the user according to the interaction instruction and the aerial imaging module.
Here, if the air interaction request input by the user according to the prompt is obtained, indicating that the user has an interaction requirement, the processor may start the air interaction module according to the request. The specific functions of the air interaction module can be determined according to actual conditions: for example, gesture sensing can be realized by touch sensing, infrared laser sensing, and the like, enabling motion tracks, gesture unlocking, and so on, while face recognition can be realized with a miniature camera. The air interaction module identifies the face, gestures, and the like of the user and determines a corresponding interaction instruction from the identification result, so that the processor can interact with the user according to that instruction, meeting the user's interaction requirement.
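The recognize-then-dispatch behavior described above can be sketched as a small lookup, a minimal illustration only: the module names, recognizer outputs, and instruction mapping below are all assumptions for the example and do not appear in the patent text.

```python
# Hypothetical sketch of the air-interaction dispatch: a recognized user
# input (gesture / voice / face / body motion) is mapped to an
# interaction instruction that the processor then executes via the
# aerial imaging module. The table entries are invented examples.

def recognize(event):
    """Map a (modality, recognition result) pair to an instruction."""
    instruction_table = {
        ("gesture", "swipe_left"): "previous_page",
        ("gesture", "swipe_right"): "next_page",
        ("voice", "pause"): "pause_playback",
        ("face", "registered_user"): "unlock",
    }
    # Unrecognized inputs map to "ignore" rather than raising.
    return instruction_table.get(event, "ignore")

def interact(events):
    """Return the instructions the processor would act on."""
    return [recognize(e) for e in events if recognize(e) != "ignore"]

print(interact([("gesture", "swipe_left"), ("voice", "pause"), ("gesture", "wave")]))
```

In a real module, the table lookup would be replaced by the trained recognizers (camera, infrared sensing, microphone) mentioned in the text; only the overall recognize-then-dispatch shape is illustrated here.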
In addition, a protective film can be provided on the display screen; the film may carry a grating to reduce interference from direct sunlight and stray light on the imaging effect, and the like.
The terminal is provided with a protective shell. When the shell is folded, one surface protects the terminal rear shell; the other surface, folded and embedded into the rear shell, can be rotated out as needed to form an angle with the terminal display screen and thereby support aerial imaging.
According to the method and the device, the terminal obtains the identity of the user and authenticates the user according to the identity. After authentication passes, the aerial imaging module is started to perform aerial imaging based on the picture displayed on the terminal display screen; a prompt asking whether to perform air interaction is then generated on the display screen, and if an air interaction request input by the user is obtained, the air interaction module is started according to the request to determine a corresponding interaction instruction for interacting with the user. By performing aerial imaging based on the picture displayed on the terminal display screen, the embodiment solves problems such as reduced eyesight and bodily strain caused by the small display screens of existing terminal devices. Moreover, interaction with the user is realized through the air interaction module and the aerial imaging module, so the user does not need to touch the terminal display screen directly, which reduces wear on the screen to a certain extent, and the user can operate on the picture while comfortably watching the aerial image, meeting application requirements.
Here, in the embodiment of the present application, before the air interaction module is started according to the air interaction request, the module needs to be trained, so that the trained module can be started to perform the corresponding processing. During training, the processor can perform face recognition training, gesture recognition training, voice recognition training, somatosensory recognition training, and the like on the air interaction module. Taking face recognition training as an example, the processor may input a reference face image into the module and determine the recognition accuracy from the face recognition result output by the module and the recognition result corresponding to the reference image. If the accuracy is lower than a preset accuracy threshold, the processor can adjust the module to improve accuracy, take the adjusted module as the new air interaction module, and re-execute the step of inputting the reference face image, repeating until the recognition accuracy reaches the preset threshold, at which point training stops.
The reference face may be acquired by the processor in advance through a camera on the terminal, and after acquiring the reference face image, the recognition result corresponding to the reference face image is recorded so as to be used for face recognition training of the air interaction module in the following.
Other training, such as gesture recognition training, speech recognition training, somatosensory recognition training, etc., refer to the above-mentioned face recognition training, and the embodiments of the present application are not described herein again.
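The adjust-and-re-evaluate loop described for face recognition training can be sketched generically. This is a toy illustration under stated assumptions: `evaluate` and `adjust` stand in for the real recognition test and model update, and the numeric "model" below is a placeholder, none of which appears in the patent text.

```python
# Illustrative training loop: evaluate accuracy against reference
# samples, adjust the module if below the preset threshold, and treat
# the adjusted module as the new module until the threshold is reached.

def train_recognizer(evaluate, adjust, model, threshold=0.95, max_rounds=100):
    """Return (trained_model, final_accuracy)."""
    for _ in range(max_rounds):
        accuracy = evaluate(model)      # accuracy on reference images
        if accuracy >= threshold:       # preset accuracy threshold met
            return model, accuracy
        model = adjust(model, accuracy) # adjusted module becomes the new one
    return model, evaluate(model)

# Toy example: the "model" is a counter whose accuracy improves per round.
model, acc = train_recognizer(
    evaluate=lambda m: min(1.0, 0.5 + 0.05 * m),
    adjust=lambda m, a: m + 1,
    model=0,
)
print(acc >= 0.95)  # → True
```

Gesture, voice, and somatosensory training would reuse the same loop with their own `evaluate`/`adjust` implementations, as the text indicates.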
In addition, in the embodiment of the application, after the aerial imaging module is started, the heights of the aerial imaging module and the display screen can be adjusted, so that the aerial imaging height can be adjusted, and various application requirements can be met. Fig. 7 is a flowchart illustrating another terminal information display method according to an embodiment of the present application. As shown in fig. 7, the method includes:
S701: and acquiring the identity of the user, and authenticating the user according to the identity.
S702: and if the user authentication is passed, starting an aerial imaging module, wherein the aerial imaging module is used for carrying out aerial imaging on the basis of the picture displayed in the display screen of the terminal.
The steps S701 to S702 refer to the related descriptions of the steps S401 to S402, and are not described herein again.
S703: and judging whether the imaging height input by the user is received.
S704: and if the imaging height input by the user is received, acquiring the current height of aerial imaging of the aerial imaging module, and controlling the terminal support to adjust the heights of the aerial imaging module and the display screen according to the imaging height input by the user and the current height of the aerial imaging so as to adjust the height of the aerial imaging.
Here, when the user uses the terminal, if the height of the aerial imaging picture formed by the aerial imaging module does not meet the required imaging height, the user may input the corresponding imaging height at the terminal.
The processor can judge whether the imaging height input by the user has been received after the aerial imaging module is started for aerial imaging. If the imaging height input by the user is received, the processor can acquire the current height at which the aerial imaging module performs aerial imaging and compare it with the input imaging height. If the two are inconsistent, the processor can control the terminal bracket to adjust the heights of the aerial imaging module and the display screen according to the input imaging height and the current height, so that the adjusted aerial imaging height equals the imaging height input by the user.
For example, as shown in fig. 8, the processor may use the terminal bracket to perform folding and bidirectional lifting according to the user's line of sight (i.e., the aerial imaging module and the terminal display screen are raised or lowered simultaneously; both must be lifted together to complete imaging because the angle between the aerial imaging module and the terminal display screen lies in the interval (0°, 90°)).
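The compare-and-adjust step of S703–S704 can be sketched as follows, a minimal sketch only: the bracket interface and millimetre units are assumptions for the example, not part of the patent text.

```python
# Hypothetical sketch of the height adjustment: compare the
# user-requested imaging height with the current aerial-imaging height
# and move the bracket (module and screen together) by the difference.

def adjust_aerial_height(requested_mm, current_mm, move_bracket):
    """Return the resulting aerial-imaging height after adjustment."""
    delta = requested_mm - current_mm
    if delta != 0:
        # The aerial imaging module and the display screen are lifted
        # or lowered simultaneously by the terminal bracket.
        move_bracket(delta)
    return current_mm + delta

moves = []
final = adjust_aerial_height(300, 250, moves.append)
print(final, moves)  # → 300 [50]
```

When the requested height already matches the current one, no bracket movement is issued, matching the "if inconsistent, adjust" logic in the text.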
S705: and generating a prompt for whether to carry out air interaction on the display screen.
S706: and if the air interaction request input by the user according to the prompt is acquired, starting an air interaction module according to the air interaction request, wherein the air interaction module is used for identifying one or more of the face, the gesture, the voice and the body feeling of the user, and determining a corresponding interaction instruction according to an identification result.
S707: and interacting with the user according to the interaction instruction and the aerial imaging module.
The steps S705 to S707 refer to the related description of the steps S403 to S405, and are not described herein again.
According to the embodiment of the application, after the aerial imaging module is started, the heights of the aerial imaging module and the display screen can be adjusted, so that the height of aerial imaging can be adjusted, and various requirements of a user on the aerial imaging height can be met. In addition, the embodiment of the application carries out aerial imaging based on the picture displayed on the display screen of the terminal, and solves the problems of user eyesight reduction, body injury and the like caused by the small display screen of the existing terminal equipment. Moreover, the embodiment of the application realizes interaction with the user through the air interaction module and the air imaging module, the user does not need to directly contact the terminal display screen, the damage of the user to the terminal display screen is reduced to a certain extent, and meanwhile, the user can perform corresponding operation on the image when comfortably watching the air imaging image, so that the application requirement is met.
Here, as shown in fig. 9, the processor may first authenticate the user based on the acquired user identity. If authentication passes, the processor can start the aerial imaging module to perform aerial imaging based on the picture displayed on the terminal display screen; because the aerial imaging module also magnifies the image in the air, this addresses problems such as reduced eyesight and bodily strain caused by the small display screens of existing terminal devices. The processor can then control the terminal bracket to adjust the heights of the aerial imaging module and the display screen according to the imaging height input by the user, so as to adjust the aerial imaging height and meet the user's different requirements. Moreover, the processor can start the air interaction module according to an air interaction request input by the user to recognize the user's face, gestures, voice, body motion, and the like, determine a corresponding interaction instruction from the recognition result, and interact with the user accordingly, so that the user can operate on the picture while comfortably watching the aerial image, meeting application requirements.
Fig. 10 is a schematic structural diagram of a terminal information display device according to an embodiment of the present application, corresponding to the terminal information display method of the foregoing embodiment. For convenience of explanation, only portions related to the embodiments of the present application are shown. The terminal information display device 100 includes: an authentication module 1001, a first starting module 1002, a prompting module 1003, a second starting module 1004, and an interaction module 1005. The terminal information display device may be the terminal itself, or a chip or integrated circuit that realizes the functions of the terminal. It should be noted that the division into the authentication module, first starting module, prompting module, second starting module, and interaction module is only a division of logical functions; physically, these modules may be integrated or independent.
The authentication module 1001 is configured to obtain an identity of a user, and authenticate the user according to the identity.
A first starting module 1002, configured to start an aerial imaging module if the user authentication passes, where the aerial imaging module is configured to perform aerial imaging based on a picture displayed in a display screen of the terminal.
And a prompt module 1003, configured to generate a prompt on the display screen whether to perform an air interaction.
A second starting module 1004, configured to, if an air interaction request input by the user according to the prompt is obtained, start the air interaction module according to the air interaction request, where the air interaction module is configured to identify one or more of the face, gestures, voice, and body motion of the user and determine a corresponding interaction instruction according to the identification result.
And the interaction module 1005 is configured to interact with the user according to the interaction instruction and the aerial imaging module.
In one possible implementation, the aerial imaging module includes a converging lens array, a slab lens array, and an aerial imaging array;
the convergent lens array is used for magnifying a picture displayed in the display screen into a virtual image, the flat lens array is used for magnifying the virtual image into a real image, and the aerial imaging array is used for reflecting the real image for at least two times to obtain an aerial real image corresponding to the picture displayed in the display screen.
In one possible implementation, the aerial imaging array includes any one of an afocal lens array, a stereoscopic mirror array that causes at least two reflections, and a combination of a retroreflective sheet and a half mirror.
In a possible implementation manner, the second starting module 1004 is specifically configured to:
performing one or more of face recognition training, gesture recognition training, voice recognition training and somatosensory recognition training on the air interaction module;
and starting the trained air interaction module according to the air interaction request.
In a possible implementation manner, the authentication module 1001 is specifically configured to:
judging whether the identity mark is one of prestored legal user identity marks or not;
and if the identity is one of the pre-stored legal user identities, judging that the user authentication is passed.
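The membership check performed by the authentication module 1001 can be sketched in a few lines. The storage format of the pre-stored legal identities (an in-memory set with example IDs) is an assumption for illustration only.

```python
# Minimal sketch of the authentication check: the user passes only if
# the acquired identity is one of the pre-stored legal user identities.

LEGAL_USER_IDS = {"user-001", "user-002"}  # pre-stored legal identities (example values)

def authenticate(identity):
    """Return True if the identity matches a pre-stored legal identity."""
    return identity in LEGAL_USER_IDS

print(authenticate("user-001"), authenticate("intruder"))  # → True False
```

On a real terminal the identity could come from a password, fingerprint, or face capture, and the legal-identity store would live in secure storage; only the comparison logic of the text is shown here.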
In a possible implementation manner, a protective film is arranged on the display screen, and a protective shell is arranged on the terminal.
The apparatus provided in the embodiment of the present application may be used to implement the technical solution of the method embodiment in fig. 4, which has similar implementation principles and technical effects, and is not described herein again in the embodiment of the present application.
Fig. 11 is a schematic structural diagram of another terminal information display device according to an embodiment of the present application. In addition to fig. 10, the terminal information display device 100 further includes: an adjustment module 1006.
The adjusting module 1006 is configured to determine whether the imaging height input by the user is received after the aerial imaging module is started by the first starting module 1002;
and if the imaging height input by the user is received, acquiring the current height of aerial imaging of the aerial imaging module, and controlling a terminal support to adjust the heights of the aerial imaging module and the display screen according to the imaging height input by the user and the current height of aerial imaging so as to adjust the height of aerial imaging.
The apparatus provided in the embodiment of the present application may be used to implement the technical solution of the method embodiment in fig. 7, which has similar implementation principles and technical effects, and is not described herein again in the embodiment of the present application.
Alternatively, fig. 12A and 12B schematically provide one possible basic hardware architecture of the terminal described in the present application, respectively.
Referring to fig. 12A and 12B, the terminal includes at least one air imaging module, an air interaction module, a processor 1201, and a communication interface 1203. Further optionally, a memory 1202 and a bus 1204 may also be included.
Among them, in the terminal, the number of the processors 1201 may be one or more, and fig. 12A and 12B only illustrate one of the processors 1201. Alternatively, the processor 1201 may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a Digital Signal Processor (DSP). If the terminal has a plurality of processors 1201, the types of the plurality of processors 1201 may be different, or may be the same. Optionally, the plurality of processors 1201 of the terminal may also be integrated as a multi-core processor.
Memory 1202 stores computer instructions and data; the memory 1202 may store computer instructions and data required to implement the above-described terminal information display method provided herein, for example, the memory 1202 stores instructions for implementing the steps of the above-described terminal information display method. Memory 1202 may be any one or any combination of the following storage media: nonvolatile memory (e.g., Read Only Memory (ROM), Solid State Disk (SSD), hard disk (HDD), optical disk), volatile memory.
The communication interface 1203 may provide information input/output for the at least one processor. Any one or any combination of the following devices may also be included: a network interface (e.g., an ethernet interface), a wireless network card, etc. having a network access function.
Optionally, the communication interface 1203 may also be used for data communication between the terminal and other computing devices or terminals.
Further alternatively, fig. 12A and 12B show the bus 1204 by a thick line. The bus 1204 may connect the processor 1201 with the memory 1202 and the communication interface 1203. Thus, the processor 1201 has access to the memory 1202 via the bus 1204, and may also interact with other computing devices or terminals using the communication interface 1203.
In the present application, the terminal executes the computer instructions in the memory 1202, so that the terminal implements the terminal information display method provided in the present application, or the terminal deploys the terminal information display apparatus.
From the perspective of logical function division, illustratively, as shown in fig. 12A, the memory 1202 may include an authentication module 1001, a first starting module 1002, a prompting module 1003, a second starting module 1004, and an interaction module 1005. "Include" here merely means that the instructions stored in the memory, when executed, can implement the functions of these modules respectively; it does not imply a physical structure.
In one possible design, as shown in fig. 12B, the memory 1202 further includes an adjustment module 1006; "include" here likewise means only that the stored instructions, when executed, can implement the functions of the adjustment module, and does not imply a physical structure.
In addition, the above-described terminal may be implemented by software as in fig. 12A and 12B, or may be implemented by hardware as a hardware module or as a circuit unit.
The application provides a computer-readable storage medium storing computer instructions that instruct a computing device to execute the terminal information display method provided by the application.
An embodiment of the present application provides a computer program product, which includes a computer instruction, where the computer instruction is executed by a processor to perform the above terminal information display method provided in the present application.
The present application provides a chip comprising at least one processor and a communication interface providing information input and/or output for the at least one processor. Further, the chip may also include at least one memory for storing computer instructions. The at least one processor is used for calling and running the computer instructions to execute the terminal information display method provided by the application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

Claims (11)

1. A terminal information display method is applied to a terminal, and the method comprises the following steps:
acquiring an identity of a user, and authenticating the user according to the identity;
if the user authentication is passed, starting an aerial imaging module, wherein the aerial imaging module is used for carrying out aerial imaging based on a picture displayed in a display screen of the terminal;
generating a prompt whether to carry out air interaction on the display screen;
if an air interaction request input by the user according to the prompt is obtained, an air interaction module is started according to the air interaction request, and the air interaction module is used for identifying one or more of the face, the gesture, the voice and the body feeling of the user and determining a corresponding interaction instruction according to an identification result;
and interacting with the user according to the interaction instruction and the aerial imaging module.
2. The method of claim 1, further comprising, after said activating the aerial imaging module:
judging whether the imaging height input by the user is received or not;
and if the imaging height input by the user is received, acquiring the current height of aerial imaging of the aerial imaging module, and controlling a terminal support to adjust the heights of the aerial imaging module and the display screen according to the imaging height input by the user and the current height of aerial imaging so as to adjust the height of aerial imaging.
3. The method of claim 1 or 2, wherein the aerial imaging module comprises a converging lens array, a flat-panel lens array, and an aerial imaging array;
the convergent lens array is used for magnifying a picture displayed in the display screen into a virtual image, the flat lens array is used for magnifying the virtual image into a real image, and the aerial imaging array is used for reflecting the real image for at least two times to obtain an aerial real image corresponding to the picture displayed in the display screen.
4. The method of claim 3, wherein the aerial imaging array comprises any one of an afocal lens array, a stereoscopic mirror array that causes at least two reflections, and a combination of a retroreflective sheet and a half mirror.
5. The method according to claim 1 or 2, wherein before said initiating an over-the-air interaction module according to said over-the-air interaction request, further comprising:
performing one or more of face recognition training, gesture recognition training, voice recognition training and somatosensory recognition training on the air interaction module;
the starting an air interaction module according to the air interaction request comprises:
and starting the trained air interaction module according to the air interaction request.
6. The method according to claim 1 or 2, wherein the authenticating the user according to the identity comprises:
judging whether the identity mark is one of prestored legal user identity marks or not;
and if the identity is one of the pre-stored legal user identities, judging that the user authentication is passed.
7. The method according to claim 1 or 2, wherein a protective film is provided on the display screen and a protective case is provided on the terminal.
8. A terminal information display device, characterized in that the device is applied to a terminal, the device comprises:
the authentication module is used for acquiring an identity of a user and authenticating the user according to the identity;
the first starting module is used for starting an aerial imaging module if the user authentication passes, and the aerial imaging module is used for carrying out aerial imaging based on a picture displayed in a display screen of the terminal;
the prompting module is used for generating a prompt for whether to carry out air interaction on the display screen;
the second starting module is used for starting the air interaction module according to the air interaction request if the air interaction request input by the user according to the prompt is obtained; the air interaction module is used for recognizing one or more of the face, the gestures, the voice and the somatosensory input of the user and determining a corresponding interaction instruction according to the recognition result;
and the interaction module is used for interacting with the user according to the interaction instruction and the aerial imaging module.
9. A terminal, comprising:
an aerial imaging module; an air interaction module; a processor;
a memory; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of any of claims 1-7.
10. A computer-readable storage medium, characterized in that it stores a computer program that causes a server to execute the method of any one of claims 1-7.
11. A computer program product comprising computer instructions which, when executed by a processor, perform the method of any one of claims 1 to 7.
CN202110584897.7A 2021-05-27 2021-05-27 Terminal information display method and device, terminal and storage medium Active CN113282174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110584897.7A CN113282174B (en) 2021-05-27 2021-05-27 Terminal information display method and device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN113282174A true CN113282174A (en) 2021-08-20
CN113282174B CN113282174B (en) 2023-10-17

Family

ID=77282005

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110584897.7A Active CN113282174B (en) 2021-05-27 2021-05-27 Terminal information display method and device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113282174B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838395A (en) * 2021-09-18 2021-12-24 湖南美景创意文化建设有限公司 Ultra-high contrast medium-free aerial imaging display screen for museum
WO2023236716A1 (en) * 2022-06-10 2023-12-14 上海丹诺西诚智能科技有限公司 Position adjusting method and system for aerial image forming projection pattern

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005208308A (en) * 2004-01-22 2005-08-04 Yasuaki Tanaka Stereoscopic projector
JP2011081296A (en) * 2009-10-09 2011-04-21 Pioneer Electronic Corp Display device
CN102150072A (en) * 2008-07-10 2011-08-10 实景成像有限公司 Broad viewing angle displays and user interfaces
CN104977794A (en) * 2015-06-19 2015-10-14 浙江吉利控股集团有限公司 Portable air projection device
JP2016006447A (en) * 2014-06-20 2016-01-14 船井電機株式会社 Image display device
US20160062452A1 (en) * 2014-09-01 2016-03-03 Samsung Electronics Co., Ltd. Method for providing screen magnification and electronic device thereof
CN105389044A (en) * 2014-09-02 2016-03-09 三星电子株式会社 Display device
CN105938425A (en) * 2015-03-03 2016-09-14 三星电子株式会社 Method of displaying image and electronic device
CN107113949A (en) * 2014-12-26 2017-08-29 日立麦克赛尔株式会社 Lighting device
US20180137794A1 (en) * 2015-03-26 2018-05-17 Kyocera Document Solutions Inc. Visible image forming apparatus and image forming apparatus
WO2019039600A1 (en) * 2017-08-25 2019-02-28 林テレンプ株式会社 Aerial image display device
TWM585357U (en) * 2019-04-29 2019-10-21 大陸商北京眸合科技有限公司 Optical system for realizing aerial suspension type display
CN111133357A (en) * 2017-09-20 2020-05-08 三星电子株式会社 Optical lens assembly and electronic device with cross reference to related applications
US20200241470A1 (en) * 2019-01-25 2020-07-30 International Business Machines Corporation Augmented image viewing with three dimensional objects
CN111868657A (en) * 2020-05-15 2020-10-30 曹庆恒 Suspension display device and using method thereof

Similar Documents

Publication Publication Date Title
US11710351B2 (en) Action recognition method and apparatus, and human-machine interaction method and apparatus
US11461982B2 (en) 3D object rendering using detected features
EP3106911B1 (en) Head mounted display apparatus
US10585288B2 (en) Computer display device mounted on eyeglasses
US20180120935A1 (en) Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
US9888126B2 (en) Multipurpose conferencing terminal and multipurpose conference system
JP6095763B2 (en) Gesture registration device, gesture registration program, and gesture registration method
US20160062474A1 (en) Unlocking a Head Mountable Device
KR20190133204A (en) Accumulation and Reliability Allocation of Iris Codes
JP6052399B2 (en) Image processing program, image processing method, and information terminal
KR20170100641A (en) Virtual representations of real-world objects
CN113282174B (en) Terminal information display method and device, terminal and storage medium
US9336779B1 (en) Dynamic image-based voice entry of unlock sequence
CN104216118A (en) Head Mounted Display With Remote Control
US20220309836A1 (en) Ai-based face recognition method and apparatus, device, and medium
US20150381973A1 (en) Calibration device, calibration program, and calibration method
CN110968190B (en) IMU for touch detection
WO2020036821A1 (en) Identification method and apparatus and computer-readable storage medium
CN109643208A (en) Display device, program, display methods and control device
KR20210017081A (en) Apparatus and method for displaying graphic elements according to object
US20220329726A1 (en) Wearable electronic device including cameras
KR102043156B1 (en) Mobile terminal and method for controlling the same
KR102067599B1 (en) Mobile terminal and method for controlling the same
US20230262323A1 (en) Method and device for obtaining image of object
US20230367117A1 (en) Eye tracking using camera lens-aligned retinal illumination

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant