CN113282174B - Terminal information display method and device, terminal and storage medium - Google Patents


Info

Publication number
CN113282174B
CN113282174B
Authority
CN
China
Prior art keywords
user
module
imaging
air
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110584897.7A
Other languages
Chinese (zh)
Other versions
CN113282174A (en)
Inventor
Li Bei (李贝)
Current Assignee
China United Network Communications Group Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
Priority date
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd filed Critical China United Network Communications Group Co Ltd
Priority to CN202110584897.7A
Publication of CN113282174A
Application granted
Publication of CN113282174B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

The application provides a terminal information display method, a device, a terminal, and a storage medium. After user authentication passes, the method starts an aerial imaging module that performs aerial imaging based on the picture displayed on the terminal display screen; the aerial imaging module also supports magnified aerial imaging with an adjustable imaging height. If an air interaction request input by the user is then acquired, an air interaction module is started to interact with the user according to the request. Because the embodiment of the application images the displayed picture in the air, magnifies it, and can adjust the imaging height, it alleviates the problems of reduced eyesight and physical injury caused by the small display screens of conventional terminal devices. In addition, the user does not need to touch the terminal display screen directly, which reduces wear on the screen while letting the user operate on the picture comfortably while viewing the aerial image, meeting practical application requirements.

Description

Terminal information display method and device, terminal and storage medium
Technical Field
The present application relates to the field of aerial imaging technologies, and in particular, to a terminal information display method, a device, a terminal, and a storage medium.
Background
With continuing economic and technological progress, more and more users use terminal devices, such as mobile phones, tablet computers (pads), point-and-read machines, and touch-screen computers, for work or study. Terminal devices have therefore become an indispensable part of people's lives.
Currently, when using a terminal device, a user typically performs operations such as watching videos, editing documents, or playing games directly on the display screen of the terminal device.
However, because the display screen of the terminal device is small, a user's eyesight may decline after long-term use, and the user must look down at the small screen for long periods, which easily causes physical injuries such as cervical pain. In addition, because the user touches the display screen frequently and with some force, the screen suffers wear, which shortens the service life of the device.
Disclosure of Invention
In order to solve the problems in the prior art, the application provides a terminal information display method, a device, a terminal and a storage medium.
In a first aspect, an embodiment of the present application provides a terminal information display method, where the method is applied to a terminal, and includes the following steps:
Acquiring an identity of a user, and authenticating the user according to the identity;
if the user authentication is passed, an aerial imaging module is started, and the aerial imaging module is used for aerial imaging based on a picture displayed in a display screen of the terminal;
generating, on the display screen, a prompt asking whether to perform air interaction;
if the air interaction request input by the user according to the prompt is obtained, an air interaction module is started according to the air interaction request, and the air interaction module is used for identifying one or more of the face, the gesture, the voice and the somatosensory of the user and determining a corresponding interaction instruction according to an identification result;
and according to the interaction instruction and the aerial imaging module, interacting with the user.
In one possible implementation, after the activating the aerial imaging module, the method further includes:
judging whether the imaging height input by the user is received or not;
and if the imaging height input by the user is received, acquiring the current air imaging height of the air imaging module, and controlling a terminal bracket to adjust the heights of the air imaging module and the display screen according to the imaging height input by the user and the current air imaging height so as to adjust the air imaging height.
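The height-adjustment step described above can be sketched as follows. This is an illustrative sketch only: the function name, units, and bracket travel limits are assumptions, not anything specified by the application.

```python
def adjust_imaging_height(requested_height_cm, current_height_cm,
                          min_height_cm=0.0, max_height_cm=50.0):
    """Compute the bracket offset that moves the aerial image from its
    current height to the height requested by the user.

    A positive offset raises the aerial imaging module and display
    screen; a negative offset lowers them. The limits model the
    bracket's assumed mechanical travel range.
    """
    # Clamp the request to the bracket's (assumed) mechanical range.
    target = max(min_height_cm, min(requested_height_cm, max_height_cm))
    # The bracket moves by the difference between target and current height.
    return target - current_height_cm

offset = adjust_imaging_height(requested_height_cm=30.0, current_height_cm=20.0)
```

The terminal bracket would then be driven by `offset`, raising or lowering both the aerial imaging module and the display screen together.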
In one possible implementation, the aerial imaging module includes a converging lens array, a flat lens array, and an aerial imaging array;
the converging lens array is used for amplifying a picture displayed in the display screen into a virtual image, the flat lens array is used for amplifying the virtual image into a real image, and the aerial imaging array is used for obtaining an aerial real image corresponding to the picture displayed in the display screen after the real image is reflected at least twice.
In one possible implementation, the aerial imaging array comprises any one of an afocal lens array, a stereoscopic array that causes at least two reflections, and a combination of a retroreflective sheet and a half mirror.
In one possible implementation manner, before the air interaction module is started according to the air interaction request, the method further includes:
performing one or more of face recognition training, gesture recognition training, voice recognition training and somatosensory recognition training on the air interaction module;
and starting an air interaction module according to the air interaction request, wherein the method comprises the following steps:
and starting the trained air interaction module according to the air interaction request.
In one possible implementation manner, the authenticating the user according to the identity includes:
Judging whether the identity is one of the prestored legal user identities;
and if the identity is one of the prestored legal user identities, determining that the user authentication passes.
In one possible implementation manner, a protective film is disposed on the display screen, and a protective shell is disposed on the terminal.
In a second aspect, an embodiment of the present application provides a terminal information display apparatus, which is applied to a terminal, including:
the authentication module is used for acquiring the identity of the user and authenticating the user according to the identity;
the first starting module is used for starting an aerial imaging module if the user authentication passes, and the aerial imaging module is used for aerial imaging based on a picture displayed in a display screen of the terminal;
the prompt module is used for generating a prompt on the display screen whether air interaction is carried out or not;
the second starting module is used for starting an air interaction module according to the air interaction request if the air interaction request input by the user according to the prompt is obtained, and the air interaction module is used for identifying one or more of the face, the gesture, the voice and the somatosensory of the user and determining a corresponding interaction instruction according to the identification result;
And the interaction module is used for interacting with the user according to the interaction instruction and the aerial imaging module.
In a possible implementation manner, the device further comprises an adjusting module, configured to determine whether the imaging height input by the user is received after the first starting module starts the aerial imaging module;
and if the imaging height input by the user is received, acquiring the current air imaging height of the air imaging module, and controlling a terminal bracket to adjust the heights of the air imaging module and the display screen according to the imaging height input by the user and the current air imaging height so as to adjust the air imaging height.
In one possible implementation, the aerial imaging module includes a converging lens array, a flat lens array, and an aerial imaging array;
the converging lens array is used for amplifying a picture displayed in the display screen into a virtual image, the flat lens array is used for amplifying the virtual image into a real image, and the aerial imaging array is used for obtaining an aerial real image corresponding to the picture displayed in the display screen after the real image is reflected at least twice.
In one possible implementation, the aerial imaging array comprises any one of an afocal lens array, a stereoscopic array that causes at least two reflections, and a combination of a retroreflective sheet and a half mirror.
In one possible implementation, the aerial imaging module includes a light transmissive laminate, a transparent strip, and a lenticular lens laminate that are bonded to each other;
the first starting module is specifically configured to:
and starting the light-transmitting laminated body, the mutually-attached transparent strips and the convex lens laminated body to realize the transformation and amplification of the propagation direction of the reflected light in the aerial imaging module, so as to form an aerial real image corresponding to the picture displayed in the display screen.
In one possible implementation manner, the second starting module is specifically configured to:
performing one or more of face recognition training, gesture recognition training, voice recognition training and somatosensory recognition training on the air interaction module;
and starting the trained air interaction module according to the air interaction request.
In one possible implementation manner, the authentication module is specifically configured to:
judging whether the identity is one of the prestored legal user identities;
and if the identity is one of the prestored legal user identities, determining that the user authentication passes.
In one possible implementation manner, a protective film is disposed on the display screen, and a protective shell is disposed on the terminal.
In a third aspect, an embodiment of the present application provides a terminal, including:
an aerial imaging module; an air interaction module; a processor;
a memory; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, the computer program causing a server to execute the method of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising computer instructions for performing the method of the first aspect by a processor.
In the method, the terminal acquires the identity of the user and authenticates the user according to the identity; after the authentication passes, the terminal starts the aerial imaging module to image, in the air, the picture displayed on its display screen, and generates a prompt on the display screen asking whether to perform air interaction; if an air interaction request input by the user is acquired, the terminal starts the air interaction module according to the request, and the air interaction module determines corresponding interaction instructions for interacting with the user. That is, the embodiment of the application performs aerial imaging based on the picture displayed on the terminal display screen, alleviating the problems of reduced eyesight and physical injury caused by the small display screens of conventional terminal devices. In addition, because interaction is realized through the air interaction module and the aerial imaging module, the user does not need to touch the terminal display screen directly, which reduces wear on the screen to a certain extent while letting the user operate on the picture comfortably while viewing the aerial image, meeting practical application requirements.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the application, and a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an existing user terminal according to an embodiment of the present application;
fig. 2 is a schematic diagram of a terminal information display system architecture according to an embodiment of the present application;
FIG. 3 is a schematic illustration of aerial imaging according to an embodiment of the present application;
fig. 4 is a flow chart of a method for displaying terminal information according to an embodiment of the present application;
fig. 5 is a schematic diagram of an aerial imaging module according to an embodiment of the present application;
FIG. 6 is a schematic illustration of another aerial imaging provided by an embodiment of the present application;
fig. 7 is a flowchart of another method for displaying terminal information according to an embodiment of the present application;
FIG. 8 is a schematic illustration of still another aerial imaging provided in an embodiment of the present application;
fig. 9 is a schematic diagram of a terminal information display according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of a terminal information display device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of another terminal information display device according to an embodiment of the present application;
fig. 12A is a schematic diagram of a basic hardware architecture of a terminal according to the present application;
fig. 12B is a schematic diagram of a basic hardware architecture of another terminal according to the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first," "second," "third," and "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
At present, with the popularization of terminals, their use is increasingly common. Owing to factors such as gaming, small terminal screens, and long usage times, the number of myopia cases grows daily and trends toward younger ages. Moreover, long-term use of a terminal in an incorrect posture, for example looking down at the screen, is driving a rise in cervical spondylosis; as shown in fig. 1, taking a mobile phone as the terminal, a user who looks down at the phone for a long time develops cervical discomfort. In addition, the terminal screen is the component with the highest failure rate: the force and frequency of the user's touches wear the screen and eventually damage it.
In order to solve the problems, the embodiment of the application provides a terminal information display method, which is used for carrying out aerial imaging based on pictures displayed on a terminal display screen, and solves the problems of reduced eyesight, body injury and the like of a user caused by smaller display screen of the traditional terminal equipment. In addition, the embodiment of the application realizes the interaction with the user through the air interaction module and the air imaging module, does not need the user to directly contact with the terminal display screen, reduces the damage of the user to the terminal display screen to a certain extent, and simultaneously ensures that the user can perform corresponding operation on the image when watching the air imaging image comfortably, thereby meeting the application requirement.
Optionally, the method for displaying terminal information provided by the present application may be applied to a schematic architecture of a terminal information display system shown in fig. 2, where the system may include a processor 21, an aerial imaging module 22 and an aerial interaction module 23 as shown in fig. 2. The terminal information display system may be the terminal itself, or a chip or an integrated circuit for realizing the functions of the terminal.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the architecture of the terminal information display system. In other possible embodiments of the present application, the architecture may include more or less components than those illustrated, or some components may be combined, some components may be split, or different component arrangements may be specifically determined according to the actual application scenario, and the present application is not limited herein. The components shown in fig. 2 may be implemented in hardware, software, or a combination of software and hardware.
In a specific implementation process, the processor 21 can start the aerial imaging module 22, and aerial imaging is performed by using the aerial imaging module 22 based on a picture displayed in a display screen of the terminal, so that the problems of reduced eyesight, body injury and the like of a user caused by smaller display screen of the existing terminal equipment are solved. The aerial imaging module 22 may include a converging lens array for magnifying a picture displayed in the display screen into a virtual image, a flat lens array for magnifying the virtual image into a real image, and an aerial imaging array for obtaining an aerial real image corresponding to the picture displayed in the display screen after reflecting the real image at least twice. For example, as shown in fig. 3, the processor 21 may activate the aerial imaging module 22 to form an aerial real image corresponding to a picture displayed in the display screen of the terminal using the above-described converging lens array, flat lens array, and aerial imaging array.
The processor 21 can also start the air interaction module 23, so that the user can interact through the air interaction module 23 and the aerial imaging module 22 without directly touching the terminal display screen, which reduces wear on the screen to a certain extent. Moreover, the air interaction module 23 can recognize one or more of the user's face, gestures, voice, and body motion, and determine corresponding interaction instructions from the recognition result, so that the processor 21 can interact with the user according to the interaction instructions and the aerial imaging module 22; the user can thus operate on the picture while comfortably watching the aerial image, meeting practical application needs.
The air interaction module 23 may adopt depth-sensing technology, with a built-in color camera, infrared emitter, microphone array, and the like, to capture, detect, and track hands, fingers, and finger-like tools in real time and to sense the user's position, motion, and sound, thereby recognizing faces, gestures, voice, or body motion. Illustratively, face recognition may be accomplished by the air interaction module 23 by creating a portrait file, reading a portrait, and comparing the two. Gesture recognition may be implemented by the air interaction module 23 using a right-handed Cartesian coordinate system. Voice recognition may be realized by the air interaction module 23, after continuous training, by converting sound into an electrical signal, preprocessing it, extracting feature values, and comparing them with a preset voice model library.
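The mapping from a recognition result to an interaction instruction can be sketched as a simple dispatch table. The gesture names and instruction strings below are illustrative assumptions; the patent does not define a concrete instruction set.

```python
# Hypothetical mapping from recognized gestures to interaction
# instructions for the aerial imaging module (names are invented
# for illustration, not taken from the application).
GESTURE_INSTRUCTIONS = {
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
    "pinch": "zoom_out",
    "spread": "zoom_in",
}

def to_interaction_instruction(modality, result):
    """Map a recognition result (gesture, voice, face, or body motion)
    to an interaction instruction, ignoring unrecognized inputs."""
    if modality == "gesture":
        return GESTURE_INSTRUCTIONS.get(result, "ignore")
    if modality == "voice":
        # Assume voice commands map one-to-one after recognition.
        return result
    return "ignore"
```

The processor would then execute the returned instruction against the aerial image, e.g. zooming the picture when a pinch gesture is recognized.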
In the embodiment of the present application, the terminal may be a handheld device, an in-vehicle device, a wearable device, a computing device, various types of User Equipment (UE), and so on.
In addition, the system architecture and the service scenario described in the embodiments of the present application are for more clearly describing the technical solution of the embodiments of the present application, and do not constitute a limitation on the technical solution provided by the embodiments of the present application, and as a person of ordinary skill in the art can know, with evolution of the network architecture and occurrence of a new service scenario, the technical solution provided by the embodiments of the present application is also applicable to similar technical problems.
The following description of the present application is given by taking several embodiments as examples, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 4 is a schematic flow chart of a method for displaying terminal information according to an embodiment of the present application, where an execution body of the embodiment may be the terminal in the embodiment shown in fig. 2, and further may be a processor in the terminal, as shown in fig. 4, and the method may include:
s401: and acquiring the identity of the user, and authenticating the user according to the identity.
Here, the user may be understood as a user who uses the terminal. When the user uses the terminal, the processor firstly acquires the identity of the user, wherein the identity can be a face image and a fingerprint of the user or an unlocking number, a pattern and the like input by the user.
For example, after acquiring the identity of the user, the processor may judge whether the identity is one of the prestored legal user identities. If it is, the processor determines that the user authentication passes; if it is not, the processor determines that the user authentication fails.
The prestored legal user identities may be stored in the terminal by the user in advance and may include the user's face image, fingerprint, or an unlocking number or pattern set by the user. If the identity is the same as one of the prestored legal user identities, that is, the identity is one of them, the identity is a legal user identity and the processor determines that the user authentication passes; otherwise, the processor determines that the user authentication fails.
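The membership check against the prestored legal identities can be sketched as follows; the identity strings are placeholders, since in practice the identities would be biometric templates or unlock patterns stored on the terminal.

```python
def authenticate(identity, prestored_legal_identities):
    """Return True if the acquired identity (e.g. a face-image hash,
    fingerprint template, or unlock pattern) matches one of the
    prestored legal user identities; otherwise authentication fails."""
    return identity in prestored_legal_identities

# Illustrative placeholder identities.
legal = {"face:alice", "fingerprint:alice", "pattern:7391"}
ok = authenticate("pattern:7391", legal)
```

Only when `authenticate` returns True does the processor proceed to start the aerial imaging module in step S402.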
S402: and if the user authentication is passed, starting an aerial imaging module, wherein the aerial imaging module is used for aerial imaging based on a picture displayed in a display screen of the terminal.
If the user authentication is passed, the processor may start an aerial imaging module to aerial image a screen displayed in a display screen of the terminal.
The aerial imaging module may include a converging lens array, a flat lens array and an aerial imaging array, where the converging lens array is used to amplify a picture displayed in the display screen into a virtual image, the flat lens array is used to amplify the virtual image into a real image, and the aerial imaging array is used to obtain an aerial real image corresponding to the picture displayed in the display screen after the real image is reflected at least twice.
Here, the aerial imaging module not only can carry out aerial imaging based on pictures displayed in the display screen of the terminal, but also has an aerial amplification imaging function, and solves the problems of user vision degradation, body injury and the like caused by smaller display screen of the traditional terminal equipment.
Illustratively, fig. 5 shows a light source; a converging lens array, which turns the outgoing light into an enlarged virtual image (the first enlargement) and includes an achromatic lens array for adjusting light of different wavelengths; a flat lens array, which turns the outgoing light into an enlarged real image 1 (the second enlargement); and an aerial imaging array, in which light rays are reflected at least twice in an optical imaging element to form an aerial real image 2 corresponding to the incident image.
For the converging lens array: the array combines a plurality of lenses to realize ultra-short-focal magnification while correcting chromatic aberration and spherical aberration.
Here, the distance x between the centers of the flat lens and the converging lens is generally chosen to satisfy the following requirement:
where D is the diameter of the converging lens, L is the side length of the shorter side of the flat lens, and F is the focal length of the converging lens. Keeping the distance between the flat lens and the converging lens within this requirement keeps the image formed in the air clear and complete.
In addition, owing to the usage scenario, the focal length needs to be shortened to achieve an ultra-short focal length while correcting chromatic aberration and spherical aberration; ultra-short-focus projector technology can be utilized for this. The lens array includes a spherical ultra-short-focal-length magnifying lens group.
For the flat lens array: the array turns the outgoing light into an amplified real image (the second amplification).
For the aerial imaging array: an optical imaging element that bends the light of an incident image may be utilized, such that the light is reflected at least twice in the optical imaging element to form an aerial real image corresponding to the incident image. That is, light input from one surface side is reflected or refracted, and an image of the object is formed as a real image, in inverse convex-concave relation, in mid-air on the other surface side.
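The two enlargement stages can be illustrated numerically with the Gaussian thin-lens model. This is a generic optics sketch under assumed values: the application gives no concrete focal lengths or object distances.

```python
def image_distance(f, d_o):
    """Gaussian thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the
    image distance d_i. A negative d_i means a virtual image formed on
    the object's side of the lens."""
    return 1.0 / (1.0 / f - 1.0 / d_o)

def magnification(f, d_o):
    """Lateral magnification m = -d_i / d_o (positive means upright)."""
    return -image_distance(f, d_o) / d_o

# Placing the display inside the focal length of a converging lens
# (d_o < f) yields an enlarged, upright virtual image, as in the
# first enlargement stage. Units and values are illustrative.
m1 = magnification(f=10.0, d_o=6.0)  # enlarged upright virtual image
```

A second stage applied to that intermediate image would multiply the magnifications, which is the sense in which the flat lens array "amplifies the virtual image into a real image" above.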
Illustratively, the above aerial imaging array may include any one of an afocal lens array, a stereoscopic lens array that causes at least two reflections, and a combination retroreflective sheeting and half mirror.
Afocal lens array: the incidence angle and the emission angle are equal in magnitude and opposite in sign. It may be a pair of afocal lens arrays whose lens-array elements have equal front and back focal lengths (ensuring the same optical axis) and a mid-plane. Here, the lens-array elements having equal front and back focal lengths may be a bread-loaf lens array or a cylindrical lens array in one direction.
For the stereoscopic array that causes at least two reflections, several forms are possible. It may be a plurality of solid square or rectangular optical elements through which light passes from one surface to the other, each sub-unit element reflecting the light twice off its mirrored inner walls before it exits through the far surface. It may be dihedral corner reflectors, each sub-unit element reflecting the light twice off its mirrored outer walls before emission. It may be orthogonally arranged light-transmitting laminated bodies, in which the transparent strips of adjacent layers are orthogonal and staggered to change the light propagation direction. Or it may be a stereoscopic rectangular optical element containing a plurality of orthogonal, staggered, hollowed-out rectangular sub-elements, arranged so as to change the light propagation direction. All of the above form a micromirror imaging structure with a larger number of smaller cells for displaying an image.
When the above stereoscopic array is used as the aerial imaging element, the incident and emitted beams of the image-forming light are related such that, with respect to the normal to the array plane, the incidence angle and the emission angle are equal in magnitude and opposite in sign.
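As a concrete illustration of this sign relationship, the sketch below (a toy model, not part of the patent) reflects a ray direction vector off two mutually perpendicular mirrored inner walls, as in a dihedral corner reflector; the in-plane components reverse while the component along the W axis is preserved.

```python
# Illustrative sketch (not from the patent): how two mirror reflections in a
# dihedral corner reflector relate the incident and emitted rays.
def reflect(direction, normal):
    """Reflect a direction vector off a mirror with the given unit normal."""
    d = sum(di * ni for di, ni in zip(direction, normal))
    return tuple(di - 2 * d * ni for di, ni in zip(direction, normal))

incident = (0.5, -0.3, 0.8)                  # arbitrary ray heading through the element (W = z)
after_first = reflect(incident, (1, 0, 0))   # inner wall perpendicular to U'
emitted = reflect(after_first, (0, 1, 0))    # inner wall perpendicular to V'

# In-plane components are reversed, the W component is preserved, so the
# emission angle equals the incidence angle with opposite sign.
print(emitted)  # (-0.5, 0.3, 0.8)
```

Because the W component is untouched, the element passes light from one surface to the other while folding its in-plane direction, which is what lets the real image form symmetrically on the far side of the array.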
Here, a plurality of stereoscopic square or rectangular optical elements that allow light to pass from one surface to the other (or from the other surface back to the first) may be provided in the above plane. The unit optical elements are arranged in a matrix along the U' axis and V' axis directions. In each unit optical element, mutually perpendicular inner wall surfaces are formed in the W axis direction, and each inner wall surface is mirror-finished. The embodiment of the application defines a three-dimensional Cartesian coordinate system U'V'W, in which the U and V axes are rotated by 45 degrees and the U'V' plane is parallel to the UV plane.
In addition, the plane may be provided with a plurality of dihedral corner reflector optical elements. Alternatively, the plane may consist of orthogonal light-transmitting laminated bodies, in which the transparent strips of adjacent layers are arranged orthogonally and staggered to change the light propagation direction (staggered emission), forming a micromirror imaging structure with more and smaller units for displaying images; this greatly improves resolution and reduces system energy consumption. Or a plurality of stereoscopic rectangular optical elements, internally hollowed out and arranged in a staggered manner, may be provided in the plane.
Combining the retroreflective sheet with the half mirror: the angle at which a light ray enters the surface of the retroreflective sheet equals the angle at which the light, reflected by the reflective surfaces inside the sheet, is emitted from the surface of the sheet. The retroreflective sheet may be a plurality of transparent stereoscopic spheres, in which light undergoes surface refraction, internal reflection, and surface refraction again; or it may be a stereoscopic surface array of isosceles triangles enclosed by straight lines, in which light undergoes surface refraction, two internal reflections, and surface refraction.
The half mirror splits incident light in a 1:1 ratio into reflected light and transmitted light. Light from the source enters the retroreflective sheet after passing the half mirror; after internal reflection, the emergent light passes through the half mirror again, and part of it is transmitted to form a real image. The retroreflective sheet is designed such that, in principle, the angle at which light enters its surface equals the angle at which the light reflected by the reflective surfaces within the sheet is emitted from its surface.
For example, light from source A is reflected by the half mirror into the retroreflective sheet; after retroreflection it returns to the half mirror and is partly transmitted to form a real image, which is observed by the human eye.
The retroreflective sheet may be a plurality of transparent spheres, in which light undergoes surface refraction, internal reflection, and surface refraction. An optical system combining a half mirror with a retroreflective sheet may be used as the bottom-surface array of a triangular pyramid or a rectangular pyramid.
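The 1:1 split described above implies a simple energy budget for the half-mirror-plus-retroreflector path. The sketch below is a back-of-the-envelope model; the lossless retroreflector and ideal splitter are assumptions, not patent specifications.

```python
# Illustrative energy-budget sketch (assumptions, not patent values): an ideal
# half mirror splits light 1:1, so only the reflect-then-transmit path
# contributes to the aerial real image.
HALF_MIRROR_SPLIT = 0.5   # ideal 1:1 beam splitter
RETRO_EFFICIENCY = 1.0    # assume a lossless retroreflective sheet

def image_intensity(source_intensity: float) -> float:
    # Path: source -> reflected by half mirror -> retroreflected -> transmitted.
    reflected = source_intensity * HALF_MIRROR_SPLIT
    returned = reflected * RETRO_EFFICIENCY
    return returned * HALF_MIRROR_SPLIT

print(image_intensity(100.0))  # 25.0
```

Even in the ideal case at most a quarter of the source light forms the image, which is one reason stray-light suppression (such as the grating protective film mentioned later) matters for this kind of display.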
In addition, as shown in fig. 6, the aerial imaging module can be drawn out and folded: the inner layer of the terminal can be drawn out and flipped over the display screen, thereby achieving aerial imaging of the images on the terminal display screen.
S403: and generating a prompt on the display screen asking whether to perform air interaction.
Different users have different requirements when using the terminal. For example, a user watching a video on the terminal needs no interaction, whereas creating a file or playing a game requires frequent interaction. To satisfy these different needs, the processor generates a prompt on the terminal display screen asking whether to perform air interaction.
S404: if the air interaction request input by the user according to the prompt is obtained, an air interaction module is started according to the air interaction request, and the air interaction module is used for identifying one or more of the face, the gesture, the voice and the somatosensory of the user and determining a corresponding interaction instruction according to an identification result.
S405: and according to the interaction instruction and the aerial imaging module, interacting with a user.
Here, if the air interaction request input by the user according to the prompt is obtained, indicating that the user has an interaction requirement, the processor may start the air interaction module according to the request. The specific functions of the air interaction module can be determined by the actual situation: gestures, movement tracks, gesture unlocking, and the like can be sensed using touch sensing or infrared laser sensing, and face recognition can be implemented using a miniature camera. The air interaction module recognizes the user's face, gestures, and so on, and determines a corresponding interaction instruction from the recognition result, so that the processor can interact with the user according to that instruction and meet the user's interaction requirement.
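Steps S403 to S405 amount to a simple control flow: prompt, recognize, then execute. The sketch below is illustrative only; `show_prompt`, `recognize`, and `execute` are hypothetical stand-ins for the terminal's actual prompt and recognition machinery.

```python
# Minimal control-flow sketch of steps S403-S405 (function names are
# illustrative, not the patent's API).
def run_interaction_flow(show_prompt, recognize, execute):
    """show_prompt() -> bool, recognize() -> instruction or None."""
    if not show_prompt():            # S403: ask whether to perform air interaction
        return "no interaction requested"
    instruction = recognize()        # S404: face/gesture/voice/somatosensory recognition
    if instruction is None:
        return "no instruction recognized"
    return execute(instruction)      # S405: interact via the aerial imaging module

result = run_interaction_flow(
    show_prompt=lambda: True,
    recognize=lambda: "swipe_left",
    execute=lambda ins: f"performed {ins}",
)
print(result)  # performed swipe_left
```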
In addition, the display screen can be provided with a protective film, which may be a film with a grating, so as to reduce interference from direct sunlight and from stray light on the imaging effect.
The terminal is provided with a protective shell. The shell folds: one side protects the rear housing of the terminal, and the other side folds and embeds into the rear housing. The shell can be folded and flipped as needed to form an angle with the terminal display screen, thereby enabling aerial imaging.
According to the embodiment of the application, the terminal acquires the identity of the user and authenticates the user with it. After authentication passes, the aerial imaging module is started to perform aerial imaging based on the picture displayed on the terminal display screen; a prompt asking whether to perform air interaction is then generated on the display screen, and if an air interaction request input by the user is obtained, the air interaction module is started according to the request to determine the corresponding interaction instruction and interact with the user. That is, the embodiment performs aerial imaging based on the picture displayed on the terminal display screen, which addresses problems such as reduced eyesight and bodily strain caused by the relatively small display screens of traditional terminal devices. In addition, interaction through the air interaction module and the aerial imaging module does not require the user to touch the terminal display screen directly, which reduces wear on the display screen to some extent while letting the user operate on the image comfortably while viewing the aerial image, meeting application requirements.
Before the air interaction module is started according to the air interaction request, it needs to be trained so that it can perform the corresponding processing once started. During training, the processor may carry out face recognition training, gesture recognition training, voice recognition training, somatosensory recognition training, and the like on the air interaction module. Taking face recognition training as an example, the processor may input a reference face image into the air interaction module and then determine the recognition accuracy from the face recognition result output by the module and the recognition result recorded for the reference image. If the accuracy is below a preset accuracy threshold, the processor can adjust the air interaction module accordingly to improve accuracy, take the adjusted module as the new air interaction module, and re-execute the step of inputting the reference face image, stopping training once the accuracy reaches the preset threshold.
The reference face image may be acquired in advance by the processor through a camera on the terminal; after acquisition, the recognition result corresponding to the reference face image is recorded so that it can be used for the face recognition training of the air interaction module.
Other training, such as gesture recognition, voice recognition, and somatosensory recognition training, follows the face recognition training described above and is not detailed here.
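The training procedure described above, evaluating accuracy against reference samples and adjusting until a preset threshold is reached, can be sketched as follows. The `evaluate` and `adjust` callables are placeholders for whatever model-update mechanism the terminal uses; nothing here is prescribed by the patent.

```python
# Illustrative training loop for the recognition module, following the text:
# evaluate accuracy on reference samples and adjust until a threshold is met.
# (evaluate/adjust are placeholders, not the patent's actual update rule.)
def train_until_accurate(module, references, evaluate, adjust,
                         threshold=0.95, max_rounds=100):
    for _ in range(max_rounds):
        accuracy = evaluate(module, references)  # compare outputs to recorded labels
        if accuracy >= threshold:
            return module, accuracy              # stop training at the threshold
        module = adjust(module, accuracy)        # adjusted module becomes the new module
    return module, accuracy

# Toy usage: "module" is just a scalar accuracy that improves on each adjustment.
module, acc = train_until_accurate(
    module=0.5,
    references=[],
    evaluate=lambda m, refs: m,
    adjust=lambda m, a: min(1.0, m + 0.1),
)
print(acc >= 0.95)  # True
```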
In addition, in the embodiment of the application, after the aerial imaging module is started, the heights of the aerial imaging module and the display screen can be adjusted so as to adjust the aerial imaging height, thereby meeting various application requirements. Fig. 7 is a flowchart of another method for displaying terminal information according to an embodiment of the present application. As shown in fig. 7, the method includes:
S701: and acquiring the identity of the user, and authenticating the user according to the identity.
S702: and if the user authentication is passed, starting an aerial imaging module, wherein the aerial imaging module is used for aerial imaging based on a picture displayed in a display screen of the terminal.
Steps S701 to S702 correspond to the descriptions of steps S401 to S402 above and are not repeated here.
S703: and judging whether the imaging height input by the user is received or not.
S704: and if the imaging height input by the user is received, acquiring the current air imaging height of the air imaging module, and controlling the terminal bracket to adjust the heights of the air imaging module and the display screen according to the imaging height input by the user and the current air imaging height so as to adjust the air imaging height.
Here, when using the terminal, if the height of the aerial image formed by the aerial imaging module does not match the imaging height the user requires, the user may input the desired imaging height at the terminal.
After the aerial imaging module is started and performs aerial imaging, the processor determines whether an imaging height input by the user has been received. If so, the processor obtains the current aerial imaging height of the module and compares it with the input height. If they differ, the processor controls the terminal bracket to adjust the heights of the aerial imaging module and the display screen according to the input imaging height, so that the adjusted aerial imaging height equals the height the user entered.
For example, as shown in fig. 8, the processor may use the terminal bracket to implement folding and bidirectional lifting according to the user's line of sight (i.e., the aerial imaging module and the terminal display screen rise or fall together, because the angle between them lies in the interval (0°, 90°], and the lifting must preserve imaging integrity).
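The height-adjustment logic of step S704 can be sketched as below; the function and parameter names are illustrative assumptions, and real bracket control would involve motor commands rather than a callback.

```python
# Illustrative sketch of step S704 (names are assumptions): compare the user's
# requested imaging height with the current aerial-imaging height and move the
# terminal bracket by the difference, keeping module and screen in sync.
def adjust_height(requested_height, current_height, move_bracket):
    if requested_height == current_height:
        return current_height                  # already at the requested height
    delta = requested_height - current_height
    move_bracket(delta)                        # raises/lowers module and screen together
    return requested_height

moves = []
new_height = adjust_height(30.0, 22.5, moves.append)
print(new_height, moves)  # 30.0 [7.5]
```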
S705: and generating a prompt on the display screen asking whether to perform air interaction.
S706: if the air interaction request input by the user according to the prompt is obtained, an air interaction module is started according to the air interaction request, and the air interaction module is used for identifying one or more of the face, the gesture, the voice and the somatosensory of the user and determining a corresponding interaction instruction according to an identification result.
S707: and according to the interaction instruction and the aerial imaging module, interacting with a user.
Steps S705 to S707 correspond to the descriptions of steps S403 to S405 and are not repeated here.
After the aerial imaging module is started, the heights of the aerial imaging module and the display screen can be adjusted to change the aerial imaging height, meeting users' various requirements for the aerial imaging height. Moreover, the embodiment of the application performs aerial imaging based on the picture displayed on the terminal display screen, which addresses problems such as reduced eyesight and bodily strain caused by the relatively small display screens of traditional terminal devices. In addition, interaction through the air interaction module and the aerial imaging module does not require the user to touch the terminal display screen directly, which reduces wear on the display screen to some extent while letting the user operate on the image comfortably while viewing the aerial image, meeting application requirements.
Here, as shown in fig. 9, the processor may first authenticate the user based on the obtained user identity. If authentication passes, the processor starts the aerial imaging module to perform aerial imaging based on the picture displayed on the terminal display screen; the module also provides aerial magnified imaging, addressing problems such as reduced eyesight and bodily strain caused by the relatively small display screens of traditional terminal devices. The processor can then control the terminal bracket to adjust the heights of the aerial imaging module and the display screen according to the imaging height input by the user, so as to adjust the aerial imaging height and meet different user requirements. The processor can also start the air interaction module according to an air interaction request input by the user, so as to recognize the user's face, gestures, voice, body movements, and the like, and determine the corresponding interaction instruction from the recognition result; the processor then interacts with the user according to that instruction, letting the user operate on the image comfortably while viewing the aerial image.
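The overall flow of fig. 9 (authenticate, image, optionally adjust height, optionally interact) can be condensed into the following sketch. All names are hypothetical; this is a reading of the text, not the patent's implementation.

```python
# End-to-end sketch of the fig. 9 flow (all names are illustrative).
def terminal_flow(identity, legal_ids, wants_interaction, imaging_height=None):
    if identity not in legal_ids:                # authentication gate
        return "authentication failed"
    steps = ["aerial imaging started"]           # S702: start aerial imaging
    if imaging_height is not None:
        steps.append(f"height set to {imaging_height}")  # S703-S704
    if wants_interaction:                        # S705-S707
        steps.append("air interaction started")
    return steps

print(terminal_flow("user-001", {"user-001"}, True, 30))
# ['aerial imaging started', 'height set to 30', 'air interaction started']
```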
Fig. 10 is a schematic structural diagram of a terminal information display device according to an embodiment of the present application, corresponding to the terminal information display method of the above embodiment. For convenience of explanation, only the portions relevant to the embodiments of the present application are shown. The terminal information display device 100 includes: an authentication module 1001, a first start module 1002, a prompt module 1003, a second start module 1004, and an interaction module 1005. The terminal information display device may be the terminal itself, or a chip or integrated circuit that realizes the functions of the terminal. It should be noted that the division into the authentication module, first start module, prompt module, second start module, and interaction module is only a division of logical functions; physically, they may be integrated or independent.
The authentication module 1001 is configured to obtain an identity of a user, and authenticate the user according to the identity.
And a first starting module 1002, configured to start an aerial imaging module if the user authentication passes, where the aerial imaging module is configured to perform aerial imaging based on a picture displayed in a display screen of the terminal.
And the prompt module 1003 is used for generating a prompt on the display screen whether to perform air interaction.
And the second starting module 1004 is configured to, if an air interaction request input by the user according to the prompt is obtained, start an air interaction module according to the air interaction request, where the air interaction module is configured to identify one or more of a face, a gesture, a voice and a somatosensory of the user, and determine a corresponding interaction instruction according to an identification result.
And the interaction module 1005 is used for interacting with the user according to the interaction instruction and the aerial imaging module.
In one possible implementation, the aerial imaging module includes a converging lens array, a flat lens array, and an aerial imaging array;
the converging lens array is used for amplifying a picture displayed in the display screen into a virtual image, the flat lens array is used for amplifying the virtual image into a real image, and the aerial imaging array is used for obtaining an aerial real image corresponding to the picture displayed in the display screen after the real image is reflected at least twice.
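The three-stage chain (converging lens array, flat lens array, aerial imaging array) can be summarized as a simple data flow. In the sketch below, the magnification factors `m1` and `m2` are placeholder values, not figures from the patent.

```python
# Illustrative data-flow sketch of the aerial imaging module; m1 and m2 are
# placeholder magnifications (assumptions, not patent values).
def aerial_imaging(picture: str, m1: float = 2.0, m2: float = 1.5) -> dict:
    virtual_image = {"source": picture, "kind": "virtual", "scale": m1}   # converging lens array
    real_image = {"source": picture, "kind": "real", "scale": m1 * m2}    # flat lens array
    # Aerial imaging array: at least two reflections yield the aerial real image.
    return {"source": picture, "kind": "aerial real",
            "scale": real_image["scale"], "reflections": 2}

result = aerial_imaging("home screen")
print(result["kind"], result["scale"])  # aerial real 3.0
```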
In one possible implementation, the aerial imaging array includes any one of an afocal lens array, a stereoscopic array that causes at least two reflections, and a combination of a retroreflective sheet and a half mirror.
In one possible implementation manner, the second starting module 1004 is specifically configured to:
performing one or more of face recognition training, gesture recognition training, voice recognition training and somatosensory recognition training on the air interaction module;
and starting the trained air interaction module according to the air interaction request.
In one possible implementation, the authentication module 1001 is specifically configured to:
judging whether the identity is one of the prestored legal user identities;
and if the identity is one of the prestored legal user identities, judging that the user authentication passes.
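The membership check performed by the authentication module can be sketched in a few lines; the set of prestored legal identities shown here is a made-up example.

```python
# Illustrative sketch of the authentication check: authentication passes only
# if the obtained identity is one of the prestored legal user identities.
# (The identity values and their storage are assumptions.)
LEGAL_IDENTITIES = {"user-001", "user-002"}   # prestored legal user identities

def authenticate(identity: str) -> bool:
    return identity in LEGAL_IDENTITIES

print(authenticate("user-001"), authenticate("guest"))  # True False
```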
In one possible implementation manner, a protective film is disposed on the display screen, and a protective shell is disposed on the terminal.
The device provided by the embodiment of the present application may be used to implement the technical scheme of the method embodiment of fig. 4; its implementation principle and technical effects are similar and are not repeated here.
Fig. 11 is a schematic structural diagram of another terminal information display device according to an embodiment of the present application. In addition to fig. 10, the terminal information display device 100 further includes: an adjustment module 1006.
The adjusting module 1006 is configured to determine whether the imaging height input by the user is received after the first starting module 1002 starts the aerial imaging module;
and if the imaging height input by the user is received, acquiring the current air imaging height of the air imaging module, and controlling a terminal bracket to adjust the heights of the air imaging module and the display screen according to the imaging height input by the user and the current air imaging height so as to adjust the air imaging height.
The device provided by the embodiment of the present application may be used to implement the technical scheme of the method embodiment of fig. 7; its implementation principle and technical effects are similar and are not repeated here.
Alternatively, fig. 12A and 12B schematically provide one possible basic hardware architecture of the terminal according to the present application, respectively.
Referring to fig. 12A and 12B, the terminal includes at least one aerial imaging module, an aerial interaction module, a processor 1201, and a communication interface 1203. Further optionally, a memory 1202 and bus 1204 may also be included.
Where the number of processors 1201 in a terminal may be one or more, fig. 12A and 12B illustrate only one of the processors 1201. Optionally, the processor 1201 may be a central processing unit (central processing unit, CPU), a graphics processor (graphics processing unit, GPU) or a digital signal processor (digital signal processor, DSP). If the terminal has a plurality of processors 1201, the types of the plurality of processors 1201 may be different or may be the same. Optionally, the multiple processors 1201 of the terminal may also be integrated into a multi-core processor.
Memory 1202 stores computer instructions and data; it may store the computer instructions and data necessary for implementing the terminal information display method provided by the present application, for example, instructions for implementing the steps of that method. Memory 1202 may be any one or any combination of the following storage media: non-volatile memory (e.g., read-only memory (ROM), solid-state drive (SSD), hard disk drive (HDD), optical disc) or volatile memory.
The communication interface 1203 may provide information input/output for the at least one processor. It may also include any one or any combination of the following devices with network access functionality: a network interface (e.g., an Ethernet interface), a wireless network card, and the like.
Optionally, the communication interface 1203 may also be used for data communication of the terminal with other computing devices or terminals.
Further alternatively, fig. 12A and 12B represent bus 1204 with a thick line. The bus 1204 may connect the processor 1201 with the memory 1202 and the communication interface 1203. Thus, through the bus 1204, the processor 1201 may access the memory 1202 and may also utilize the communication interface 1203 to interact data with other computing devices or terminals.
In the present application, the terminal executes the computer instructions in the memory 1202, so that the terminal implements the above-described terminal information display method provided by the present application, or so that the terminal deploys the above-described terminal information display device.
From the perspective of logical functional partitioning, as illustrated in fig. 12A, the memory 1202 may include an authentication module 1001, a first start module 1002, a prompt module 1003, a second start module 1004, and an interaction module 1005. This inclusion is not limited to a physical structure; it may mean only that, when the instructions stored in the memory are executed, the functions of the authentication module, first start module, prompt module, second start module, and interaction module are realized.
In one possible design, as shown in fig. 12B, the memory 1202 includes an adjustment module 1006. Again, this inclusion is not limited to a physical structure and may mean only that the instructions stored in the memory, when executed, perform the functions of the adjustment module.
In addition to the software implementation of figs. 12A and 12B, the terminal may also be implemented in hardware, as a hardware module or a circuit unit.
The present application provides a computer-readable storage medium storing computer instructions that instruct a computing device to execute the terminal information display method provided by the present application.
An embodiment of the present application provides a computer program product, including computer instructions, where the computer instructions are executed by a processor to perform the method for displaying terminal information provided by the present application.
The application provides a chip comprising at least one processor and a communication interface providing information input and/or output for the at least one processor. Further, the chip may also include at least one memory for storing computer instructions. The at least one processor is used for calling and running the computer instructions to execute the terminal information display method provided by the application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

Claims (8)

1. A terminal information display method, wherein the method is applied to a terminal, the method comprising:
acquiring an identity of a user, and authenticating the user according to the identity;
if the user authentication is passed, an aerial imaging module is started, and the aerial imaging module is used for aerial imaging based on a picture displayed in a display screen of the terminal;
generating a prompt on the display screen whether to perform air interaction;
if the air interaction request input by the user according to the prompt is obtained, an air interaction module is started according to the air interaction request, and the air interaction module is used for identifying one or more of the face, the gesture, the voice and the somatosensory of the user and determining a corresponding interaction instruction according to an identification result;
According to the interaction instruction and the aerial imaging module, interacting with the user;
after the activation of the aerial imaging module, further comprising:
judging whether the imaging height input by the user is received or not;
if the imaging height input by the user is received, acquiring the current air imaging height of the air imaging module, and controlling a terminal bracket to adjust the heights of the air imaging module and the display screen according to the imaging height input by the user and the current air imaging height so as to adjust the air imaging height;
the aerial imaging module comprises a converging lens array, a flat lens array and an aerial imaging array;
the converging lens array is used for amplifying a picture displayed in the display screen into a virtual image, the flat lens array is used for amplifying the virtual image into a real image, and the aerial imaging array is used for obtaining an aerial real image corresponding to the picture displayed in the display screen after the real image is reflected at least twice.
2. The method of claim 1, wherein the aerial imaging array comprises any one of an afocal lens array, a stereoscopic array that causes at least two reflections, and a combination retroreflective sheeting and half mirror.
3. The method of claim 1, further comprising, prior to said initiating an over-the-air interaction module in accordance with said over-the-air interaction request:
performing one or more of face recognition training, gesture recognition training, voice recognition training and somatosensory recognition training on the air interaction module;
and starting an air interaction module according to the air interaction request, wherein the method comprises the following steps:
and starting the trained air interaction module according to the air interaction request.
4. The method of claim 1, wherein authenticating the user based on the identity comprises:
judging whether the identity is one of prestored legal user identity;
and if the identity is one of the prestored legal user identities, judging that the user authentication passes.
5. The method of claim 1, wherein a protective film is provided on the display screen and a protective housing is provided on the terminal.
6. A terminal information display device, the device being applied to a terminal, the device comprising:
an authentication module, configured to acquire an identity of a user and authenticate the user according to the identity;
a first starting module, configured to start an aerial imaging module if the user authentication passes, wherein the aerial imaging module is configured to perform aerial imaging based on a picture displayed on a display screen of the terminal;
a prompt module, configured to generate, on the display screen, a prompt asking whether to perform air interaction;
a second starting module, configured to, if an air interaction request input by the user according to the prompt is acquired, start an air interaction module according to the air interaction request, wherein the air interaction module is configured to recognize one or more of the face, gestures, voice, and somatosensory input of the user and determine a corresponding interaction instruction according to the recognition result;
an interaction module, configured to interact with the user according to the interaction instruction and the aerial imaging module; and
an adjusting module, configured to judge, after the aerial imaging module is started, whether an imaging height input by the user is received; and, if the imaging height input by the user is received, acquire the current aerial imaging height of the aerial imaging module and control a terminal bracket to adjust the heights of the aerial imaging module and the display screen according to the imaging height input by the user and the current aerial imaging height, so as to adjust the aerial imaging height;
wherein the aerial imaging module comprises a converging lens array, a flat lens array, and an aerial imaging array; and
the converging lens array is configured to magnify the picture displayed on the display screen into a virtual image, the flat lens array is configured to magnify the virtual image into a real image, and the aerial imaging array is configured to obtain, after the real image is reflected at least twice, an aerial real image corresponding to the picture displayed on the display screen.
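The adjusting module in claim 6 moves the bracket by the difference between the requested and current aerial imaging heights. A minimal sketch of that logic, with a hypothetical `Bracket` standing in for the terminal bracket actuator (the patent names no hardware interface):

```python
from dataclasses import dataclass

@dataclass
class Bracket:
    """Hypothetical terminal bracket actuator that raises or lowers the
    aerial imaging module and display screen together."""
    position_mm: float = 0.0

    def move_by(self, delta_mm: float) -> None:
        self.position_mm += delta_mm

def adjust_imaging_height(bracket: Bracket,
                          current_height_mm: float,
                          requested_height_mm: float) -> float:
    """Move the bracket by (requested - current) so the aerial image lands
    at the user-requested height; return the resulting imaging height."""
    delta = requested_height_mm - current_height_mm
    bracket.move_by(delta)
    return current_height_mm + delta
```

A negative delta lowers the bracket, so the same routine covers both raising and lowering the image.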
7. A terminal, comprising:
an aerial imaging module; an air interaction module; a processor;
a memory; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of any of claims 1-5.
8. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed, causes a server to perform the method of any one of claims 1-5.
CN202110584897.7A 2021-05-27 2021-05-27 Terminal information display method and device, terminal and storage medium Active CN113282174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110584897.7A CN113282174B (en) 2021-05-27 2021-05-27 Terminal information display method and device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN113282174A CN113282174A (en) 2021-08-20
CN113282174B true CN113282174B (en) 2023-10-17

Family

ID=77282005

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110584897.7A Active CN113282174B (en) 2021-05-27 2021-05-27 Terminal information display method and device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113282174B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838395A (en) * 2021-09-18 2021-12-24 湖南美景创意文化建设有限公司 Ultra-high contrast medium-free aerial imaging display screen for museum
CN115206215A (en) * 2022-06-10 2022-10-18 上海丹诺西诚智能科技有限公司 Position adjusting method and system for aerial image imaging projection pattern

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005208308A (en) * 2004-01-22 2005-08-04 Yasuaki Tanaka Stereoscopic projector
JP2011081296A (en) * 2009-10-09 2011-04-21 Pioneer Electronic Corp Display device
CN102150072A (en) * 2008-07-10 2011-08-10 实景成像有限公司 Broad viewing angle displays and user interfaces
CN104977794A (en) * 2015-06-19 2015-10-14 浙江吉利控股集团有限公司 Portable air projection device
JP2016006447A (en) * 2014-06-20 2016-01-14 船井電機株式会社 Image display device
CN105938425A (en) * 2015-03-03 2016-09-14 三星电子株式会社 Method of displaying image and electronic device
CN107113949A (en) * 2014-12-26 2017-08-29 日立麦克赛尔株式会社 Lighting device
WO2019039600A1 (en) * 2017-08-25 2019-02-28 林テレンプ株式会社 Aerial image display device
TWM585357U (en) * 2019-04-29 2019-10-21 大陸商北京眸合科技有限公司 Optical system for realizing aerial suspension type display
CN111133357A (en) * 2017-09-20 2020-05-08 三星电子株式会社 Optical lens assembly and electronic device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102324083B1 (en) * 2014-09-01 2021-11-09 삼성전자주식회사 Method for providing screen magnifying and electronic device thereof
KR20160027679A (en) * 2014-09-02 2016-03-10 삼성전자주식회사 Display device
EP3276608A4 (en) * 2015-03-26 2018-10-24 Kyocera Document Solutions Inc. Visible-image formation device and image formation device
US11796959B2 (en) * 2019-01-25 2023-10-24 International Business Machines Corporation Augmented image viewing with three dimensional objects
CN111868657A (en) * 2020-05-15 2020-10-30 曹庆恒 Suspension display device and using method thereof


Similar Documents

Publication Publication Date Title
US20210350623A1 (en) 3d object rendering using detected features
US10656423B2 (en) Head mounted display apparatus
EP3106911B1 (en) Head mounted display apparatus
CN113282174B (en) Terminal information display method and device, terminal and storage medium
KR20190133204A (en) Accumulation and Reliability Allocation of Iris Codes
CN104216118A (en) Head Mounted Display With Remote Control
US11675192B2 (en) Hybrid coupling diffractive optical element
CN104767962A (en) Multipurpose conference terminal and multipurpose conference system
WO2020036821A1 (en) Identification method and apparatus and computer-readable storage medium
JP6528680B2 (en) Display device, display method and display program
US9568732B2 (en) Mobile terminal
CN106020480A (en) Virtual reality device and image processing method of virtual reality images
CN104052977A (en) Interactive image projection method and device
US20220197028A1 (en) Method and electronic device for providing augmented reality environment
KR102043156B1 (en) Mobile terminal and method for controlling the same
KR102067599B1 (en) Mobile terminal and method for controlling the same
US20230367117A1 (en) Eye tracking using camera lens-aligned retinal illumination
WO2023224878A1 (en) Eye tracking using camera lens-aligned retinal illumination
CN114445605A (en) Free-form surface simulation method and device
KR20220169207A (en) Electronic device and biometric liveness authentication method
CN117372545A (en) Pose information determining method and system
KR101196570B1 (en) An apparatus for experiencing
JP2006154900A (en) Hand-written image display system, and portable information terminal for space hand-writing
CN109816746A (en) Sketch image generation method and Related product
NZ795513A (en) 3D object rendering using detected features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant