CN111782063B - Real-time display method and system, computer readable storage medium and terminal equipment - Google Patents


Info

Publication number
CN111782063B
CN111782063B (application CN202010512742.8A)
Authority
CN
China
Prior art keywords
user input
dimensional space
coordinate axes
input equipment
real
Prior art date
Legal status
Active
Application number
CN202010512742.8A
Other languages
Chinese (zh)
Other versions
CN111782063A (en
Inventor
杨宇
贺志军
林明田
陈镜州
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010512742.8A priority Critical patent/CN111782063B/en
Publication of CN111782063A publication Critical patent/CN111782063A/en
Application granted granted Critical
Publication of CN111782063B publication Critical patent/CN111782063B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/147 Digital output to display device; cooperation and interconnection of the display device with other functional units, using display panels

Abstract

The embodiment of the invention discloses a real-time display method and system, a computer-readable storage medium and a terminal device, applied in the technical field of communications equipment. A user input device detects the detection signals emitted in turn by the signal emission sources of the three coordinate axes in a spatial positioning subsystem. The position of the user input device in three-dimensional space is determined from the time difference between the moment each signal emission source starts emitting and the moment the user input device detects its detection signal; from the successive positions, the movement information of the user input device is determined, and display is then performed in the three-dimensional space where the user input device is located according to that movement information. During interaction between the user and the real-time display system, user input is thus not confined to a touch display screen but takes place in three-dimensional space, making real-time display more diversified.

Description

Real-time display method and system, computer readable storage medium and terminal equipment
Technical Field
The present invention relates to the field of communications devices, and in particular, to a real-time display method, a real-time display system, a computer-readable storage medium, and a terminal device.
Background
At present, a terminal device in the existing real-time display technology generally includes a touch display screen and a display control module. A user inputs information on the touch display screen with a stylus or a finger, for example characters or pictures, and the display control module controls the touch display screen to display the input in real time; for characters input by the user, the display control module can also recognize them and display the recognition result on the touch display screen. This existing real-time display technology is single in form and cannot meet users' demand for diversified real-time display.
Disclosure of Invention
The embodiment of the invention provides a real-time display method, a real-time display system, a computer readable storage medium and terminal equipment, which realize the input of a user in a three-dimensional space and the real-time display in the three-dimensional space.
An embodiment of the present invention provides a real-time display method, including:
when the user input equipment is in a three-dimensional space below the space positioning subsystem, detecting detection signals respectively emitted by signal emission sources of three coordinate axes in the space positioning subsystem through the user input equipment;
respectively acquiring time differences between the moment when the signal emission sources of the three coordinate axes start to emit signals and the moment when the user input equipment detects detection signals;
determining the position of the user input equipment in the three-dimensional space according to the time difference corresponding to the three coordinate axes respectively;
determining the movement information of the user input equipment in the three-dimensional space according to the position of the user input equipment in the three-dimensional space;
and displaying in the three-dimensional space according to the movement information.
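The five steps above can be sketched end to end as a small processing skeleton. Every name below (realtime_display_frame, locate, render) is hypothetical and not part of the patent; the geometric solve and the display path are left as caller-supplied callbacks.

```python
# Hypothetical skeleton of the claimed five-step method.

def realtime_display_frame(detect_times_ms, sweep_starts_ms, prev_position,
                           locate, render):
    # Step 2: per-axis time difference between the moment each signal
    # emission source starts emitting and the moment of detection.
    dts = [d - s for d, s in zip(detect_times_ms, sweep_starts_ms)]
    # Step 3: position in 3D space from the three time differences
    # (the actual plane-intersection geometry is delegated to `locate`).
    position = locate(dts)
    # Step 4: movement information from successive positions.
    movement = tuple(p - q for p, q in zip(position, prev_position))
    # Step 5: display in the 3D space according to the movement.
    render(movement)
    return position
```

Here `locate` stands in for the position determination of the third step and `render` for the in-space display of the fifth.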
In another aspect, an embodiment of the present invention further provides a real-time display system, including:
the detection unit is used for detecting, through the user input equipment, the detection signals respectively emitted by the signal emission sources of the three coordinate axes in the space positioning subsystem when the user input equipment is in the three-dimensional space below the space positioning subsystem;
the acquisition unit is used for respectively acquiring the time difference between the moment when the signal emission sources of the three coordinate axes start to emit signals and the moment when the user input equipment detects the detection signals;
the position determining unit is used for determining the position of the user input equipment in a three-dimensional space according to the time difference corresponding to the three coordinate axes respectively;
the movement determining unit is used for determining the movement information of the user input equipment in the three-dimensional space according to the position of the user input equipment in the three-dimensional space;
and the space display unit is used for displaying in the three-dimensional space according to the movement information.
In another aspect, an embodiment of the present invention further provides a computer-readable storage medium storing a plurality of computer programs, the computer programs being adapted to be loaded by a processor to execute the real-time display method according to the embodiment of the present invention.
In another aspect, an embodiment of the present invention further provides a terminal device, including a processor and a memory;
the memory is used for storing a plurality of computer programs, and the computer programs are used for being loaded by the processor and executing the real-time display method according to the aspect of the embodiment of the invention; the processor is configured to implement each of the plurality of computer programs.
It can be seen that, in the method of this embodiment, the user input device detects the detection signals respectively emitted by the signal emission sources of the three coordinate axes in the spatial positioning subsystem, and the position of the user input device in three-dimensional space is determined from the time difference between the moment each signal emission source starts emitting and the moment the user input device detects the corresponding signal. From successive positions, the movement information of the user input device is determined, and display is then performed, according to the movement information, in the three-dimensional space where the user input device is located. A user can therefore move the user input device in the three-dimensional space below the spatial positioning subsystem to input information, and the real-time display system displays that input correspondingly in the same three-dimensional space. During interaction between the user and the real-time display system, user input is not confined to a touch display screen but takes place in three-dimensional space, making real-time display more diversified.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic structural diagram of a real-time display system according to an embodiment of the present invention;
FIG. 2 is a schematic illustration of lenticular lens imaging in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a spatial positioning subsystem emitting laser signals in an embodiment of the present invention;
FIG. 4 is a schematic illustration of the operation of a negative index plate in an embodiment of the present invention;
FIG. 5 is a flow chart of a real-time display method in one embodiment of the invention;
FIG. 6 is a timing diagram of the detection, by the user input device, of the signals emitted by the spatial positioning subsystem in an embodiment of the present invention;
FIG. 7 is a schematic diagram of a logical structure of a real-time display system in an embodiment of the present invention;
FIG. 8 is a schematic diagram of a user input device in an embodiment of the present invention;
FIG. 9 is a schematic diagram of a real-time display method in an embodiment of the present invention;
FIG. 10 is a diagram illustrating a movement trace set by a user input device in an embodiment of the present invention;
FIG. 11a is a diagram illustrating a specific display implemented by the display control apparatus in an embodiment of the present invention;
FIG. 11b is a schematic diagram illustrating the effect of certain instructions executed by the display control device in an embodiment of the present invention;
FIG. 12 is a schematic structural diagram of a real-time display system according to an embodiment of the present invention;
fig. 13 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiment of the present invention provides a real-time display method, which is mainly a method executed by a real-time display system shown in fig. 1, where the real-time display system includes a display control device 10, a spatial positioning subsystem 20, a user input device 30, and a negative refractive index plate 40, and the real-time display system may perform real-time display of user input according to the following steps:
when the user input equipment is in a three-dimensional space below the space positioning subsystem, detecting detection signals respectively emitted by signal emission sources of three coordinate axes in the space positioning subsystem through the user input equipment; respectively acquiring time differences between the moment when the signal emission sources of the three coordinate axes start to emit signals and the moment when the user input equipment detects detection signals; determining the position of the user input equipment in the three-dimensional space according to the time difference corresponding to the three coordinate axes respectively; determining the movement information of the user input equipment in the three-dimensional space according to the position of the user input equipment in the three-dimensional space; and displaying in the three-dimensional space where the user input equipment is located according to the movement information.
In this way, the user can use the user input device 30 to move in the three-dimensional space below the spatial positioning subsystem 20 and input certain information, and the information input by the user input device 30 in the three-dimensional space can be correspondingly displayed in the three-dimensional space through the real-time display system, so that in the process of interaction between the user and the real-time display system, the user input is not limited on the touch display screen, but in the three-dimensional space, the real-time display is more diversified.
It should be noted that, in a specific application, the display control device 10 may include a control calculation unit 11, a communication unit 12, a display screen 13 and a lenticular lens 14. The communication unit 12 is responsible for communication with the spatial positioning subsystem 20 and the user input device 30. The control calculation unit 11 controls the signal emission sources of the three coordinate axes in the spatial positioning subsystem 20 to emit detection signals, and determines the position of the user input device 30 in three-dimensional space according to information acquired from the user input device 30, such as the time differences or the moments at which the detection signals were detected. The lenticular lens 14 is installed on the display screen 13; after the information displayed on the display screen 13 passes through the lenticular lens 14, a naked-eye three-dimensional (3D) display effect is achieved.
The principle of lenticular-lens 3D technology is to add a layer of lenticular (cylindrical) lenses in front of the display screen so that the image plane of the display screen lies on the focal plane of the lenses. The pixels of the image under each cylindrical lens are divided into several sub-pixels, which the lens projects in different directions; a user watching the screen from different angles therefore sees different sub-pixels, and the left and right eyes receive different images with a certain parallax. The human brain fuses these two images into a stereoscopic impression.
Specifically, as shown in fig. 2, the display panel includes the sub-pixels A1, A2, B1, B2, C1, C2, D1 and D2. Sub-pixels A1 and D1 are pixel points of one color and, after passing through the cylindrical lens, form a stereoscopic impression of that color in the user's eyes; sub-pixels A2 and D2 are pixel points of another color and likewise form a stereoscopic impression of that other color after passing through the cylindrical lens.
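The column-by-column split of sub-pixels between the two views can be illustrated with a toy interlacer. This is a generic sketch of lenticular two-view interleaving, not code from the patent; the even/odd column assignment is an assumption for illustration.

```python
def interlace_columns(left_view, right_view):
    """Interleave two views column by column, the way a lenticular display
    alternates sub-pixel columns (e.g. A1/A2) under each cylindrical lens.

    Each view is a list of rows; each row is a list of pixel values.
    Even output columns come from the left view, odd ones from the right
    (an assumed layout; real panels vary).
    """
    out = []
    for lrow, rrow in zip(left_view, right_view):
        out.append([lrow[c] if c % 2 == 0 else rrow[c]
                    for c in range(len(lrow))])
    return out
```

Viewed through the lens array, the even columns reach one eye and the odd columns the other, producing the parallax described above.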
In a specific application, the spatial positioning subsystem 20 in the real-time display system is used in cooperation with the user input device 30, wherein:
as shown in fig. 3, the stereo space customizing subsystem 20 may employ signal emitting sources with three coordinate axes, specifically, three surface laser sources X, Y and Z, respectively installed on X, Y and Z axes for time-sharing and rotating to emit laser signals, where a period of each surface laser source for rotating to emit laser signals is 10ms, and a period of each three surface laser sources for rotating 10ms is 10ms, for example, the surface laser source Y shown in fig. 3 rotates by a corresponding angle to reach surface 1, surface 2, and surface 3 to emit laser signals; a synchronous infrared Light Emitting Diode (LED) is arranged at the intersection (i.e. the origin) of the three coordinate axes, and an infrared synchronous signal lasting for 2ms is sent at the beginning of each period. The user input device 30 is embodied as an electronic pen, and an infrared laser detector is installed on the user input device 30 and can detect an infrared synchronization signal and a laser signal emitted from a surface laser source of each coordinate axis.
After the synchronous infrared LED emits the infrared synchronization signal, since that signal covers 360 degrees, the user input device 30 can detect it at any position in the three-dimensional space. When the user input device 30 detects the infrared synchronization signal, it immediately starts a timer; the surface laser sources of the three coordinate axes X, Y and Z then each perform a 360-degree rotating scan of the three-dimensional space in time sequence. When the surface laser source of any coordinate axis has rotated to an angle at which the user input device 30 lies on its scanning plane (for example, the surface laser source Y shown in fig. 3 rotates to reach surface 1 while the user input device 30 is on surface 1), the user input device 30 detects the laser signal and records the moment of detection. The user input device 30 can then determine the time difference between the moment that axis's surface laser source started emitting and the moment the laser signal was detected, and transmit this time difference wirelessly to the display control device 10. From the time differences, the display control device 10 calculates the rotation angle of each surface laser source's laser signal, thereby determining the three scanning planes on which the user input device 30 lies; these three planes intersect at a point, which is the position of the user input device 30 in the three-dimensional space.
Further, the negative refractive index plate 40 installed above the display control device 10 in the real-time display system projects the input information displayed by the display control device 10 into the stereoscopic space where the user input device 30 is located, giving the user the visual impression that the real-time display system displays, in real time and in three-dimensional space, the information the user inputs through the user input device 30. For example, if the user draws a bird in the three-dimensional space through the user input device 30, the real-time display system displays the drawn bird in that space in real time, consistent with the effect achieved when a user inputs characters or images on a touch display screen.
The negative refractive index plate 40 has the following properties: the incident ray and the refracted ray lie on the same side of the normal, so light from an object on one side of the negative refractive index plate 40 converges again on the other side to form a real image. The position and orientation of the real image can be adjusted by adjusting the angle of the negative refractive index plate 40; for example, in fig. 4 the plate forms, on the other side and in the vertical direction, a real image of the points A1, D1, A2 and D2 that lie in the horizontal direction on the object side. In the present embodiment, information displayed on the horizontal plane by the display control device 10 can thus be displayed on the vertical plane on the other side of the negative refractive index plate 40.
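For intuition only: an ideal flat negative-index (n = -1) slab of thickness t refocuses a point source at distance d (< t) in front of it to a real image at distance t - d behind it. This textbook flat-lens relation is an assumption used for illustration; the patent does not specify the plate's index or geometry.

```python
def image_distance(object_distance, plate_thickness):
    """Ideal n = -1 flat slab: a source at object_distance in front of the
    slab re-converges at (plate_thickness - object_distance) behind it.

    Illustrative assumption only; valid when the source sits closer to the
    slab than its thickness, otherwise no real image forms behind it.
    """
    if not 0 < object_distance < plate_thickness:
        raise ValueError("source must sit closer than the slab thickness")
    return plate_thickness - object_distance
```

Tilting the plate, as described above, correspondingly tilts the plane in which this re-converged image appears.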
Typically, the display control device 10 is fixed at an angle to the negative refractive index plate 40, and the spatial positioning subsystem 20 may be mounted above the display control device 10, for example on the ceiling, so that the user can input information in the three-dimensional space above the negative refractive index plate 40 using the user input device 30. The user input device 30 and the display control device 10 must be on opposite sides of the negative refractive index plate 40.
An embodiment of the present invention provides a real-time display method, which is a method executed by the real-time display system, and a flowchart is shown in fig. 5, where the method includes:
step 101, when the user input device 30 is in the three-dimensional space below the spatial positioning subsystem 20, detecting detection signals respectively emitted by signal emission sources of three coordinate axes in the spatial positioning subsystem 20 through the user input device 30.
Specifically, the display control device 10 in the real-time display system controls the synchronous infrared LED in the spatial positioning subsystem 20 to emit an infrared synchronization signal at the start of each period, and controls the signal emission sources of the coordinate axes to emit detection signals in time sequence; the signal emission source of each coordinate axis rotates around its coordinate axis while emitting, so each detection signal is a planar signal. The user input device 30 can detect the infrared synchronization signal at any position in the three-dimensional space, and detects a detection signal whenever it lies on the plane of the signal emitted by a coordinate axis's emission source.
And 102, respectively acquiring the time difference between the moment when the signal emission sources of the three coordinate axes start to emit signals and the moment when the user input equipment detects the detection signals.
For example, as shown in fig. 6, the synchronous infrared LED in the spatial positioning subsystem 20 emits a 2 ms infrared synchronization signal at the beginning of each period; the user input device 30 detects it and immediately starts timing. During the following 10 ms, while the signal emission source of the X axis, i.e. the surface laser source X, emits its laser signal, the user input device 30 detects the laser at some moment and records the time difference ΔT1 between the stop of the infrared synchronization signal (i.e. the moment the surface laser source X starts emitting) and the moment of detection. During the next 10 ms, while the surface laser source Y emits its laser signal, the user input device 30 records the time difference ΔT2 between the moment the surface laser source X stops emitting (i.e. the moment the surface laser source Y starts emitting) and the moment of detection. During the final 10 ms, while the surface laser source Z emits its laser signal, the user input device 30 records the time difference ΔT3 between the moment the surface laser source Y stops emitting (i.e. the moment the surface laser source Z starts emitting) and the moment of detection.
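The timing bookkeeping of this step can be sketched as follows. The 2 ms synchronization pulse and the 10 ms sweep windows come from the description above; the function and parameter names are hypothetical.

```python
SYNC_MS = 2.0    # infrared synchronization pulse at the start of each period
SWEEP_MS = 10.0  # each surface laser source sweeps for 10 ms: X, then Y, then Z

def time_diffs(sync_start_ms, detect_x_ms, detect_y_ms, detect_z_ms):
    """Return (dT1, dT2, dT3): time from each sweep's start to its detection.

    The X sweep starts when the sync pulse stops; the Y sweep starts when
    X stops; the Z sweep starts when Y stops.
    """
    x_start = sync_start_ms + SYNC_MS
    y_start = x_start + SWEEP_MS
    z_start = y_start + SWEEP_MS
    return (detect_x_ms - x_start,
            detect_y_ms - y_start,
            detect_z_ms - z_start)
```

As noted below, this arithmetic may run either on the user input device 30 or on the display control device 10, depending on which side records only raw detection moments.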
It should be noted that, in practice, the time differences in this step may be calculated and recorded by the user input device 30 and then transmitted to the display control device 10; in this case the user input device 30 computes each time difference using the moment the corresponding coordinate axis's signal emission source starts emitting. Alternatively, the time differences may be determined by the display control device 10; in this case the user input device 30 is only responsible for recording the moments at which it detects the signals emitted by each coordinate axis's emission source and transmitting these moments to the display control device 10, which then calculates the time difference for each coordinate axis using the moment that axis's emission source started emitting.
And 103, determining the position of the user input equipment in the three-dimensional space according to the time difference corresponding to the three coordinate axes respectively.
Specifically, if the signal emission sources of the three coordinate axes are surface laser sources, the display control device 10 in the real-time display system determines, from the time difference of each coordinate axis, the rotation angle of the laser signal emitted by that axis's surface laser source, and then determines the intersection point of the three planes in which the laser signals lie as the position of the user input device in the three-dimensional space. To find this intersection point, the display control device 10 first obtains, from the rotation angles, a functional representation of each plane in which a laser signal lies, and then solves these representations for the coordinates of the intersection point in the three-dimensional space, i.e. the position of the user input device 30.
Specifically, a fixed functional relationship between the time difference and the rotation angle is preset in the display control device 10 in the real-time display system. For example, if each coordinate axis's signal emission source emits for 10 ms per sweep, the rotation angle is θ = ΔT × 360° / 10 ms.
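The angle relationship above, together with the plane-intersection step, can be sketched as follows. The patent does not give explicit plane equations, so the sketch assumes each scanning plane is already expressed as a linear equation a·p = b and that the three planes meet at a single point, reducing the position solve to a 3×3 linear system; all function names are illustrative.

```python
def rotation_angle_deg(dt_ms, sweep_ms=10.0):
    """theta = dT * 360 / sweep duration, per the preset relationship."""
    return dt_ms * 360.0 / sweep_ms

def intersect_planes(a, b):
    """Find the single point shared by three planes a[i] . p = b[i]
    by Gaussian elimination with partial pivoting on the 3x3 system.

    `a` is a list of three plane normals, `b` the three offsets
    (an assumed parameterization; the patent only states that the
    three scanning planes intersect at a point).
    """
    m = [row[:] + [bi] for row, bi in zip(a, b)]  # augmented matrix
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[r][3] / m[r][r] for r in range(3)]
```

For instance, a time difference of 2.5 ms within a 10 ms sweep corresponds to a 90-degree rotation of that axis's scanning plane.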
And step 104, determining the movement information of the user input device 30 in the three-dimensional space according to the position of the user input device 30 in the three-dimensional space.
It should be noted that, when inputting information in the three-dimensional space, the user input device 30 usually moves continuously. The position of the user input device 30 at each moment can be obtained through steps 101 to 103, so its movement trajectory can be derived, for example continuous upward movement, continuous rightward movement, or intermittent clicks at different positions.
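The trajectory step can be sketched as classifying the dominant motion between successive sampled positions. The axis labels and the minimum-step threshold are illustrative assumptions, not values from the patent.

```python
def classify_motion(positions, min_step=1e-3):
    """Label each interval between successive 3D positions of the pen as a
    dominant-axis move ('+y' for upward, '+x' for rightward, ...) or 'hold'.

    Axis meanings and the min_step threshold are illustrative assumptions.
    """
    labels = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        deltas = {"x": x1 - x0, "y": y1 - y0, "z": z1 - z0}
        axis = max(deltas, key=lambda k: abs(deltas[k]))
        if abs(deltas[axis]) < min_step:
            labels.append("hold")
        else:
            labels.append(("+" if deltas[axis] > 0 else "-") + axis)
    return labels
```

A run of "+y" labels then corresponds to the continuous upward movement mentioned above, while isolated "hold" labels between jumps correspond to intermittent clicks.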
And 105, displaying in the stereoscopic space where the user input device 30 is located according to the movement information.
Specifically, the display control device 10 displays information on the display screen 13 according to the movement information of the user input device 30, and the information displayed on the display screen 13 is then projected, through the negative refractive index plate 40 arranged above it, into the three-dimensional space where the user input device 30 is located. Further, because the lenticular lens 14 is arranged on the display screen 13, a naked-eye 3D effect is achieved, so the image displayed in the stereoscopic space where the user input device 30 is located also has a naked-eye 3D effect.
In one case, when the user input device 30 is in a writing state, in which any text, characters, images, and the like can be input, the display control device 10 in the real-time display system directly displays the movement track of the user input device 30 on the display screen 13, and this track is projected through the negative refractive index plate 40 arranged above the screen into the three-dimensional space where the user input device 30 is located. The user can operate the user input device 30 to place it in the writing state; the device then transmits the writing-state information to the display control device 10, so that the display control device 10 knows to display the movement track of the user input device 30 directly.
In another case, when the user input device 30 is in an instruction state, the display control device 10 in the real-time display system may determine the specific instruction corresponding to the movement trajectory of the user input device 30 according to a preset correspondence between movement trajectories and specific instructions, and then display according to the determined specific instruction in the three-dimensional space where the user input device 30 is located. Specifically, the display control device 10 may display the operation result of the determined specific instruction on the display screen 13, and this operation result is projected through the negative refractive index plate 40 arranged above the screen into the three-dimensional space where the user input device 30 is located. The user can operate the user input device 30 to place it in the instruction state; the device then transmits the instruction-state information to the display control device 10, so that the display control device 10 knows to display the operation result of the specific instruction.
The preset correspondence between movement trajectories and specific instructions may be configured in the display control device 10 by the user through the user input device 30. Specifically, the display control device 10 may display a setting interface for the correspondence, which includes an input interface for each of a number of specific instructions; the user inputs the movement trajectory for each specific instruction through its input interface. When the display control device 10 receives a movement trajectory from the input interface of any specific instruction, it stores the correspondence between the received movement trajectory and that instruction. Note that, since the negative refractive index plate 40 is installed above the display control device 10, the setting interface displayed on the display screen 13 during this configuration process can likewise be projected into the above-mentioned stereoscopic space through the negative refractive index plate 40.
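The stored correspondence can be sketched as a simple lookup table keyed by trajectory. The symbolic trajectory labels and method names below are assumptions of this sketch; a real system would match raw point sequences against the recorded ones rather than exact labels:

```python
class GestureRegistry:
    """Stores the user-defined trajectory -> instruction correspondence."""

    def __init__(self):
        self._table = {}

    def record(self, instruction, trajectory_label):
        """Called when the setting interface receives a trajectory
        on the input interface of a specific instruction."""
        self._table[trajectory_label] = instruction

    def lookup(self, trajectory_label):
        """Resolve a recognized trajectory to its instruction, or None."""
        return self._table.get(trajectory_label)
```

For example, recording the label "up" against a "rotate view" instruction lets the display controller later resolve an upward stroke to that instruction in a single lookup.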
In addition, the negative refractive index plate 40 and the display control device 10 may be fixedly installed, in which case the factory manufacturer mounts them at a fixed angle chosen for a particular display effect; alternatively, the end user may adjust the angle between the negative refractive index plate 40 and the display control device 10 during use. The display control device 10 and the spatial positioning subsystem 20, however, are not fixedly installed at the factory and must be installed by the end user before use, for example with the spatial positioning subsystem 20 mounted directly above the display control device 10. After installation, calibration needs to be performed among the display control device 10, the spatial positioning subsystem 20, the user input device 30, and the negative refractive index plate 40 in the real-time display system. The purpose of this calibration is that, through the cooperation of all parts of the real-time display system, the line trajectory displayed in the three-dimensional space where the user input device 30 is located coincides with the motion trajectory of the user input device 30, so that a real drawing scene can be accurately simulated.
It can be seen that, in the method of this embodiment, the user input device 30 in the real-time display system detects the detection signals respectively emitted by the signal emission sources of the three coordinate axes in the spatial positioning subsystem 20, and the position of the user input device 30 in the three-dimensional space is determined from the time differences between the moments when the signal emission sources of the three coordinate axes start to emit signals and the moments when the user input device 30 detects the corresponding signals. From these positions the movement information of the user input device 30 can be determined and then displayed in the three-dimensional space where the user input device 30 is located. In this way, the user can move the user input device 30 in the three-dimensional space below the spatial positioning subsystem 20 to input information, and the real-time display system displays that information correspondingly in the same three-dimensional space. During interaction with the real-time display system, user input is therefore not confined to a touch display screen but takes place in three-dimensional space, making real-time display more diversified.
The real-time display method of the present invention is described below with a specific application example. As shown in fig. 1, the real-time display system of this embodiment may include a display control device 10, a spatial positioning subsystem 20, a user input device 30, and a negative refractive index plate 40, where the negative refractive index plate 40 is installed at a fixed angle above the display control device 10 and the spatial positioning subsystem 20 is installed in the space above the display control device 10 (for example, on the ceiling). After installation, the display control device 10 is calibrated with the spatial positioning subsystem 20. As shown in fig. 7:
the display control device 10 controls, through serial-port control commands, the signal emission sources of the coordinate axes in the spatial positioning subsystem 20 (specifically, the surface laser sources in this embodiment) to start emitting laser signals, and controls the synchronous infrared LEDs in the spatial positioning subsystem 20 to start emitting infrared synchronization signals; it also receives the time difference information (specifically, the time differences) and the key information (specifically, key information for entering the writing state or the instruction state, etc.) transmitted by the user input device 30. In this embodiment, the display control device 10 may include a control calculation unit 11 and a communication unit 12, where the control calculation unit 11 may provide the functions of display driving, position calculation for the user input device 30, and color changing for the user input device 30.
As shown in fig. 8, the user input device 30 may include keys 31, a processing unit 32, and an infrared laser detector 33. The user input device 30 may include a plurality of keys 31, such as a key for switching to the writing state, a key for switching to the instruction state, and a power key; the processing unit 32 is built into the user input device 30 (shown by a dashed box in the figure) and is used for communicating with the display control device 10 and for recording the time differences described in the above embodiments, and may specifically provide the functions of wireless communication, key processing, time difference calculation, and synchronization processing; the infrared laser detector 33 is used for detecting the detection signals and infrared synchronization signals emitted by the spatial positioning subsystem 20.
The real-time display system in this embodiment performs real-time display according to the following steps, and a flowchart is shown in fig. 9, and includes:
in step 201, after the real-time display system is turned on, the display control device 10 controls the synchronous infrared LEDs in the spatial positioning subsystem 20 to emit infrared synchronization signals, and then controls the surface laser sources of the coordinate axes to emit laser signals according to a time sequence with a total period of 32 ms, as shown in fig. 6, which is not described herein again.
In step 202, when the user operates the writing-state key 31 on the user input device 30, the user input device 30 sends the writing-state information to the display control device 10, and the user can then write with the user input device 30 in the three-dimensional space below the spatial positioning subsystem 20, for example drawing, or writing text or characters.
In step 203, after the infrared laser detector 33 in the user input device 30 detects the infrared synchronization signal, it detects the laser signals emitted by the surface laser sources X, Y, and Z in turn, and records three time differences ΔT1, ΔT2, and ΔT3, which respectively represent the time differences between the moments when the surface laser sources X, Y, and Z start to emit laser signals and the moments when the infrared laser detector 33 detects the corresponding laser signals, as shown in fig. 6.
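Assuming fixed, known start offsets of the X, Y, and Z sweeps within the 32 ms cycle (the offsets below are placeholders, not the timing of the patent's fig. 6), the recording of ΔT1 to ΔT3 relative to the synchronization pulse could be sketched as:

```python
def time_differences(sync_time_ms, detect_times_ms,
                     slot_starts_ms=(0.0, 11.0, 22.0)):
    """Return (dT1, dT2, dT3): the gap between each axis's sweep start
    and the moment its laser was detected.

    slot_starts_ms are assumed offsets of the X/Y/Z sweep starts within
    the 32 ms cycle, measured from the infrared sync pulse.
    """
    return tuple(detect - (sync_time_ms + start)
                 for detect, start in zip(detect_times_ms, slot_starts_ms))
```

Each returned difference is then converted to a rotation angle by the fixed relationship described earlier (360° per 10 ms sweep).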
Step 204, the user input device 30 sends the recorded time difference to the display control device 10, and the display control device 10 determines the position of the user input device 30 in the three-dimensional space according to the received time difference, where the specific determination method is described in the above embodiments and is not described herein again; the display control device 10 then determines the movement trajectory of the user input device 30 based on the position of the user input device 30 in the three-dimensional space at each instant.
In step 205, since the display control device 10 receives the information of the writing state before and determines that the user input device 30 is in the writing state, the display control device 10 may directly display the movement track of the user input device 30 on the display screen 13, and after passing through the lenticular lens 14 and the negative refractive index plate 40 on the display screen 13, the movement track of the user input device 30 may be displayed in the three-dimensional space where the user input device is located, and a naked-eye 3D effect may be achieved.
In step 206, when the user operates the instruction-state key 31 on the user input device 30, the user input device 30 sends the instruction-state information to the display control device 10, and the user then moves the user input device 30 in the three-dimensional space below the spatial positioning subsystem 20, mainly tracing a specific trajectory.
In step 207, the display control device 10 obtains the movement trajectory of the user input device 30 by the methods of steps 203 and 204 above, and determines that the user input device 30 is in the instruction state because it previously received the instruction-state information. The display control device 10 then obtains a specific instruction from the movement trajectory and the preset correspondence, and displays the operation result of that instruction on the display screen 13; after passing through the lenticular lens 14 on the display screen 13 and the negative refractive index plate 40, the operation result is displayed in the three-dimensional space where the user input device 30 is located, and a naked-eye 3D effect can be achieved. Step 207 is shown in the figure by the dashed arrow.
The specific instruction may include, but is not limited to, the following: rotating an image drawn by the user with the user input device 30 in the writing state so that it can be viewed through 360 degrees, or enlarging written text, and the like.
The preset correspondence may be configured by the user in the display control device 10 through the user input device 30: when the display control device 10 displays the setting interface for the correspondence, the user may input movement trajectories such as those shown in fig. 10 on the setting interface, which may include: upward, downward, leftward, rightward, diagonal, right-fold, and the like.
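One simple way to map a recorded stroke to such a label is to classify its net 2D displacement by angle; the sector thresholds below are illustrative assumptions, not the patent's matching method:

```python
import math

def classify_stroke(dx, dy):
    """Classify a net 2D displacement into a gesture label.

    Strokes within 22.5 degrees of an axis map to that axis's label;
    everything else is treated as "diagonal". These bands are assumptions.
    """
    if abs(dx) < 1e-9 and abs(dy) < 1e-9:
        return "none"
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    if angle < 22.5 or angle >= 337.5:
        return "rightward"
    if 67.5 <= angle < 112.5:
        return "upward"
    if 157.5 <= angle < 202.5:
        return "leftward"
    if 247.5 <= angle < 292.5:
        return "downward"
    return "diagonal"
```

The resulting label can then be looked up in the stored trajectory-to-instruction correspondence.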
It should be noted that, besides the above specific instructions, the following instructions may also be executed when the user input device 30 is in the instruction state: capturing specific points, grid display, orthogonal rendering, specific-object tracking, line width setting, color setting, surface coloring, and so on. For example, the display control device 10 may display brushes of different thicknesses and a palette, as shown in fig. 11a; after these are projected into the stereoscopic space through the negative refractive index plate 40, the user may use the user input device 30 to select a brush of a certain thickness, a certain color, and so on.
As another example, the display control device 10 may quickly color a closed surface drawn by the user through the user input device 30, for example in the color order "red, blue, green" as shown in fig. 11b; and the display control device 10 can automatically capture a specific point A, the intersection point of two straight lines drawn with the user input device 30.
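The capture of point A can be sketched as a standard two-line intersection in the drawing plane; the parametric form below is a generic method, not taken from the patent:

```python
def line_intersection(p1, p2, p3, p4, eps=1e-9):
    """Intersection of the infinite line through p1-p2 with the line
    through p3-p4 (2D points), or None if the lines are parallel."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < eps:
        return None  # parallel or coincident lines: no unique point
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```

A snapping feature would then move the cursor to the returned point whenever it lies within some capture radius.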
An embodiment of the present invention further provides a real-time display system, a schematic structural diagram of which is shown in fig. 12, and the real-time display system specifically includes:
the user input device 120 is configured to detect the detection signals respectively emitted by the signal emission sources of the three coordinate axes in the spatial positioning subsystem when the user input device is in the three-dimensional space below the spatial positioning subsystem.
The signal emission sources of the three coordinate axes respectively emit detection signals according to a time sequence, and the signal emission source of any coordinate axis rotates around the corresponding coordinate axis while emitting its detection signal; when the user input device is located in the plane of the detection signal emitted by the signal emission source of a certain coordinate axis, it detects the detection signal emitted by that source.
An acquiring unit 121, configured to acquire time differences between the time when the signal emitting sources of the three coordinate axes start to emit signals and the time when the user input device 120 detects detection signals, respectively.
A position determining unit 122, configured to determine a position of the user input device in a three-dimensional space according to the time difference corresponding to each of the three coordinate axes acquired by the acquiring unit 121.
If the signal emission sources of the three coordinate axes are surface laser sources of the plane where the three coordinate axes are located respectively; the position determining unit 122 is specifically configured to determine, according to the time differences respectively corresponding to the three coordinate axes, rotation angles of laser signals emitted by the surface laser sources respectively corresponding to the three coordinate axes; and determining the intersection point of the planes of the three coordinate axes corresponding to the laser signals as the position of the user input equipment in the three-dimensional space according to the rotation angle of the laser signals emitted by the surface laser source corresponding to the three coordinate axes.
A movement determining unit 123, configured to determine movement information of the user input device in the stereoscopic space according to the position of the user input device in the stereoscopic space determined by the position determining unit 122.
And a spatial display unit 124, configured to display in the stereoscopic space where the user input device is located according to the movement information determined by the movement determination unit 123.
The spatial display unit 124 is specifically configured to display on a display screen according to the movement information; the information displayed on the display screen is then displayed, through a negative refractive index plate arranged above the display screen, in the three-dimensional space where the user input device is located. Further, a lenticular lens may be arranged on the display screen. Specifically:
when the user input device is in a writing state, the spatial display unit 124 displays the movement track of the user input device on the display screen; when the user input device is in the instruction state, the spatial display unit 124 determines a specific instruction corresponding to the movement trajectory of the user input device according to a preset corresponding relationship between the movement trajectory and the specific instruction; and displaying the operation result of the determined specific instruction on the display screen.
Further, the real-time display system may further include a setting unit 125, configured to display a setting interface of the corresponding relationship, where the setting interface includes input interfaces corresponding to a plurality of specific instructions, respectively; and when a movement track is received from the input interface of any specific instruction, storing the corresponding relation between the received movement track and the corresponding specific instruction. Thus, the spatial display unit 124 determines the corresponding specific command according to the corresponding relationship set by the setting unit 125.
Thus, the user can move the user input device 120 in the three-dimensional space below the spatial positioning subsystem to input information, and the real-time display system displays the information input by the user input device 120 correspondingly in the same three-dimensional space. During interaction with the real-time display system, user input is therefore not confined to a touch display screen but takes place in three-dimensional space, making real-time display more diversified.
The present invention further provides a terminal device, a schematic structural diagram of which is shown in fig. 13. The terminal device may vary considerably in configuration and performance, and may include one or more central processing units (CPUs) 20 (e.g., one or more processors), a memory 21, and one or more storage media 22 (e.g., one or more mass storage devices) storing application programs 221 or data 222, where the memory 21 and the storage medium 22 may be transient or persistent storage. The programs stored in the storage medium 22 may include one or more modules (not shown), each of which may include a series of instruction operations for the terminal device. Still further, the central processing unit 20 may be arranged to communicate with the storage medium 22 and execute, on the terminal device, the series of instruction operations in the storage medium 22.
Specifically, the application programs 221 stored in the storage medium 22 include real-time display application programs, and the programs may include the acquiring unit 121, the position determining unit 122, the movement determining unit 123, the space displaying unit 124, and the setting unit 125 in the real-time display system, which are not described in detail herein. Further, the central processor 20 may be configured to communicate with the storage medium 22, and execute a series of operations corresponding to the application program for real-time display stored in the storage medium 22 on the terminal device.
The terminal device may also include one or more power supplies 23, one or more wired or wireless network interfaces 24, one or more input/output interfaces 25, and/or one or more operating systems 223, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
The steps executed by the display control device in the real-time display system in the above method embodiment may be based on the structure of the terminal device shown in fig. 13.
Embodiments of the present invention also provide a computer-readable storage medium, which stores a plurality of computer programs, where the computer programs are suitable for being loaded by a processor and executing the real-time display method executed by the display control device.
The embodiment of the invention also provides terminal equipment, which comprises a processor and a memory; the memory is used for storing a plurality of computer programs which are used for being loaded by the processor and executing the real-time display method executed by the display control device; the processor is configured to implement each of the plurality of computer programs.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
The real-time display method, the real-time display system, the computer-readable storage medium and the terminal device provided by the embodiments of the present invention are described in detail above, and a specific example is applied in the present disclosure to explain the principle and the implementation of the present invention, and the description of the above embodiments is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A real-time display method, comprising:
when the user input equipment is in a three-dimensional space below the space positioning subsystem, detecting detection signals respectively emitted by signal emission sources of three coordinate axes in the space positioning subsystem through the user input equipment;
respectively acquiring time differences between the moment when the signal emission sources of the three coordinate axes start to emit signals and the moment when the user input equipment detects detection signals;
determining the position of the user input equipment in the three-dimensional space according to the time difference corresponding to the three coordinate axes respectively;
determining the movement information of the user input equipment in the three-dimensional space according to the position of the user input equipment in the three-dimensional space;
displaying in the three-dimensional space according to the mobile information;
the signal emission sources of the three coordinate axes are surface laser sources of planes where the three coordinate axes are located respectively;
the determining the position of the user input device in the three-dimensional space according to the time differences corresponding to the three coordinate axes respectively specifically includes:
determining the rotation angle of the laser signal emitted by the surface laser source corresponding to the three coordinate axes according to the time difference corresponding to the three coordinate axes;
and determining the intersection point of the planes of the three coordinate axes corresponding to the laser signals as the position of the user input equipment in the three-dimensional space according to the rotation angle of the laser signals emitted by the surface laser source corresponding to the three coordinate axes.
2. The method of claim 1, wherein the signal emitting sources of the three coordinate axes respectively emit detection signals in a time sequence, and the signal emitting source of any coordinate axis rotates around the corresponding coordinate axis to emit detection signals;
when the user input equipment is positioned on the plane where the detection signals transmitted by the signal transmission sources of a certain coordinate axis are positioned, the detection signals transmitted by the signal transmission sources of the corresponding coordinate axis are detected.
3. The method according to any one of claims 1 to 2, wherein the displaying in the stereoscopic space where the user input device is located according to the movement information specifically includes:
displaying on a display screen according to the mobile information;
and information displayed on the display screen is displayed in a three-dimensional space where the user input equipment is located through a negative refractive index plate arranged above the display screen.
4. The method of claim 3, wherein a lenticular lens is disposed over the display screen.
5. The method of claim 3, wherein the movement information includes a movement trajectory of the user input device, and the displaying on the display screen according to the movement information specifically includes:
and when the user input equipment is in a writing state, displaying the moving track of the user input equipment on the display screen.
6. The method of claim 3, wherein the movement information includes a movement trajectory of the user input device, and the displaying on the display screen according to the movement information specifically includes:
when the user input equipment is in an instruction state, determining a specific instruction corresponding to the movement track of the user input equipment according to the corresponding relation between a preset movement track and the specific instruction;
and displaying the operation result of the determined specific instruction on the display screen.
7. The method of claim 6, wherein the method further comprises:
displaying a setting interface of the corresponding relation, wherein the setting interface comprises input interfaces corresponding to a plurality of specific instructions respectively;
and when a movement track is received from the input interface of any specific instruction, storing the corresponding relation between the received movement track and the corresponding specific instruction.
8. A real-time display system, comprising:
a space positioning subsystem and user input equipment, wherein the user input equipment is used for detecting, when located in a three-dimensional space below the space positioning subsystem, detection signals respectively emitted by signal emission sources of three coordinate axes in the space positioning subsystem;
the acquisition unit is used for respectively acquiring the time difference between the moment when the signal emission sources of the three coordinate axes start to emit signals and the moment when the user input equipment detects the detection signals;
the position determining unit is used for determining the position of the user input equipment in a three-dimensional space according to the time difference corresponding to the three coordinate axes respectively;
the movement determining unit is used for determining the movement information of the user input equipment in the three-dimensional space according to the position of the user input equipment in the three-dimensional space;
the space display unit is used for displaying in the three-dimensional space according to the movement information;
the signal emission sources of the three coordinate axes are surface laser sources of planes where the three coordinate axes are located respectively; the position determining unit is used for determining the rotation angle of the laser signal emitted by the surface laser source corresponding to the three coordinate axes according to the time difference corresponding to the three coordinate axes; and determining the intersection point of the planes of the three coordinate axes corresponding to the laser signals as the position of the user input equipment in the three-dimensional space according to the rotation angle of the laser signals emitted by the surface laser source corresponding to the three coordinate axes.
9. A computer-readable storage medium storing a plurality of computer programs, the computer programs being loaded by a processor and executing the real-time display method according to any one of claims 1 to 7.
10. A terminal device comprising a processor and a memory;
the memory is used for storing a plurality of computer programs, and the computer programs are used for being loaded by the processor and executing the real-time display method according to any one of claims 1 to 7; the processor is configured to implement each of the plurality of computer programs.
CN202010512742.8A 2020-06-08 2020-06-08 Real-time display method and system, computer readable storage medium and terminal equipment Active CN111782063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010512742.8A CN111782063B (en) 2020-06-08 2020-06-08 Real-time display method and system, computer readable storage medium and terminal equipment

Publications (2)

Publication Number Publication Date
CN111782063A CN111782063A (en) 2020-10-16
CN111782063B true CN111782063B (en) 2021-08-31

Family

ID=72753733

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010512742.8A Active CN111782063B (en) 2020-06-08 2020-06-08 Real-time display method and system, computer readable storage medium and terminal equipment

Country Status (1)

Country Link
CN (1) CN111782063B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916141A (en) * 2010-08-27 2010-12-15 林民东 Interactive input device and method based on space orientation technique
CN103838376A (en) * 2014-03-03 2014-06-04 深圳超多维光电子有限公司 3D interactive method and 3D interactive system
CN106919294A (en) * 2017-03-10 2017-07-04 京东方科技集团股份有限公司 A kind of 3D touch-controls interactive device, its touch-control exchange method and display device
CN107589884A (en) * 2017-07-18 2018-01-16 朱小军 A kind of 3D stereoscopic displays exchange method and intelligent mobile terminal
CN107831558A (en) * 2017-12-09 2018-03-23 安徽省东超科技有限公司 Multiple rows of multiple row equivalent negative refractive index flat plate lens
CN107884940A (en) * 2017-11-28 2018-04-06 腾讯科技(深圳)有限公司 Display module, head-mounted display apparatus and image stereo display method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10254855B2 (en) * 2013-06-04 2019-04-09 Wen-Chieh Geoffrey Lee High resolution and high sensitivity three-dimensional (3D) cursor maneuvering device
CN104181547B (en) * 2014-08-26 2016-08-24 西安交通大学 A three-dimensional laser imaging system based on an array detection unit, and an imaging method
CN107238842B (en) * 2016-03-29 2020-06-16 中国人民解放军92232部队 Area array target searching, scanning and imaging device and method
CN210638697U (en) * 2019-06-18 2020-05-29 易视智瞳科技(深圳)有限公司 Positioning measurement device based on 3D coaxial vision sensor

Also Published As

Publication number Publication date
CN111782063A (en) 2020-10-16

Similar Documents

Publication Publication Date Title
US11887312B2 (en) Fiducial marker patterns, their automatic detection in images, and applications thereof
WO2019123729A1 (en) Image processing device, image processing method, and program
CN108139803A Method and system for automatic calibration of dynamic display configurations
US20100315414A1 (en) Display of 3-dimensional objects
US10324736B2 (en) Transitioning between 2D and stereoscopic 3D webpage presentation
US11375559B2 (en) Communication connection method, terminal device and wireless communication system
US20190318201A1 (en) Methods and systems for shape based training for an object detection algorithm
US10321126B2 (en) User input device camera
US10257500B2 (en) Stereoscopic 3D webpage overlay
Li et al. Enhancing 3D applications using stereoscopic 3D and motion parallax
US10803652B2 (en) Image generating apparatus, image generating method, and program for displaying fixation point objects in a virtual space
US11057612B1 Generating composite stereoscopic images using visually-demarked regions of surfaces
CN111782063B (en) Real-time display method and system, computer readable storage medium and terminal equipment
US20220075477A1 (en) Systems and/or methods for parallax correction in large area transparent touch interfaces
US10642349B2 (en) Information processing apparatus
RU2695053C1 (en) Method and device for control of three-dimensional objects in virtual space
US20190089899A1 (en) Image processing device
TWI759764B Method for superimposing a virtual object based on an optical communication device, electronic apparatus, and computer-readable storage medium
WO2020244576A1 (en) Method for superimposing virtual object on the basis of optical communication apparatus, and corresponding electronic device
CN112053444A (en) Method for superimposing virtual objects based on optical communication means and corresponding electronic device
CN116866541A (en) Virtual-real combined real-time video interaction system and method
CN112667079A (en) Virtual reality equipment and reverse prompt picture display method
CN117252919A Method, apparatus, device, medium and program product for fully automatic room calibration
CN117716419A (en) Image display system and image display method
JP2012079211A (en) Display system and display method of stereoscopic image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40030646)
GR01 Patent grant