CN114374831A - Interactive system - Google Patents


Info

Publication number
CN114374831A
Authority
CN
China
Prior art keywords
image information
dimensional
image display
display device
interactive system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011103117.4A
Other languages
Chinese (zh)
Inventor
杨钧翔
丁志宏
张凯杰
余德彰
林文绮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mirage Start Co ltd
Original Assignee
Mirage Start Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mirage Start Co ltd filed Critical Mirage Start Co ltd
Priority to CN202011103117.4A
Publication of CN114374831A
Legal status: Pending


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

Abstract

The invention discloses an interactive system. The interactive system comprises an electronic device, a first image display device, and a second image display device. The electronic device transmits first image information to the first image display device for display, and transmits second image information to the second image display device for display. The second image display device displays the second image information three-dimensionally in a space on one side of the second image display device. The three-dimensional second image information interacts with at least one object.

Description

Interactive system
Technical Field
The present invention relates to an interactive system, and more particularly, to an interactive system for directly interacting with three-dimensional image information.
Background
In the current field of augmented reality (AR) or virtual reality (VR), most users must interact with electronic devices through wearable or body-sensing equipment, rather than directly using props or their hands to interact with three-dimensional image information.
Therefore, how to provide an interactive system that allows a user to directly interact with three-dimensional image information has become one of the important issues for the industry to solve.
Disclosure of Invention
To overcome the drawbacks of the prior art, the present invention provides an interactive system including an electronic device, a first image display device, and a second image display device. The electronic device includes a controller, a storage module, and a connection module. The first image display device is electrically connected to the electronic device; it is a two-dimensional image display device for displaying two-dimensional image information. The second image display device is connected to the electronic device and is a three-dimensional image display device. The electronic device is provided with a program that provides at least first image information and second image information. The controller transmits the first image information to the first image display device for display, and transmits the second image information to the second image display device for display. The second image display device displays the second image information as three-dimensional image information in a space on one side of the second image display device; the three-dimensional second image information interacts with at least one object.
Preferably, the interactive system further comprises a sensor disposed on the electronic device, the first image display device, or the second image display device, for detecting at least one object three-dimensional coordinate value of the object. The sensor transmits the at least one object three-dimensional coordinate value to the controller, and the controller determines a plurality of second three-dimensional coordinate values of the three-dimensional second image information according to the at least one object three-dimensional coordinate value and a plurality of first three-dimensional coordinate values of the three-dimensional second image information.
Preferably, the first image display device is disposed at one side of the electronic apparatus, and the second image display device is disposed on the electronic apparatus.
Preferably, the second image display device includes a flat display module, a lens array layer, and a microstructure layer. The flat display module has a display surface and provides at least one piece of image information, which is displayed three-dimensionally through the lens array layer and the microstructure layer. The lens array layer is disposed on the display surface of the flat display module and receives and adjusts the light of the image information; it includes a plurality of lenses and is used to modulate the light field. The microstructure layer is disposed on the lens array layer, includes a plurality of microstructures, and is used to modulate the angular direction of light. The microstructure layer receives and adjusts the light passing through the lens array layer so that the second image display device displays three-dimensional image information floating in the space on one side of the second image display device.
Preferably, the lenses of the lens array layer conform to the lensmaker's formula: 1/f = (n-1)(1/R1 + 1/R2), where R1 and R2 are the radii of curvature of the two sides of the lens, f is the lens focal length, and n is the lens refractive index.
Preferably, the microstructure layer is a dynamic optical component layer, the microstructure layer has a microstructure functional mode and a microstructure-free functional mode, and when the microstructure layer is switched to the microstructure functional mode, the microstructure layer can be used for modulating the angle direction of light.
Preferably, the microstructure layer is a liquid crystal lens array layer including a plurality of microstructure liquid crystal lenses, and the lens array layer is a liquid crystal lens array layer including a plurality of optically tuned liquid crystal lenses.
Preferably, the lens array layer comprises a first structured lens array layer and a second structured lens array layer, the first structured lens array layer has lens structures and can be used for regulating and controlling the light field, and the second structured lens array layer does not have lens structures.
Preferably, the second image display device further includes an electron beam sensor, and the electronic device or the second image display device provides an electron beam to the object according to the interaction condition between the three-dimensional second image information and the object.
Preferably, the electronic device or the second image display device provides a plurality of control signals to an electronic interactive glove of the object, so as to enable a plurality of tactile feedback components of the electronic interactive glove to act.
The invention also discloses an interactive system comprising an electronic device, a three-dimensional image display device, and a sensor. The electronic device comprises a controller, a storage module, and a connection module; the connection module of the electronic device is connected to the three-dimensional image display device. The sensor is disposed on the electronic device or the three-dimensional image display device and detects a plurality of object three-dimensional coordinate values of an object. The electronic device is provided with a program, which is stored in the storage module and provides at least one piece of image information. The controller transmits the image information to the three-dimensional image display device for display, and the three-dimensional image display device displays the image information three-dimensionally in a space on one side of the three-dimensional image display device; the three-dimensional image information interacts with at least the object. The sensor transmits the plurality of object three-dimensional coordinate values to the controller, and the controller determines a plurality of second three-dimensional coordinate values of the three-dimensional image information according to the plurality of object three-dimensional coordinate values and a plurality of first three-dimensional coordinate values of the three-dimensional image information. The second three-dimensional coordinate values of the three-dimensional image information determine an action when the three-dimensional image information interacts with the object.
Preferably, the three-dimensional image display apparatus further includes an electron beam sensor, and the electronic device or the three-dimensional image display apparatus provides an electron beam to the object according to the interaction condition between the three-dimensional image information and the object.
Preferably, the electronic device or the three-dimensional image display device provides a plurality of control signals to an electronic interaction device disposed on the object, so as to actuate a plurality of tactile feedback components of the electronic interaction device.
The interactive system provided by the invention has the beneficial effect that the user can intuitively interact with the three-dimensional image information to produce corresponding actions and changes, without needing to wear any type of augmented reality electronic device.
For a better understanding of the features and technical content of the present invention, reference should be made to the following detailed description of the invention and accompanying drawings, which are provided for purposes of illustration and description only and are not intended to limit the invention.
Drawings
Fig. 1 is a schematic diagram of an interactive system according to a first embodiment of the present invention.
Fig. 2A is a functional block diagram of an interactive system according to a first embodiment of the invention.
Fig. 2B is another functional block diagram of the interactive system according to the first embodiment of the present invention.
Fig. 3 is another schematic diagram of the interactive system according to the first embodiment of the invention.
Fig. 4 is a schematic configuration diagram of a second image display apparatus of the present invention.
Fig. 5 is a schematic diagram of the second image display device displaying a second image information.
FIG. 6 is a schematic diagram of second image information interacting with an object.
FIG. 7 is another schematic diagram of second image information interacting with an object.
FIG. 8 is another schematic diagram of the second image information interacting with an object.
FIG. 9 is another illustration of the second image information interacting with an object.
FIG. 10 is a diagram illustrating a second image information displayed according to a first entity and a second entity.
FIG. 11 is a diagram illustrating a second image information displayed according to a first entity and a second entity.
Fig. 12 is a schematic diagram of an interactive system according to a second embodiment of the present invention.
FIG. 13 is another schematic diagram of an interactive system according to a second embodiment of the present invention.
FIG. 14 is another schematic diagram of an interactive system according to a second embodiment of the present invention.
FIG. 15 is another schematic diagram of an interactive system according to a second embodiment of the present invention.
Fig. 16 is a schematic diagram of an interactive system according to a third embodiment of the present invention.
FIG. 17 is another schematic diagram of an interactive system according to a third embodiment of the present invention.
Detailed Description
The following describes the embodiments of the "interactive system" disclosed herein through specific examples, and those skilled in the art can understand the advantages and effects of the present disclosure from this specification. The invention is capable of other and different embodiments, and its several details may be modified and changed in various ways, all without departing from the spirit and scope of the present invention. The drawings of the present invention are for illustrative purposes only and are not drawn to scale. The following embodiments further explain the related technical content of the present invention in detail, but the disclosure is not intended to limit the scope of the invention. In addition, the term "or" as used herein should be taken to include any one or any combination of the associated listed items, as the case may be.
[ first embodiment ]
Referring to fig. 1, fig. 2A, fig. 2B, and fig. 3, fig. 1 is a schematic diagram of an interactive system according to a first embodiment of the invention. Fig. 2A is a functional block diagram of an interactive system according to a first embodiment of the invention. Fig. 2B is another functional block diagram of the interactive system according to the first embodiment of the present invention. Fig. 3 is another schematic diagram of the interactive system according to the first embodiment of the invention.
Referring to fig. 1, the interactive system SYS1 includes an electronic apparatus 1, a first image display device 2 and a second image display device 3. The electronic device 1 is a computer system, and includes a controller 11, a storage module 12, and a connection module 13.
The first image display device 2 is electrically connected to the electronic apparatus 1. The first image display device 2 is a two-dimensional image display device for displaying a two-dimensional image information. The second image display device 3 is connected to the electronic apparatus 1. The second image display device 3 is a three-dimensional image display device. In this embodiment, the second image display device 3 may be connected to the electronic apparatus 1 through a wired communication method or a wireless communication method. The second image display device 3 can be communicatively connected to the electronic apparatus 1 by a Wi-Fi communication unit, a bluetooth communication unit, a Zigbee communication unit (Zigbee), a LoRa communication unit, a Sigfox communication unit, or an NB-IoT communication unit.
The electronic device 1 further comprises a communication module 14. The electronic apparatus 1 may connect the first image display device 2 and the second image display device 3 using the communication module 14. The communication module 14 may be a wired communication module or a wireless communication module. When the communication module 14 is a wireless communication module, the communication module 14 may be a Wi-Fi communication module, a bluetooth communication module, a Zigbee communication module (Zigbee), a LoRa communication module, a Sigfox communication module, or an NB-IoT communication module.
The controller 11 is a central processing unit (CPU), a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), or a microcontroller (MCU). The storage module 12 is a flash memory, a read-only memory, a programmable read-only memory, an electrically alterable read-only memory, an erasable programmable read-only memory, or an electrically erasable programmable read-only memory.
The electronic device 1 is provided with a program, which provides at least first image information G1 and second image information G2. The program may also be provided to the electronic device 1 by a cloud server. The controller 11 transmits the first image information G1 to the first image display device 2 for display, and transmits the second image information G2 to the second image display device 3 for display. The second image display device 3 displays the second image information G2 as three-dimensional second image information G2 in a space on one side of the second image display device 3. The three-dimensional second image information G2 interacts with at least one object, for example: a user's hand, finger, or palm, a tool, a toy, or a prop. In other embodiments, the first image display device 2 may also be used to display three-dimensional image information.
In addition, the interactive system SYS1 also includes a sensor 7. In the present embodiment, the sensor 7 is disposed on the electronic device 1, the first image display device 2, or the second image display device 3, and is used for detecting the three-dimensional coordinate values of an object. In this embodiment, the sensor 7 is an image capture unit (e.g., a CCD). The sensor 7 may also include a gaze-tracking function to track the gaze of a human or animal.
The sensor 7 transmits at least one three-dimensional coordinate value of the object to the controller 11. The controller 11 determines a plurality of second three-dimensional coordinate values of the three-dimensional second image information G2 based on at least one object three-dimensional coordinate value and a plurality of first three-dimensional coordinate values of the three-dimensional second image information G2. That is, the sensor 7 detects the three-dimensional coordinate values of the object and compares the detected three-dimensional coordinate values with the first three-dimensional coordinate values of the three-dimensional second image information G2 to determine the second three-dimensional coordinate values of the three-dimensional second image information G2. At this time, the plurality of first three-dimensional coordinate values of the three-dimensional second image information G2 are the same as or different from the plurality of second three-dimensional coordinate values of the three-dimensional second image information G2.
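The comparison step described above can be sketched in code. This is a minimal illustration, not the patent's implementation: the displacement rule, function name, and threshold are all hypothetical stand-ins for "determine the second three-dimensional coordinate values from the object's coordinates and the first coordinates".

```python
import math

def update_image_coords(object_coords, image_coords, threshold=1.0):
    """Compare detected object coordinate values with the image information's
    first three-dimensional coordinate values; if the object comes within
    `threshold` of an image point, push that point directly away from the
    object (an illustrative interaction rule). Returns the second
    three-dimensional coordinate values."""
    updated = []
    for (ix, iy, iz) in image_coords:
        moved = (ix, iy, iz)
        for (ox, oy, oz) in object_coords:
            d = math.dist((ix, iy, iz), (ox, oy, oz))
            if 0.0 < d < threshold:
                # Displace the image point along the object-to-point direction
                scale = (threshold - d) / d
                moved = (ix + (ix - ox) * scale,
                         iy + (iy - oy) * scale,
                         iz + (iz - oz) * scale)
        updated.append(moved)
    return updated
```

When the returned coordinates differ from the first coordinates, the system would drive the image information to perform a different action, as the text describes.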
When the plurality of first three-dimensional coordinate values of the three-dimensional second image information G2 are different from the plurality of second three-dimensional coordinate values of the three-dimensional second image information G2, the three-dimensional second image information G2 performs a different operation from the original three-dimensional second image information G2.
As shown in fig. 1, the first image display device 2 and the second image display device 3 are provided separately from the electronic device 1. In fig. 1, the electronic device 1 is a desktop computer. The first image display device 2 is a liquid crystal display, a micro LED display, or an organic light-emitting diode (OLED) display.
Referring to fig. 2B, in another embodiment, the interactive system SYS2 may further include a microphone MP, a speaker SP, and a connection device CON. The controller 11 of the electronic device 1 is electrically connected to the storage module 12, the connection module 13 and the communication module 14. The connection module 13 of the electronic device 1 is connected to the connection device CON. The connecting means CON connects the microphone MP, the speaker SP, the second image display device 3 and the sensor 7.
In the present embodiment, the connection device CON, the microphone MP, the speaker SP, the second image display apparatus 3, and the sensor 7 may be provided in the same device.
The microphone MP is used for receiving audio information. When the microphone MP receives control audio information, it transmits the control audio information through the connection device CON and the connection module 13 to the controller 11 for analysis. The controller 11 then analyzes the control command in the control audio information, e.g., spin, sit down, or jump, and controls the action of the three-dimensional second image information G2 displayed on the second image display device 3 according to the control command. In this way, the user can interact with the three-dimensional second image information G2 through sound. In addition, the controller 11 can transmit interactive audio information corresponding to the three-dimensional second image information G2 to the speaker SP for playback, improving the overall interactive experience.
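The command-to-action step can be pictured as a simple lookup from a recognized control phrase to an animation of the displayed image information. All names below (the table, the action strings, the function) are illustrative assumptions, not APIs from the patent; speech recognition itself is assumed to have already produced the text.

```python
# Hypothetical mapping from recognized control phrases (the patent's
# examples: spin, sit down, jump) to actions on the three-dimensional
# second image information.
COMMAND_ACTIONS = {
    "spin": "play_spin_animation",
    "sit down": "play_sit_animation",
    "jump": "play_jump_animation",
}

def handle_control_audio(recognized_text):
    """Return the action for a recognized control phrase, or None when
    the phrase is not a known command."""
    return COMMAND_ACTIONS.get(recognized_text.strip().lower())
```

The controller would then drive the second image display device with the returned action, and send matching interactive audio to the speaker.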
As shown in fig. 3, the first image display device 2 is disposed on one side of the electronic apparatus 1, and the second image display device 3 is disposed on the electronic apparatus 1. In fig. 3, the electronic device 1 is a notebook computer. That is, the electronic apparatus 1 is a dual-screen notebook computer.
In the present embodiment, the first image information G1 and the second image information G2 of the interactive system SYS1 are two pieces of image information that can be stitched together, such as maps of different areas. Alternatively, the first image information G1 and the second image information G2 may merely be two pieces of image information related in content; for example, in fig. 3 the second image information G2 is a virtual keyboard and the first image information G1 is an image of jumping musical notes. In another example, the second image information G2 may be a pinball and the first image information G1 the bounce path of a pinball table.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a second image display device according to the present invention.
The second image display device 3 includes a flat display module 31, a lens array layer 32 and a microstructure layer 33.
The flat display module 31 has a display surface 311. The flat panel display module 31 is used for providing two-dimensional second image information G2, and displaying three-dimensional second image information G2 through the lens array layer 32 and the microstructure layer 33.
The lens array layer 32 is disposed on the display surface 311 of the flat display module 31, receives the light of the two-dimensional second image information G2, and adjusts the light, and the lens array layer 32 includes a plurality of lenses 321. The lens array layer 32 is used to modulate the light field.
The microstructure layer 33 is disposed on the lens array layer 32. The microstructure layer 33 includes a plurality of microstructures. The microstructure layer 33 is used to modulate the angular direction of light. The microstructure layer 33 receives the light penetrating the lens array layer 32 to adjust so that the second image display device 3 displays three-dimensional image information floating in a space on the side of the second image display device 3.
The lenses 321 of the lens array layer 32 conform to the lensmaker's formula: 1/f = (n-1)(1/R1 + 1/R2), where R1 and R2 are the radii of curvature of the two sides of the lens 321, f is the lens focal length, and n is the lens refractive index.
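As a worked example of the formula in the exact form the text gives it (note that this form adds the two reciprocal radii; the common thin-lens convention instead subtracts signed radii), the focal length of one lens 321 can be computed as follows. The function name and sample values are illustrative only.

```python
def lens_focal_length(n, r1, r2):
    """Lensmaker's formula as stated in the text: 1/f = (n - 1)(1/R1 + 1/R2).
    n is the lens refractive index; r1, r2 are the radii of curvature of the
    two sides of the lens. Returns the focal length f in the same length
    unit as r1 and r2."""
    inv_f = (n - 1.0) * (1.0 / r1 + 1.0 / r2)
    return 1.0 / inv_f

# Example: n = 1.5, R1 = R2 = 100 mm gives 1/f = 0.5 * 0.02 = 0.01,
# so f = 100 mm.
```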
In addition, the microstructure layer 33 is a dynamic optical element layer. That is, the microstructure layer 33 has a microstructure functional mode and a microstructure-free functional mode, and can be switched between the two according to a control signal. When the microstructure layer 33 is switched to the microstructure functional mode, it can be used to modulate the angular direction of light. When the microstructure layer 33 is switched to the microstructure-free functional mode, the second image display device 3 displays two-dimensional image information rather than three-dimensional image information floating in the space on one side of the second image display device 3. The microstructure layer 33 is a liquid crystal lens array layer including a plurality of microstructure liquid crystal lenses, and the lens array layer is likewise a liquid crystal lens array layer including a plurality of optical-adjustment liquid crystal lenses. That is, the second image display device 3 is a display device including at least two liquid crystal lens array layers; in other embodiments, the second image display device 3 may include more liquid crystal lens array layers for light adjustment, and different numbers of such layers may be provided in different areas for light adjustment. In addition, the lens array layer 32 includes a first structural lens array layer (not shown) and a second structural lens array layer (not shown); the first structural lens array layer has lens structures that can be used to modulate the light field, and the second structural lens array layer does not have lens structures.
Referring to fig. 5, fig. 6 and fig. 7, fig. 5 is a schematic diagram of the second image display device displaying a second image information. FIG. 6 is a schematic diagram of second image information interacting with an object. FIG. 7 is another schematic diagram of second image information interacting with an object.
The three-dimensional second image information G2 is a pet dog. The plurality of first three-dimensional coordinate values G2C1-G2C3 of the second image information G2 in fig. 6 may be detected by the sensor 7 or directly obtained from the data of the second image information G2 by the controller 11.
The plurality of three-dimensional coordinate values FC1 to FC3 of the finger in fig. 6 may be detected by the sensor 7. Further, the plurality of three-dimensional coordinate values FC1-FC3 of the finger are a plurality of three-dimensional coordinate values over a period of time.
The controller 11 of the electronic device 1 can determine the second three-dimensional coordinate values G2C1'-G2C3' of the second image information G2, and further determine the action or deformation state of the second image information G2, based on the plurality of three-dimensional coordinate values FC1-FC3 of the finger and the plurality of first three-dimensional coordinate values G2C1-G2C3 of the second image information G2.
Referring to fig. 8 and 9, fig. 8 and 9 are schematic diagrams illustrating interaction between second image information and an object.
The three-dimensional second image information G2-1-G2-3 is a set of mole images (as in a whack-a-mole game), and the user can hit the plurality of three-dimensional second image information G2-1-G2-3 with a prop Tool. The controller 11 of the electronic device 1 may then perform a calculation according to the three-dimensional coordinate values TC1-TC3 of the Tool and the first three-dimensional coordinate values G2-1C1-G2-1C3 of the three-dimensional second image information G2-1 to determine the action of the second image information G2-1 and its second three-dimensional coordinate values G2-1C1'-G2-1C3'.
Referring to fig. 10 and 11, fig. 10 is a schematic diagram illustrating second image information displayed according to a first entity and a second entity. FIG. 11 is a diagram illustrating a second image information displayed according to a first entity and a second entity.
In fig. 10 and 11, when the first entity SB1 and the second entity SB2 are placed on the second image display device 3, the second image information G2 is displayed above the second image display device 3. That is, the second image information G2 is associated with the first entity SB1 and the second entity SB2. The sensor 7 can detect the placement positions of the first entity SB1 and the second entity SB2 to notify the electronic device 1 or the second image display device 3 to provide the second image information G2. The electronic device 1 or the second image display device 3 may detect the electronic tags RD1-RD3 of the first entity SB1 and the second entity SB2 by means of radio-frequency identification (RFID), light sensing, capacitive sensing, voltage sensing, radar sensing, and the like. Accordingly, the sensor 7 may also be an electronic tag sensor, a light sensor, a capacitance sensor, a voltage sensor, or a radar sensor.
Moreover, the number and form of the first entities SB1 and the second entities SB2 can be adjusted according to actual requirements, and are not limited in the present invention. In addition, different second image information G2 may be displayed according to the different placement positions of the first entity SB1 and the second entity SB 2. Furthermore, the electronic apparatus 1 or the second image display device 3 may control the second image information G2 to be active in the first active area AA1 or other active areas. That is, the second image information G2 may interact with the first entity SB1 and the second entity SB2 in a story-wise manner.
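The tag-driven selection described above can be sketched as a lookup from the set of detected entity tags to the image information to display. The scene names and table below are purely hypothetical; only the tag IDs RD1-RD3 come from the text, and the rule that different placements yield different second image information is modeled here only at the level of which tags are present.

```python
# Hypothetical table: which second image information to float above the
# display for a given combination of detected entity tags.
SCENE_TABLE = {
    frozenset({"RD1", "RD2"}): "combined_scene",
    frozenset({"RD1"}): "entity1_scene",
    frozenset({"RD2"}): "entity2_scene",
}

def select_image_info(detected_tags):
    """Choose the image information based on the set of entity tags the
    sensor reported; order of detection does not matter. Returns None
    when no scene matches."""
    return SCENE_TABLE.get(frozenset(detected_tags))
```

A fuller model would key the table on placement positions as well, since the text notes that different placements of SB1 and SB2 can produce different second image information G2.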
[ second embodiment ]
Referring to fig. 12, fig. 12 is a schematic diagram of an interactive system according to a second embodiment of the invention. The interactive system SYS2 may comprise only the electronic device 1 and the three-dimensional image display device 3'. In the present embodiment, the structure and function of the electronic device 1 and the three-dimensional image display device 3' are the same as those of the electronic device 1 and the second image display device 3 in the previous embodiment, and are not repeated herein.
The electronic device 1 can provide image information to the three-dimensional image display apparatus 3' to display the three-dimensional image information, and interact with the displayed floating three-dimensional image information by using an object. Similarly, the interactive system SYS2 further includes a sensor 7, which may be disposed on the electronic apparatus 1 or the three-dimensional image display device 3'.
Similarly, the interactive system SYS2 may further include a microphone MP and a speaker SP. The connection module 13 of the electronic apparatus 1 connects to the microphone MP, the speaker SP, the three-dimensional image display device 3', and the sensor 7.
Similarly, in the present embodiment, the microphone MP, the speaker SP, the three-dimensional image display device 3', and the sensor 7 may be provided in the same apparatus.
The microphone MP is used for receiving audio information. When the microphone MP receives control audio information, it transmits the control audio information to the controller 11 through the connection device CON and the connection module 13 for analysis. The controller 11 further analyzes the control request signal in the control audio information, e.g., spin, sit down, or jump. Then, the controller 11 controls the action of the three-dimensional second image information G2 displayed by the three-dimensional image display device 3' according to the control request signal. In this way, the user can interact with the three-dimensional second image information G2 through sound. In addition, the controller 11 can transmit interactive audio information corresponding to the three-dimensional second image information G2 to the speaker SP for playback, thereby improving the overall interactive experience of the user.
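By way of a non-limiting illustration, the controller's extraction of a control request signal ("spin", "sit down", "jump") from the control audio information could be sketched as a phrase-to-action lookup. Speech-to-text is assumed to happen upstream, and the command table and function below are hypothetical:

```python
# Hypothetical mapping from recognized spoken phrases to control request
# signals for the displayed 3D image; phrases and action names are
# illustrative, not defined by the patent.
COMMANDS = {"spin": "ROTATE", "sit down": "SIT", "jump": "JUMP"}

def parse_control_audio(transcript):
    """Return the control request signal found in a speech transcript,
    or None when no known command phrase is present."""
    text = transcript.lower().strip()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None
```

The returned action would then drive the corresponding animation of the three-dimensional image information.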
Referring to fig. 13, 14 and 15, fig. 13 is another schematic diagram of an interactive system according to a second embodiment of the invention. FIG. 14 is another schematic diagram of an interactive system according to a second embodiment of the present invention. FIG. 15 is another schematic diagram of an interactive system according to a second embodiment of the present invention.
The first electronic device ED1 of FIG. 13 is a notebook computer. The second electronic device ED2 of fig. 14 is a smart phone. The third electronic device ED3 of fig. 15 is a tablet computer. That is, the first electronic device ED1, the second electronic device ED2, and the third electronic device ED3, which are different in fig. 13 to 15, may be connected to the three-dimensional image display apparatus 3' of the present invention to display three-dimensional image information.
In other embodiments, the electronic devices ED1-ED3 may be disposed in the same housing as the three-dimensional image display apparatus 3', or integrated together as a single electronic device; the present invention is not limited thereto.
[ third embodiment ]
Referring to fig. 16 and 17, fig. 16 is a schematic diagram of an interactive system according to a third embodiment of the invention. FIG. 17 is another schematic diagram of an interactive system according to a third embodiment of the present invention.
Referring to fig. 16, the three-dimensional image display device 3' may further include an electron beam sensor 37' that emits an electron beam toward the user's hand when the hand interacts with the floating three-dimensional image information, so as to provide feedback to the user's hand.
Referring to fig. 17, the user may wear an electronic interactive glove ECL including a plurality of tactile sensing elements (not shown). The electronic interactive glove ECL is communicatively connected to the three-dimensional image display device 3' and may receive control signals from the three-dimensional image display device 3', the electronic device ED3, or a server (not shown).
That is, when the user wears the electronic interactive glove ECL to interact with the floating three-dimensional image information displayed by the three-dimensional image display device 3', the electronic device ED3 or the server (not shown) can provide a plurality of control signals to a plurality of haptic feedback elements (not shown) of the electronic interactive glove ECL according to the state of the user's interaction with the floating three-dimensional image information, so as to provide a plurality of different tactile sensations to the user. In this embodiment, the haptic feedback elements (not shown) are piezoelectric elements.
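As a non-limiting illustration, the haptic feedback described above could be driven by mapping each fingertip's penetration depth into the floating image to a drive level for the corresponding piezoelectric element. The depth scale and clamping rule below are illustrative assumptions:

```python
def haptic_levels(finger_depths_mm, max_depth_mm=10.0):
    """Illustrative sketch: scale each fingertip's penetration depth into
    the floating 3D image to a 0.0-1.0 drive level for its piezoelectric
    haptic feedback element, clamped at full drive."""
    levels = []
    for depth in finger_depths_mm:
        # Deeper penetration yields stronger tactile feedback, up to 1.0.
        levels.append(max(0.0, min(depth / max_depth_mm, 1.0)))
    return levels
```

A finger that has not reached the image surface receives no feedback, a half-depth touch receives half drive, and deeper contact saturates at full drive, yielding the "plurality of different tactile sensations" described above.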
[ advantageous effects of the embodiments ]
The interactive system provided by the invention has the beneficial effect that the user can intuitively interact with the three-dimensional image information to produce corresponding actions and changes, without wearing any type of augmented-reality electronic device.
The disclosure is only a preferred embodiment of the invention and should not be taken as limiting the scope of the invention, which is defined by the appended claims.

Claims (13)

1. An interactive system, comprising:
an electronic device, including a controller, a storage module, and a connection module;
a first image display device electrically connected to the electronic device, the first image display device being a two-dimensional image display device for displaying two-dimensional image information; and
a second image display device connected to the electronic device, the second image display device being a three-dimensional image display device;
wherein the electronic device is provided with a program that provides at least first image information and second image information, the controller transmits the first image information to the first image display device for display, the controller transmits the second image information to the second image display device for display, and the second image information is displayed, through the second image display device, as three-dimensional image information in a space on one side of the second image display device;
and wherein the three-dimensional second image information interacts with at least one object.
2. The interactive system of claim 1, further comprising:
a sensor arranged on the electronic device, the first image display device, or the second image display device, for detecting at least one object three-dimensional coordinate value of the object, wherein the sensor transmits the at least one object three-dimensional coordinate value to the controller, and the controller determines a plurality of second three-dimensional coordinate values of the three-dimensional second image information according to the at least one object three-dimensional coordinate value and a plurality of first three-dimensional coordinate values of the three-dimensional second image information.
3. The interactive system of claim 2, wherein the first image display device is disposed on a side of the electronic apparatus and the second image display device is disposed on the electronic apparatus.
4. The interactive system of claim 3, wherein the second image display device comprises:
a flat display module;
a lens array layer; and
a microstructure layer;
wherein the planar display module has a display surface and provides at least one piece of image information to the lens array layer and the microstructure layer for display; the lens array layer is arranged on the display surface of the planar display module and receives and adjusts the light of the image information, the lens array layer including a plurality of lenses for regulating a light field; the microstructure layer is arranged on the lens array layer and includes a plurality of microstructures for modulating the angular direction of light; and the microstructure layer receives and adjusts the light penetrating the lens array layer, so that the second image display device displays three-dimensional image information floating in the space on one side of the second image display device.
5. The interactive system of claim 4, wherein the lenses of the lens array layer conform to the lensmaker's formula: 1/f = (n-1)(1/R1 + 1/R2), where R1 and R2 are the radii of curvature of the two sides of the lens, f is the focal length of the lens, and n is the refractive index of the lens.
6. The interactive system of claim 5, wherein the microstructured layer is a dynamic optical element layer, the microstructured layer having a microstructured functional mode and a non-microstructured functional mode, the microstructured layer being operable to modulate the angular direction of light when switched to the microstructured functional mode.
7. The interactive system of claim 6, wherein the micro-structured layer is a liquid crystal lens array layer comprising a plurality of micro-structured liquid crystal lenses, and the lens array layer is a liquid crystal lens array layer comprising a plurality of optically tuned liquid crystal lenses.
8. The interactive system of claim 7, wherein the lens array layer comprises a first structured lens array layer having lens structures for manipulating the light field and a second structured lens array layer having no lens structures.
9. The interactive system as claimed in claim 1, wherein the second image display device further comprises an electron beam sensor, and the electronic device or the second image display device provides an electron beam to the object according to the interaction condition of the three-dimensional second image information and the object.
10. The interactive system of claim 1, wherein the electronic device or the second image display device provides control signals to an electronic interactive glove of the object to actuate haptic feedback elements of the electronic interactive glove.
11. An interactive system, comprising:
an electronic device, including a controller, a storage module and a connection module;
a three-dimensional image display device connected to the connection module of the electronic device; and
a sensor arranged on the electronic device or the three-dimensional image display device, for detecting a plurality of object three-dimensional coordinate values of an object;
wherein the electronic device is provided with a program that is stored in the storage module and provides at least image information, the controller transmits the image information to the three-dimensional image display device for display, and the image information is displayed, through the three-dimensional image display device, as three-dimensional image information in a space on one side of the three-dimensional image display device;
wherein the three-dimensional image information interacts with at least the object;
wherein the sensor transmits the plurality of object three-dimensional coordinate values to the controller, and the controller determines a plurality of second three-dimensional coordinate values of the three-dimensional image information according to the plurality of object three-dimensional coordinate values and a plurality of first three-dimensional coordinate values of the three-dimensional image information of the image information;
wherein the second three-dimensional coordinate values of the three-dimensional image information determine an action when the three-dimensional image information interacts with the object.
12. The interactive system of claim 11, wherein the three-dimensional image display device further comprises an electron beam sensor, and the electronic device or the three-dimensional image display device provides an electron beam to the object according to the interaction condition between the three-dimensional image information and the object.
13. The interactive system of claim 11, wherein the electronic device or the three-dimensional image display device provides control signals to an electronic interactive device disposed on the object to actuate haptic feedback elements of the electronic interactive device.
CN202011103117.4A 2020-10-15 2020-10-15 Interactive system Pending CN114374831A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011103117.4A CN114374831A (en) 2020-10-15 2020-10-15 Interactive system


Publications (1)

Publication Number Publication Date
CN114374831A true CN114374831A (en) 2022-04-19

Family

ID=81138061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011103117.4A Pending CN114374831A (en) 2020-10-15 2020-10-15 Interactive system

Country Status (1)

Country Link
CN (1) CN114374831A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002082751A (en) * 2000-09-08 2002-03-22 Mitsubishi Electric Corp Device for interaction with virtual space and virtual space system applied with the same
US20100007636A1 (en) * 2006-10-02 2010-01-14 Pioneer Corporation Image display device
CN101641964A (en) * 2007-03-30 2010-02-03 独立行政法人情报通信研究机构 Mid-air video interaction device and its program
CN102566110A (en) * 2010-12-13 2012-07-11 财团法人工业技术研究院 Display with dimension switching function
CN102681183A (en) * 2012-05-25 2012-09-19 合肥鼎臣光电科技有限责任公司 Two-way three-dimensional imaging and naked-eye three-dimensional display system based on lens array
US20180033211A1 (en) * 2016-07-29 2018-02-01 Zspace, Inc. Personal Electronic Device with a Display System
US10120451B1 (en) * 2014-01-09 2018-11-06 D.R. Systems, Inc. Systems and user interfaces for dynamic interaction with two- and three-dimensional medical image data using spatial positioning of mobile devices
US10275098B1 (en) * 2015-07-12 2019-04-30 sigmund lindsay clements Laser mid-air hologram touch input buttons for a device
US20200057352A1 (en) * 2018-08-16 2020-02-20 Cheray Co. Ltd. Image display device
JP3225817U (en) * 2020-01-27 2020-04-02 幻景▲ケイ▼動股▲フン▼有限公司 Electronic products with 3D display devices
US20200290513A1 (en) * 2019-03-13 2020-09-17 Light Field Lab, Inc. Light field display system for vehicle augmentation


Similar Documents

Publication Publication Date Title
US9983676B2 (en) Simulation of tangible user interface interactions and gestures using array of haptic cells
CN108475120B (en) Method for tracking object motion by using remote equipment of mixed reality system and mixed reality system
CN103412659B (en) Sense user's input of edge of display area
US10488928B2 (en) Tactile sensation providing system and tactile sensation providing apparatus
TWI492146B (en) Virtual hand based on combined data
CN116724285A (en) Micro-gestures for controlling virtual and graphical elements
US20180181235A1 (en) Systems and methods for distributing haptic effects to users interacting with user interfaces
CN114080585A (en) Virtual user interface using peripheral devices in an artificial reality environment
CN111033572B (en) Method and apparatus for detecting planes and/or quadtrees used as virtual substrates
US20100020036A1 (en) Portable electronic device and method of controlling same
US10509489B2 (en) Systems and related methods for facilitating pen input in a virtual reality environment
US20180373325A1 (en) Haptic dimensions in a variable gaze orientation virtual environment
US20160209968A1 (en) Mapping touch inputs to a user input module
US11009949B1 (en) Segmented force sensors for wearable devices
US10540023B2 (en) User interface devices for virtual reality system
CN105829948A (en) Wearable Display Input System
KR20190014738A (en) An electronic apparatus comprising a force sensor and a method for controlling electronic apparatus thereof
Rehman et al. Gestures and marker based low-cost interactive writing board for primary education
CN114374831A (en) Interactive system
TWI761976B (en) Interactive system
Shen et al. Fluid Reality: High-Resolution, Untethered Haptic Gloves using Electroosmotic Pump Arrays
CN116324579A (en) Augmented reality game using virtual eye-wear beams
Bogue Sensors for interfacing with consumer electronics
US11687166B2 (en) Image processing system and image processing device
Vimali et al. Hand gesture recognition control for computers using Arduino

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination