WO2018205426A1 - Desktop spatial stereoscopic interaction system (桌面型空间立体交互系统) - Google Patents

Desktop spatial stereoscopic interaction system (桌面型空间立体交互系统)

Info

Publication number
WO2018205426A1
Authority
WO
WIPO (PCT)
Prior art keywords
infrared
spatial
stereoscopic
interaction
stereo
Prior art date
Application number
PCT/CN2017/095272
Other languages
English (en)
French (fr)
Inventor
阮仕叠
郑执权
林忠球
Original Assignee
深圳未来立体教育科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳未来立体教育科技有限公司
Priority to US16/610,908 (published as US20200159339A1)
Publication of WO2018205426A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/341Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation

Definitions

  • the present invention relates to the field of multi-space stereo interaction technology, and in particular, to a desktop-type spatial stereo interaction system capable of quickly processing spatial data of a control device.
  • In 2D video display, video image frames are shown continuously without distinguishing left-eye and right-eye views; the interval between frames is very short, so the human eye perceives continuous video of the scene. Acquiring a stereoscopic image or video is much more complicated.
  • Shooting requires two cameras placed side by side; a dedicated stereo camera contains two imaging lenses, equivalent to a person's two eyes, so that two sets of images with parallax are obtained.
  • The two sets of images are combined into a single set by dedicated equipment or software. The current standard format is side-by-side: the left-eye image is compressed to 1/2 width and placed on the left of the frame, and the right-eye image is compressed to 1/2 width and placed on the right.
  • When displayed on dedicated stereo equipment, the left and right images are each moved to the middle of the screen and doubled in width to restore the original image ratio. Without a viewing aid such as polarized glasses or active shutter glasses, the image on the screen appears ghosted, because each shot yields two views from different angles and the left and right views are superimposed on the screen.
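For illustration only, the side-by-side packing described above can be sketched in a few lines of NumPy. The half-width downsampling here is a crude column drop rather than whatever filtering a real encoder would use, and the function names are ours, not part of the patent.

```python
import numpy as np

def pack_side_by_side(left, right):
    """Compress each view to half width and place them side by side."""
    half_left = left[:, ::2]     # crude 1/2-width downsample of the left-eye view
    half_right = right[:, ::2]   # crude 1/2-width downsample of the right-eye view
    return np.concatenate([half_left, half_right], axis=1)  # same total width as one source frame

def unpack_side_by_side(frame):
    """Split the packed frame and stretch each half back to full width."""
    w = frame.shape[1]
    half_left, half_right = frame[:, : w // 2], frame[:, w // 2 :]
    left = np.repeat(half_left, 2, axis=1)    # double the width to restore the original ratio
    right = np.repeat(half_right, 2, axis=1)
    return left, right
```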
  • in practical use, obtaining a good stereoscopic effect requires not only the visual aid device but also the viewer's head space coordinate information, from which the visual observation point information is determined.
  • to manipulate the stereoscopic image, a manipulation device is also required; the manipulation device grabs, drags and zooms the stereoscopic image in space, which requires precise tracking of the spatial position of the manipulation device.
  • in existing interaction systems, the spatial position of the interactive control device is determined by an infrared positioning unit, which then interacts with the stereo interaction device.
  • however, because the interactive control device is complicated to operate, signal drift of the interactive control device may occur during the actual interaction process, so the user cannot accurately and promptly select the stereo content during interaction, and the user experience is poor.
  • the operation of the interactive control device in the existing spatial stereo interaction system is complicated, and the problem of signal drift of the interactive control device may occur during the actual interaction process, so that the user cannot accurately select the stereo content during the interaction process, and the user experience is poor.
  • to address this problem, a desktop spatial stereo interaction system is proposed. A nine-axis motion sensor is arranged in the interactive control device to detect the raw data of its three-dimensional acceleration, angular velocity and geomagnetic direction along the X, Y and Z axes during operation, which greatly improves the accuracy of the interactive control device and eliminates the signal drift problem during its operation.
  • an MCU is also arranged in the interactive control device to process the raw acceleration, angular velocity and geomagnetic data into the Euler angle parameters and quaternion of the interactive control device; the stereo interaction device then only needs to fuse the Euler angle parameters and quaternion with the spatial coordinate data to obtain the precise attitude and position, which reduces its processing load.
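The patent leaves the Euler convention and the MCU firmware unspecified; purely as a minimal sketch of the kind of output it describes, a standard unit-quaternion to roll/pitch/yaw (Z-Y-X) conversion looks like this:

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to roll/pitch/yaw Euler angles (radians), Z-Y-X convention."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))  # clamp to avoid domain errors
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```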
  • a desktop spatial stereo interaction system which is used to implement an interaction between an operator and a stereo interaction device, and the spatial interaction system includes:
  • a stereo interaction device configured to track an operator's visual observation point through an infrared coordinate component, acquire an operation instruction of the interaction manipulation device, and display virtual stereoscopic content corresponding to the visual observation point;
  • an infrared coordinate component configured to acquire first spatial coordinate data, second spatial coordinate data, and transmit the data to the stereo interaction device;
  • a visual aid device configured to acquire virtual stereoscopic content from the stereo interaction device
  • an interaction control device configured to output an operation instruction to the stereo interaction device
  • the interaction control device includes a nine-axis motion sensor for acquiring raw spatial attitude data and an MCU for processing the raw spatial attitude data into Euler angle parameters and a quaternion; the nine-axis motion sensor is coupled to the MCU.
  • the infrared coordinate component includes an infrared emitting unit and an optical capturing unit disposed on the stereo interaction device, a first optical identification point disposed on the visual aid device, and a second optical identification point disposed on the interactive control device.
  • the infrared emitting unit includes at least one infrared emitting device for emitting infrared light
  • the optical capturing unit includes at least two infrared capture cameras for acquiring a target image; the infrared emitting device and the infrared capture camera are embedded and mounted on the stereo interaction device.
  • in a third possible implementation manner, the nine-axis motion sensor includes an acceleration sensor unit, a gyroscope unit and a geomagnetic sensor unit.
  • the first and second optical recognition points in the fourth possible implementation manner are active infrared emitting devices or passive optical reflection points.
  • the first optical recognition point is a passive optical reflection point, and the number of the optical reflection points is at least two;
  • the second optical identification point is an active infrared emitting device, and the infrared emitting device is disposed at a top end position of the interactive manipulation device.
  • the interaction control device is provided with a programmable function button for operating the virtual stereoscopic content displayed by the stereo interaction device.
  • the visual aid device is a polarized stereoscopic glasses or a shutter-type stereoscopic glasses.
  • the infrared capture camera lens has a viewing angle of at least 70 degrees.
  • the infrared coordinate component of the stereo interaction device in the ninth possible implementation manner has a capture distance of 0 to 3 m.
  • FIG. 1 is a schematic diagram of a logical connection of a desktop type spatial stereo interaction system
  • FIG. 2 is a schematic diagram showing the logical composition of components included in the interactive control device in a desktop type spatial stereo interaction system according to the present invention
  • FIG. 3 is a schematic diagram showing the logical composition and signal flow of an interactive control device in a desktop type spatial stereo interaction system according to the present invention
  • FIG. 4 is a schematic diagram of the state self-test circuit of the visual aid device in a desktop type spatial stereo interaction system according to the present invention
  • Reference numerals in the drawings: 30 - stereo interaction device, 31 - infrared coordinate component, 32 - interactive control device, 33 - visual aid device
  • 311 - infrared emitting unit, 312 - infrared capture unit, 313 - first optical identification unit, 314 - second optical identification unit
  • 321 - MCU, 322 - nine-axis motion sensor, 3221 - acceleration sensor unit, 3222 - gyroscope unit, 3223 - geomagnetic sensor unit
  • 331 - state self-test circuit, 3311 - acceleration sensor detection circuit, 3312 - angular velocity detection circuit, 3313 - distance sensor detection circuit
  • FIG. 1 is a schematic diagram of a logical connection of a desktop spatial stereo interaction system.
  • the technical solution is to provide a desktop spatial stereo interaction system capable of accurately tracking the interactive control device 32.
  • the spatial interaction system includes a stereo interaction device 30, an infrared coordinate component 31, an interactive manipulation device 32, and a visual aid device 33; the infrared coordinate component 31 tracks the visual aid device 33 and the interactive manipulation device 32 and transmits their spatial coordinate data to the stereo interaction device 30, and the spatial coordinate data of the visual aid device 33 is consistent with the visual observation point of the operator.
  • the visual observation point herein refers to the spatial positional relationship of the human eye relative to the display screen of the stereoscopic interaction device 30; the stereoscopic interaction device 30 determines the visual observation point in order to display the corresponding stereoscopic image frames, so as to obtain the optimal stereoscopic effect and realize interaction between the operator's visual observation point and the stereoscopic interaction device 30.
  • the interactive control device 32 is provided with a plurality of programmable function buttons for completing the main interaction tasks between the operator and the stereo interaction device 30. Due to the complicated operation of the interaction control device 32, the precise spatial posture needs to be determined.
  • the interactive control device 32 in this spatial stereo interaction system adopts a nine-axis motion sensor 322 to detect spatial attitude changes without blind spots, preprocesses the detected raw spatial attitude data, and transmits the resulting Euler angle parameters and quaternion to the stereo interaction device 30; the stereo interaction device 30 then only needs to fuse the spatial coordinate data with the Euler angle parameters and quaternion according to the spatial data fusion algorithm to obtain the precise spatial attitude and position of the interaction control device 32, which greatly reduces its processing load.
  • the operator can operate the stereoscopic object in the virtual scene of the stereo interaction device 30 through the programmable function button as needed to realize human-computer interaction.
  • a desktop spatial stereo interaction system is provided, which is used to implement an interaction between an operator and a stereo interaction device 30.
  • the spatial interaction system includes:
  • the stereo interaction device 30 is configured to track an operator's visual observation point through the infrared coordinate component 31, acquire an operation instruction of the interaction control device 32, and display virtual stereoscopic content corresponding to the visual observation point.
  • FIG. 3 is a schematic diagram showing the logical composition and signal flow of the components of the interactive control device in the desktop spatial stereo interaction system according to the present invention
  • the infrared coordinate component 31 is configured to acquire the first and second spatial coordinate data and transmit the data to the stereo interaction device 30, where the first and second spatial coordinate data are respectively the space of the visual aid device 33 and the interactive manipulation device 32. Coordinate data.
  • the infrared coordinate component 31 includes an infrared emitting unit 311 and an optical capturing unit disposed on the stereo interaction device 30, a first optical identification point 313 disposed on the visual aid device 33, and a second optical identification point 314 disposed on the interactive control device 32.
  • the infrared emitting unit 311 includes at least one infrared emitting device for emitting infrared light
  • the optical capturing unit 312 includes at least two infrared capture cameras for acquiring a target image; the infrared emitting devices and the infrared capture cameras are embedded and installed on the stereo interaction device.
  • the stereo interaction device 30 supports shutter-type stereoscopic technology: after a stereoscopic image or video is input to the stereo interaction device 30, its display, with a refresh rate of at least 120 Hz, generates left and right frames alternately in a frame-sequential format. The shutter glasses receive the synchronization signal from the stereo interaction device 30 and open or close the left and right liquid crystal lenses at the same frequency, so that, synchronized with each refresh, the left and right eyes view the corresponding images while the frame count seen per eye remains the same as for 2D video; the operator's two eyes see rapidly switching different pictures, the brain fuses them, and a stereoscopic image is perceived.
  • the stereo interaction device 30 in this technical solution has a built-in optical processing unit and can also be used with polarized stereo glasses: by changing the arrangement of the liquid crystal molecules of the liquid crystal display screen in the stereo interaction device 30, the original image is decomposed into two sets of pictures, one vertically polarized and one horizontally polarized; the left and right lenses of the stereo glasses then use polarizers of different polarization directions, so that the person's left and right eyes each receive one set of pictures and the brain synthesizes the stereoscopic image.
  • the infrared emitting unit 311 in the infrared coordinate component 31 includes at least one infrared emitting device for emitting infrared light; the infrared emitting device emits infrared light towards the optical reflection points of the visual aid device 33 or the interactive control device 32 and is embedded in the stereo interaction device 30, so that the infrared capture cameras in the infrared coordinate component 31 can acquire images and determine the spatial coordinates. The angle and number of the infrared emitting devices therefore affect the image acquisition of the infrared capture cameras.
  • as a preferred embodiment, the infrared emitting unit 311 preferably comprises four infrared emitting devices, installed in pairs preferably on the left and right sides of the stereo interaction device 30 so that the emitted infrared light effectively covers the entire liquid crystal display screen of the stereo device; the four infrared emitting devices may also be embedded in pairs on the upper and lower sides, or on any one side, of the display screen.
  • the four infrared emitting devices emit infrared light; after the light is reflected by the optical identification points, the infrared capture unit 312 acquires a spatial image of the visual aid device 33 or of the interactive control device 32 carrying the optical identification point, and once the stereo interaction device 30 receives the spatial image it obtains the spatial coordinates of the visual aid device 33 or the interactive manipulation device 32 according to the spatial image coordinate algorithm.
  • when the required capture distance range is undemanding, a single infrared emitting device can also track the visual aid device and the interactive control device; the position and number of the infrared emitting devices are not limited here.
  • the optical capturing unit 312 includes at least two infrared capture cameras for acquiring target images; only then can the spatial coordinate data of the target be acquired effectively, because spatial images of the target with parallax characteristics are captured simultaneously and the target's spatial coordinates are computed from the positions of the infrared capture cameras and the projection principle.
  • as a preferred embodiment, the infrared capture unit 312 preferably comprises four infrared capture cameras embedded in the stereo interaction device, with adjacent cameras compensating for each other; two cameras, or more than four, may also be used. Increasing the viewing angle of the infrared capture cameras expands the capture distance range, but the acquired images are more distorted and the error in the acquired target spatial coordinates is larger.
  • in this technical solution the viewing angle of the infrared capture camera is at least 70 degrees, preferably in the range of 70 to 130 degrees; within this preferred range the camera acquires almost distortion-free spatial images within a capture distance of 0 to 3 m and obtains fairly accurate target spatial coordinates even when the capture distance is large.
  • the infrared capture camera in the technical solution has a refresh rate greater than 60 Hz, which can greatly improve the smoothness of the infrared capture camera capturing the target trajectory, improve the tracking accuracy, and make the stereoscopic effect of the image acquired by the visual aid device 33 better.
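The text says only that the target's spatial coordinates follow from the camera positions and the projection principle; a minimal two-camera linear (DLT) triangulation sketch, assuming each infrared capture camera has been calibrated to a 3x4 projection matrix (an assumption, since the patent does not give the algorithm), looks like this:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one marker seen by two calibrated cameras.

    P1, P2 : 3x4 projection matrices of two infrared capture cameras
    uv1, uv2 : (u, v) pixel coordinates of the same optical identification point
    Returns the 3D point in the shared, screen-anchored coordinate frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)   # the best point is the right singular vector of the smallest singular value
    X = vt[-1]
    return X[:3] / X[3]           # dehomogenize to (x, y, z)
```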
  • before installation, the infrared light emitted by the four infrared emitting devices is tested according to the size of the display screen to determine the angles between the four infrared emitting devices, the four infrared capture cameras, and the display screen.
  • of the four infrared emitting devices, the two adjacent devices on the same side compensate for each other, ensuring that the entire liquid crystal display screen is covered without blind spots.
  • likewise, the two adjacent infrared capture cameras on the same side compensate for each other, ensuring that images of the visual aid device 33 or the interactive control device 32 within the infrared emission range are captured effectively.
  • the first optical identification point 313 is disposed on the visual aid device 33, and the second optical identification point 314 is disposed on the interactive control device 32.
  • the first and second optical identification points may be active infrared emitting devices or passive optical reflection points. As a preferred embodiment, the first optical identification point 313 is preferably an optical reflection point and the second optical identification point 314 an infrared emitting device; the optical reflection point carries an infrared-reflective material. Placing a passive optical identification point on the polarized glasses avoids the added circuit cost of an infrared emitting device on active shutter glasses.
  • the second optical identification point 314 is disposed on the circuit board inside the interactive control device 32; using an active infrared emitting device avoids the hand-held inconvenience and wear problems of a passive infrared reflection point.
  • in this technical solution the number of infrared emitting devices is preferably two, disposed at the two top positions of the internal circuit board of the interactive control device 32, so that even if one of them is blocked, the interactive control device 32 can still be tracked effectively by the stereo interaction device 30. It should be noted that more infrared emitting devices may be used; the specific number should be determined according to actual needs.
  • the visual aid device 33 is worn on the head; the infrared reflection points reflect infrared light, the infrared capture cameras capture the images, and the coordinates of the operator's head are determined. The number of infrared reflection points is at least two, and they may be placed at any position on the visual aid device 33; preferably:
  • with three infrared reflection points, one is placed at the nose pad position of the visual aid device 33 and the other two, symmetric about the nose pad, are placed at the upper left corner of the left lens and the upper right corner of the right lens respectively, so that the dynamic head coordinates can be tracked completely;
  • with four infrared reflection points, one is placed at the nose pad position of the visual aid device 33, two of the remaining three, symmetric about the nose pad, are placed at the upper left corner of the left lens and the upper right corner of the right lens respectively, and the last one is placed at the lower left corner of the left lens or the lower right corner of the right lens;
  • with five infrared reflection points, one is placed at the nose pad position of the visual aid device 33 and the remaining four at the upper left corner of the left lens, the lower left corner of the left lens, the upper right corner of the right lens, and the lower right corner of the right lens; the five infrared reflection points define the frame of the polarized glasses and ensure the accuracy of the operator's head tracking.
  • if cost is not a concern, the number of infrared reflection points can be even larger.
  • the stereo interaction system further includes:
  • the visual aid device 33 is configured to acquire virtual stereoscopic content from the stereo interaction device 30.
  • the visual aid device 33 may be polarized stereoscopic glasses provided with a specific number of infrared reflection points. The stereo interaction device 30 in this technical solution has a built-in optical processing unit: by changing the arrangement of the liquid crystal molecules of its liquid crystal display screen, the original image is decomposed into two sets of pictures, one vertically polarized and one horizontally polarized; the left and right lenses of the polarized stereo glasses then use polarizers of different polarization directions, so that the person's left and right eyes each receive one set of pictures and the brain synthesizes the stereoscopic image.
  • the visual aid device 33 may also be active shutter glasses provided with a specific number of infrared reflection points. After a stereoscopic image or video is input to the stereo interaction device 30, its display, with a refresh rate of at least 120 Hz, generates left and right frames alternately in a frame-sequential format; the shutter glasses receive the synchronization signal from the stereo interaction device 30 and open or close the left and right liquid crystal lenses at the same frequency, so that, synchronized with each refresh, the left and right eyes view the corresponding images while the frame count seen per eye remains the same as for 2D video. The operator's two eyes see rapidly switching different pictures, the brain fuses them, and a stereoscopic image is perceived.
  • FIG. 4 is a schematic diagram of the self-test circuit of the visual aid device in a desktop type spatial stereo interaction system according to the present invention.
  • the shutter glasses in the technical solution have a state self-test function, so that the power can be turned off and the power consumption can be reduced.
  • specifically, the state self-test circuit 331 may be an acceleration sensor detection circuit 3311 (two-axis or three-axis) that detects the state of the active shutter glasses; when the state changes, or when a distance parameter is detected to be below a certain threshold, it controls the working mode of the Bluetooth master chip within a set time. For example, when the state of the active shutter glasses is detected to change from stationary to moving, the wake-up time of the Bluetooth master chip is set to 2 s; after 2 s the Bluetooth master chip of the active shutter glasses enters the working state and the user begins to use the glasses.
  • when the state of the active shutter glasses is detected to change from moving to stationary, the time is set to 3 s; after 3 s the Bluetooth master chip of the active shutter glasses is put to sleep and the user stops using the glasses. By detecting the state of the active shutter glasses, the working mode of the Bluetooth master chip is controlled automatically, which reduces wasted power and improves the effective battery life and the user experience.
  • to automate power control, reduce wasted power and extend the effective battery life, the state self-test circuit may also be an angular velocity detection circuit 3312 that detects changes in the movement angle of the active shutter glasses; the working mode of the Bluetooth master chip is controlled by detecting the angle change of the glasses, which is not described again here.
  • the state self-test circuit may also be a distance sensor detection circuit 3313, configured to detect the distance from the active shutter glasses to the face; when the detected distance crosses a threshold, the Bluetooth master chip of the active shutter glasses is switched on or off. For example, when the distance from the active shutter glasses to the user's face is less than 20 mm for more than 2 s, the Bluetooth master chip of the active shutter glasses enters the working state; when the distance is greater than 40 mm for more than 3 s, the Bluetooth master chip enters a sleep state.
  • of course, the acceleration sensor detection circuit 3311, the angular velocity detection circuit 3312, and the distance sensor detection circuit 3313 may also be combined in the active shutter glasses to realize automatic control of the working mode of the Bluetooth master chip. Therefore, any technical solution that uses the acceleration sensor detection circuit 3311, the angular velocity detection circuit 3312, the distance sensor detection circuit 3313, or a combination of the three to detect the motion state or distance parameter of the active shutter glasses and thereby automatically control the working mode of its Bluetooth master chip falls within the protection scope of the utility model.
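As a rough illustration of how the detection circuits could gate the Bluetooth master chip, the sketch below wires the example thresholds quoted above (2 s wake, 3 s sleep, 20 mm / 40 mm face distance) into a tiny controller; the `bt_chip` interface and the class itself are hypothetical stand-ins, not part of the patent.

```python
import time

# Illustrative thresholds taken from the examples in the text; real firmware values may differ.
WAKE_DELAY_S = 2.0        # condition must persist this long before waking the Bluetooth chip
SLEEP_DELAY_S = 3.0       # condition must persist this long before putting it to sleep
NEAR_FACE_MM = 20.0
FAR_FROM_FACE_MM = 40.0

class GlassesPowerController:
    """Sketch of the state self-test logic that gates the Bluetooth master chip."""

    def __init__(self, bt_chip):
        self.bt = bt_chip                  # hypothetical object exposing wake() / sleep()
        self.since = time.monotonic()      # when the current candidate condition started
        self.active = False

    def update(self, moving: bool, face_distance_mm: float) -> None:
        now = time.monotonic()
        wants_wake = moving or face_distance_mm < NEAR_FACE_MM
        wants_sleep = (not moving) and face_distance_mm > FAR_FROM_FACE_MM
        if wants_wake and not self.active:
            if now - self.since >= WAKE_DELAY_S:
                self.bt.wake()
                self.active = True
        elif wants_sleep and self.active:
            if now - self.since >= SLEEP_DELAY_S:
                self.bt.sleep()
                self.active = False
        else:
            self.since = now               # condition not sustained; restart the timer
```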
  • to reduce circuit board power consumption, as a preferred embodiment the Bluetooth master chip of the active shutter glasses in this technical solution is a BCM-series chip produced by Broadcom (USA), which offers enhanced data transfer capability, supports Bluetooth communication, and has low power consumption, helping to extend the effective battery life of the active shutter glasses.
  • FIG. 2 is a schematic diagram showing the logical composition of components included in the interactive control device in a desktop spatial stereo interaction system according to the present invention
  • the stereo interaction system in the technical solution further includes:
  • the interaction control device 32 is configured to output an operation instruction to the stereo interaction device 30.
  • the interactive manipulation device 32 includes a nine-axis motion sensor 322 for acquiring raw spatial attitude data and an MCU 321 for processing the raw spatial attitude data into Euler angle parameters and a quaternion; the nine-axis motion sensor 322 is connected to the MCU 321.
  • the MCU 321 in the interactive control device 32 has strong processing capability, small size, and low cost, and is very suitable for the interactive control device 32 in this embodiment.
  • if the size requirement is relaxed, the processing unit may also adopt a DSP, an FPGA, or another processing chip with data processing capability.
  • the motion sensor is a nine-axis motion sensor 322.
  • the stereo interaction device 30 captures images through the infrared capture cameras and can only determine the spatial coordinate position of the interactive manipulation device 32; it cannot track the complete motion attitude of the interactive manipulation device 32 relative to the screen of the stereo interaction device 30.
  • the nine-axis motion sensor 322 is a combination of three types of sensors - a three-axis acceleration sensor unit 3221, a three-axis gyroscope unit 3222, and a three-axis geomagnetic sensor unit 3223 - which cooperate with one another. With the acceleration sensor unit and the gyroscope unit 3222 alone, the complete motion state of the device can basically be described, but over long periods of motion a cumulative deviation builds up and the motion attitude can no longer be described accurately.
  • the geomagnetic sensor unit 3223 measures the earth's magnetic field and applies correction and compensation through its absolute-pointing capability, effectively removing the cumulative deviation and thereby correcting the movement direction, attitude angle, movement force, and speed of the interactive control device 32.
  • by adopting the nine-axis motion sensor 322, the accuracy of tracking the dynamic attitude and position of the interactive manipulation device 32 is improved, and the "drift" problem of the cursor driven by the interactive manipulation device 32 on the stereo interaction device 30 is avoided.
  • the raw attitude data detected by the nine-axis motion sensor 322 includes acceleration, angular velocity, and direction in three degrees of freedom
  • the nine-axis motion sensor 322 is composed of the acceleration sensor unit 3221, the gyro unit 3222, and the geomagnetic sensor unit 3223.
  • the absolute direction of the output of the nine-axis motion sensor 322 is derived from the gravity field of the earth and the magnetic field of the earth.
  • the static final accuracy of the nine-axis motion sensor 322 depends on the measurement accuracy of the magnetic field and of gravity, while its dynamic performance depends on the gyroscope unit 3222.
  • the acceleration sensor unit 3221 and the gyroscope unit 3222 in a consumer-grade nine-axis motion sensor 322 suffer from considerable interference noise; taking a planar gyroscope as an example, integrating an ADI gyroscope unit 3222 for one minute drifts by about 2 degrees. Under these conditions, if the magnetic field and the gravity field were not used to correct the three-axis gyroscope, the actual attitude of the object and the measured output attitude would diverge completely after roughly 3 minutes; therefore, an architecture built on a low-cost gyroscope unit 3222 and acceleration sensor unit 3221 must use the field vectors for correction.
  • the nine-axis motion sensor 322 in this technical solution uses the three-dimensional gyroscope unit 3222 to quickly track the three-dimensional attitude of the interactive manipulation device 32: the gyroscope unit 3222 serves as the core, while the acceleration and the direction of the geomagnetic field are also measured to provide the system with a reliable reference. Specifically, the absolute angular rate, acceleration, and magnetic field strength of the carrier are measured in three directions, from which the quaternion, attitude data, and so on of the interactive control device 32 are obtained; a real-time integrated fusion algorithm is required to provide the system with an accurate, reliable, timely, and stable attitude output.
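The patent does not disclose the fusion algorithm itself; purely as a stand-in, a toy complementary filter shows how gyroscope integration can be corrected by the absolute gravity and geomagnetic references, which is the role the text assigns to the accelerometer and magnetometer. Axis conventions, the blend weight and the near-level heading assumption are ours.

```python
import math

ALPHA = 0.98   # complementary-filter weight: trust the gyro short-term, accel/mag long-term

class ComplementaryFilter:
    """Toy 9-axis fusion: gyro-integrated angles corrected by the accelerometer (roll/pitch)
    and the magnetometer (yaw). Angle wrap-around handling is omitted for brevity."""

    def __init__(self):
        self.roll = self.pitch = self.yaw = 0.0

    def update(self, gyro, accel, mag, dt):
        gx, gy, gz = gyro            # angular rates in rad/s
        ax, ay, az = accel           # any consistent unit; only the direction is used
        mx, my, mz = mag

        # 1) propagate attitude with the gyroscope (fast, but drifts over time)
        roll_g = self.roll + gx * dt
        pitch_g = self.pitch + gy * dt
        yaw_g = self.yaw + gz * dt

        # 2) absolute references: gravity gives roll/pitch, the geomagnetic field gives yaw
        roll_a = math.atan2(ay, az)
        pitch_a = math.atan2(-ax, math.hypot(ay, az))
        yaw_m = math.atan2(-my, mx)  # compass heading; assumes a near-level sensor for brevity

        # 3) blend: the absolute references cancel the accumulated gyro drift
        self.roll = ALPHA * roll_g + (1 - ALPHA) * roll_a
        self.pitch = ALPHA * pitch_g + (1 - ALPHA) * pitch_a
        self.yaw = ALPHA * yaw_g + (1 - ALPHA) * yaw_m
        return self.roll, self.pitch, self.yaw
```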
  • the refresh rate of the nine-axis motion sensor in this technical solution is greater than 60 Hz, which ensures the smoothness of the spatial attitude trajectory of the interactive control device acquired by the stereo interaction device 30, makes the operation cursor signal more continuous, and allows operation instructions to be responded to promptly.
  • the outer casing of the interactive control device 32 has a common pen shape, and the interactive control device 32 is provided with a plurality of function buttons for operating the virtual stereoscopic content displayed by the stereo interaction device 30.
  • the shape and size of the pen-shaped housing are preferably such that the user can just comfortably hold it.
  • the interactive control device 32 is connected to the stereo interaction device 30 through a USB data cable using the HID transmission protocol, and the pen-shaped housing is provided with an opening that mates with the USB interface; compared with a conventional HDMI data cable, the USB cable is more versatile, and compared with a wireless connection, its data transmission is more reliable.
  • the interactive control device 32 is further provided with a plurality of function buttons; before stereoscopic content is displayed, they act like an ordinary mouse, moving a cursor on the display screen of the stereo interaction device 30 and selecting the stereoscopic content resource to display, for example clicking to enter or to display the stereoscopic content; after the stereoscopic content is entered, the button unit can also pop up menu shortcut keys and grab and drag the stereoscopic content to move it in all directions.
  • the spatial attitude data processing in this technical solution proceeds as follows: the infrared coordinate component 31 obtains the spatial position images of the visual aid device 33 and of the interactive manipulation device 32 and transmits them to the stereo interaction device 30, which obtains the first and second spatial coordinate data according to the spatial position algorithm; the interactive control device acquires raw spatial attitude data through its nine-axis sensor and passes it to the MCU 321, which processes the raw data into the Euler angle parameters and quaternion of the spatial attitude of the interactive control device 32 according to the spatial data fusion algorithm and transmits them to the stereo interaction device 30.
  • the stereo interaction device 30 determines the spatial position and attitude of the interactive manipulation device 32 from the second spatial coordinate data together with the Euler angle parameters and the quaternion; the first and second spatial coordinate data are the spatial coordinate data of the visual aid device 33 and of the interactive manipulation device 32, respectively.
  • the specific interaction method is: the operator moves the cursor of the interactive manipulation device 32 to a specific position of the virtual stereoscopic content displayed by the stereo interaction device 30, and the stereo interaction device 30 acquires an operation instruction of the interaction control device 32.
  • the stereo interaction device 30 operates a specific virtual stereoscopic display function in accordance with an operation instruction.
  • the stereoscopic interaction device 30 acquires an operator visual observation point through the infrared coordinate component 31, and transmits the stereoscopic display content matching the visual observation point to the operator's eyes through the visual aid device 33.
  • the spatial data processing method of the desktop spatial stereo interaction system shown in FIG. 1 is as follows:
  • Step 101 Acquire first and second spatial position images, and combine the spatial position algorithm to obtain first and second spatial coordinate data.
  • the first and second spatial position images are spatial position images of the visual aid device 33 and the interactive manipulation device 32, respectively, and the first and second spatial coordinate data respectively refer to the visual aid device 33 and the interactive manipulation device 32 through the infrared coordinate component. 31 determined spatial coordinate data.
  • the infrared coordinate component 31 respectively acquires the spatial position image of the visual aid device 33 and the interactive manipulation device 32 to the stereo interaction device 30, and the stereo interaction device 30 acquires the first and second spatial coordinate data according to the spatial position algorithm.
  • the infrared emitting unit 311 in the infrared coordinate component 31 preferably comprises four infrared emitting devices (two are also possible) that emit infrared light; after reflection by the optical identification points, the infrared capture unit 312 acquires a spatial image of the visual aid device 33 or of the interactive control device 32 carrying the optical identification point, and once the stereo interaction device 30 receives the spatial image it obtains the spatial coordinates of the visual aid device 33 or the interactive manipulation device 32 according to the spatial image coordinate algorithm. As a preferred embodiment, the infrared capture unit 312 comprises four infrared capture cameras.
  • Step 102: Acquire the raw spatial attitude data of the interaction control device 32 and process it into spatial attitude Euler angle parameters and a quaternion;
  • the interactive control device 32 detects its motion attitude through the nine-axis motion sensor 322, acquires the raw spatial attitude data and transmits it to the MCU 321 for processing; the MCU 321 processes the raw data into the Euler angle parameters and quaternion of the spatial attitude of the interactive manipulation device 32 and transmits them to the stereo interaction device 30.
  • the stereo interaction device 30 captures an image by the infrared camera, and can only determine the spatial coordinate position of the interactive manipulation device 32, and cannot track the complete motion posture of the interactive manipulation device 32 with respect to the screen of the stereo interaction device 30.
  • the nine-axis motion sensor 322 is a combination of three types of sensors - a three-axis acceleration sensor unit 3221, a three-axis gyroscope unit 3222, and a three-axis geomagnetic sensor unit 3223 - which cooperate with one another; with the acceleration sensor unit and the gyroscope unit 3222 alone, the complete motion state of the device can basically be described.
  • the geomagnetic sensor unit 3223 measures the earth's magnetic field and applies correction and compensation through its absolute-pointing capability, effectively removing the cumulative deviation and thereby correcting the movement direction, attitude angle, movement force, and speed of the interactive control device 32.
  • by adopting the nine-axis motion sensor 322, the accuracy of tracking the dynamic attitude and position of the interactive manipulation device 32 is improved, and the "drift" problem of the cursor driven by the interactive manipulation device 32 on the stereo interaction device 30 is avoided.
  • Step 103 Determine, according to the second spatial coordinate data, the Euler angle parameter, and the quaternion, a spatial position and posture of the interactive manipulation device 32 by using a spatial data fusion algorithm;
  • the stereo interaction device 30 determines the spatial position and posture of the interactive manipulation device 32 based on the second spatial coordinate data and the Euler angle parameters and the quaternion.
  • the stereo interaction device 30 needs to fuse the spatial coordinate data of the interactive manipulation device 32 and the posture original data (the Euler angle parameter and the quaternion) to obtain the final posture, and generate a corresponding spatial operation cursor.
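As a sketch of that fusion step, the fragment below composes the IR-derived position with the IMU-derived quaternion into a single pose and projects a cursor point along the pen axis; the 15 cm pen length and the choice of local +Z as the pen axis are assumptions for illustration only.

```python
import numpy as np

def quat_to_matrix(w, x, y, z):
    """Rotation matrix of a unit quaternion (w, x, y, z)."""
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

def fuse_pose(position_xyz, quaternion_wxyz, pen_length=0.15):
    """Combine the IR-tracked position with the IMU orientation into one 4x4 pose,
    and project a cursor point along the (assumed) pen axis."""
    T = np.eye(4)
    T[:3, :3] = quat_to_matrix(*quaternion_wxyz)   # orientation from the nine-axis sensor
    T[:3, 3] = position_xyz                        # position from the infrared coordinate component
    tip_offset = np.array([0.0, 0.0, pen_length, 1.0])  # pen axis assumed to be local +Z
    cursor = T @ tip_offset
    return T, cursor[:3]
```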
  • the interactive method of the desktop spatial stereo interaction system includes:
  • Step 201: Determine the visual observation point of the operator wearing the visual aid device 33 and acquire the function-key operation instruction of the interactive manipulation device 32, where the visual observation point is the spatial coordinate point of the visual aid device 33 relative to the virtual three-dimensional content.
  • the stereoscopic impression of an image frame results from the human brain synthesizing left and right image frames that have parallax characteristics - it is in fact an illusion - and therefore the visual observation point has a very important influence on the stereoscopic effect of the virtual stereoscopic content displayed by the stereo interaction device 30.
  • the operator's visual observation point may change unconsciously, or the operator may deliberately change it for some observation purpose; the infrared coordinate component 31 tracks the spatial coordinates of the operator's head movement and thereby determines the visual observation point. The purpose is to give the virtual stereoscopic content displayed by the stereo interaction device 30 a better stereoscopic effect and to realize the interaction between the operator and the stereo interaction device 30.
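A minimal sketch of turning the tracked visual observation point into the two eye positions a renderer would use for the left and right parallax views follows; the interpupillary distance and the availability of a head orientation (right) vector are assumptions, as the patent only states that the head coordinates are tracked.

```python
import numpy as np

IPD_M = 0.063   # assumed interpupillary distance in metres; not specified in the text

def eye_positions(head_position, head_right_vector):
    """Derive left/right eye positions from the tracked glasses position so the renderer
    can draw the two parallax views matching the current visual observation point."""
    head = np.asarray(head_position, dtype=float)
    right = np.asarray(head_right_vector, dtype=float)
    right = right / np.linalg.norm(right)          # unit vector pointing to the operator's right
    left_eye = head - right * (IPD_M / 2)
    right_eye = head + right * (IPD_M / 2)
    return left_eye, right_eye
```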
  • Step 202 Display, according to the operation instruction, virtual three-dimensional content that matches the visual observation point.
  • the interactive control device 32 is provided with three different types of programmable function buttons for operating the virtual stereoscopic content displayed by the stereo interaction device 30. Before the virtual stereoscopic content is displayed, the interactive manipulation device 32 acts like an ordinary mouse, moving a cursor on the display screen of the stereo interaction device 30 and selecting the stereoscopic content resource to display, for example clicking to enter or to display the virtual stereoscopic content; after the virtual stereoscopic content is entered, the button unit can also pop up menu shortcut keys and grab and drag the virtual stereoscopic content to move it in all directions.
  • the stereoscopic interactive device 30 operates a specific virtual stereoscopic display function in accordance with an operation instruction.
  • Step 203 The stereo interaction device 30 acquires an operator visual observation point through the infrared coordinate component 31, and transmits the stereoscopic display content matched with the visual observation point to the operator's eyes through the visual aid device 33.
  • the spatial attitude data detected by the nine-axis motion sensor corrects the spatial coordinate data detected by the infrared coordinate component 31, thereby effectively improving the spatial position tracking accuracy of the interactive manipulation device 32.
  • the technical solution aims to provide a desktop spatial stereo interaction system capable of accurately tracking the interactive manipulation device 32, and solves the problem of signal drift in the prior art;
  • the spatial interaction system includes a stereo interaction device 30, an infrared coordinate component 31, an interactive control device 32, and a visual aid device 33; the infrared coordinate component 31 tracks the visual aid device 33 and the interactive manipulation device 32 and transmits their spatial coordinate data to the stereo interaction device 30, and the spatial coordinate data of the visual aid device 33 is consistent with the operator's visual observation point.
  • the visual observation point herein refers to the spatial positional relationship of the human eye relative to the stereoscopic interaction device 30.
  • the stereo interaction device 30 determines the visual observation point in order to display the corresponding stereoscopic image frames, so as to obtain the optimal stereoscopic effect and enable the operator's visual observation point to interact with the stereoscopic interaction device 30.
  • the interactive control device 32 is provided with a plurality of programmable function buttons used for completing the main interaction tasks between the operator and the stereo interaction device 30; because its operation is complicated, its precise spatial attitude needs to be determined.
  • the interaction control device 32 in the desktop spatial interaction system of this technical solution adopts a nine-axis sensor to detect spatial attitude changes without blind spots, preprocesses the detected raw spatial attitude data, and transmits the resulting Euler angle parameters and quaternion to the stereo interaction device 30; the stereo interaction device 30 only needs to fuse the spatial coordinate data with the Euler angle parameters and quaternion according to the spatial data fusion algorithm to obtain the precise spatial attitude and position of the interactive control device 32, which greatly reduces its processing load. The operator operates the stereoscopic objects in the virtual scene of the stereo interaction device 30 through the function buttons as needed, realizing human-computer interaction.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The utility model discloses a desktop spatial stereoscopic interaction system. The interaction system includes: a stereo interaction device, configured to track the operator's visual observation point through an infrared coordinate component, acquire operation instructions from the interactive control device, and display virtual stereoscopic content corresponding to the visual observation point; an infrared coordinate component, configured to acquire first spatial coordinate data and second spatial coordinate data and transmit them to the stereo interaction device; a visual aid device, configured to acquire the virtual stereoscopic content from the stereo interaction device; and an interactive control device, configured to output operation instructions to the stereo interaction device. Implementing the desktop spatial stereoscopic interaction system proposed by the utility model greatly improves the accuracy of the interactive control device, eliminates the signal drift problem during operation of the interactive control device, and reduces the processing load of the stereo interaction device.

Description

Desktop spatial stereoscopic interaction system
Technical Field
[0001] The present utility model relates to the field of multi-space stereoscopic interaction technology, and in particular to a desktop spatial stereoscopic interaction system capable of quickly processing the spatial data of a manipulation device.
Background Art
[0002] In 2D video display, video image frames are displayed continuously without distinguishing left-eye and right-eye views; the interval between frames is very short, so the human eye sees continuous video of the scene. Acquiring a stereoscopic image or video is much more complicated: shooting requires two cameras placed side by side, and a dedicated stereo camera contains two imaging lenses, equivalent to a person's two eyes, so two sets of images with parallax are obtained. These two sets of images are combined into one set by dedicated equipment or software; the current standard format is side-by-side, i.e. the left-eye image is compressed to 1/2 width and placed on the left of the frame, and the right-eye image is compressed to 1/2 width and placed on the right. When displayed on dedicated stereo equipment, the left and right images are each moved to the middle of the screen and doubled in width to restore the original image ratio. Without a viewing aid such as polarized glasses or active shutter glasses, the human eye sees a ghosted image on the screen, because each shot yields two views from different angles and the left and right views are superimposed on the screen. In practice, obtaining a good stereoscopic effect requires not only a viewing aid but also the viewer's head space coordinate information to determine the visual observation point; and if the stereoscopic image is to be manipulated, a manipulation device is also needed, which grabs, drags and zooms the stereoscopic image in space, requiring precise tracking of the spatial position of the manipulation device.
Technical Problem
[0003] In existing interaction systems, the spatial position of the interactive control device is determined by an infrared positioning unit, which then interacts with the stereo interaction device. However, because the interactive control device is complicated to operate, signal drift of the device occurs during actual interaction, so the user cannot accurately and promptly select stereoscopic content during interaction, and the user experience is poor.
Solution to the Problem
Technical Solution
[0004] In view of the problem that the interactive control device in existing spatial stereoscopic interaction systems is complicated to operate and suffers from signal drift during actual interaction, so that the user cannot accurately select stereoscopic content and the user experience is poor, a desktop spatial stereoscopic interaction system is proposed. A nine-axis motion sensor is arranged in the interactive control device to detect the raw data of its three-dimensional acceleration, angular velocity and geomagnetic direction along X, Y and Z during operation, which greatly improves the accuracy of the interactive control device and eliminates the signal drift problem during its operation. An MCU is arranged in the interactive control device to process the raw acceleration, angular velocity and geomagnetic direction data and obtain the Euler angle parameters and quaternion of the interactive control device; the stereo interaction device then only needs to fuse the Euler angle parameters and quaternion with the spatial coordinate data to obtain the precise attitude and position, which reduces the processing load of the stereo interaction device.
[0005] A desktop spatial stereoscopic interaction system is provided for realizing interaction between an operator and a stereo interaction device, the spatial interaction system comprising:
[0006] a stereo interaction device, configured to track the operator's visual observation point through an infrared coordinate component, acquire operation instructions from the interactive control device, and display virtual stereoscopic content corresponding to the visual observation point;
[0007] an infrared coordinate component, configured to acquire first spatial coordinate data and second spatial coordinate data and transmit them to the stereo interaction device;
[0008] a visual aid device, configured to acquire the virtual stereoscopic content from the stereo interaction device;
[0009] an interactive control device, configured to output operation instructions to the stereo interaction device;
[0010] wherein the interactive control device comprises a nine-axis motion sensor for acquiring and detecting raw spatial attitude data and an MCU for processing the raw spatial attitude data into Euler angle parameters and a quaternion, the nine-axis motion sensor being connected to the MCU.
[0011] With reference to the first aspect, in a first possible implementation the infrared coordinate component comprises an infrared emitting unit and an optical capturing unit arranged on the stereo interaction device, a first optical identification point arranged on the visual aid device, and a second optical identification point arranged on the interactive control device.
[0012] With reference to the first possible implementation of the first aspect, in a second possible implementation the infrared emitting unit comprises at least one infrared emitting device for emitting infrared light, and the optical capturing unit comprises at least two infrared capture cameras for acquiring target images; the infrared emitting devices and the infrared capture cameras are embedded in the stereo interaction device.
[0013] With reference to the first aspect, in a third possible implementation the nine-axis motion sensor comprises an acceleration sensor unit, a gyroscope unit and a geomagnetic sensor unit.
[0014] With reference to the first aspect, in a fourth possible implementation the first and second optical identification points are active infrared emitting devices or passive optical reflection points.
[0015] With reference to the fourth possible implementation of the first aspect, in a fifth possible implementation the first optical identification point is a passive optical reflection point, the number of optical reflection points being at least two; the second optical identification point is an active infrared emitting device, arranged at the top end of the interactive control device.
[0016] With reference to the first aspect, in a sixth possible implementation the interactive control device is provided with programmable function buttons for operating the virtual stereoscopic content displayed by the stereo interaction device.
[0017] With reference to the first aspect, in a seventh possible implementation the visual aid device is polarized stereoscopic glasses or shutter-type stereoscopic glasses.
[0018] With reference to the first aspect, in an eighth possible implementation the viewing angle of the infrared capture camera lens is at least 70 degrees.
[0019] With reference to the first aspect, in a ninth possible implementation the capture distance of the infrared coordinate component of the stereo interaction device is 0 to 3 m.
Advantageous Effects of the Invention
Advantageous Effects
[0020] By implementing the desktop spatial stereoscopic interaction system of the utility model, a nine-axis motion sensor arranged in the interactive control device detects the raw data of its three-dimensional acceleration, angular velocity and geomagnetic direction along X, Y and Z during operation, which greatly improves the accuracy of the interactive control device and eliminates the signal drift problem during its operation; an MCU arranged in the interactive control device processes the raw acceleration, angular velocity and geomagnetic direction data to obtain the Euler angle parameters and quaternion of the interactive control device, and the stereo interaction device only needs to fuse the Euler angle parameters and quaternion with the spatial coordinate data to obtain the precise attitude and position, which reduces the processing load of the stereo interaction device.
Brief Description of the Drawings
Description of the Drawings
[0021] To explain the technical solutions in the embodiments of the utility model more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the utility model, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
[0022] FIG. 1 is a schematic diagram of the logical connections of a desktop spatial stereoscopic interaction system;
[0023] FIG. 2 is a schematic diagram of the logical composition of the components of the interactive control device in a desktop spatial stereoscopic interaction system according to the utility model;
[0024] FIG. 3 is a schematic diagram of the logical composition and signal flow of the components of the interactive control device in a desktop spatial stereoscopic interaction system according to the utility model;
[0025] FIG. 4 is a schematic diagram of the state self-test circuit of the visual aid device in a desktop spatial stereoscopic interaction system according to the utility model;
[0026] The parts designated by the reference numerals in the drawings are: 30 - stereo interaction device, 31 - infrared coordinate component, 32 - interactive control device, 33 - visual aid device, 311 - infrared emitting unit, 312 - infrared capture unit, 313 - first optical identification unit, 314 - second optical identification unit, 321 - MCU, 322 - nine-axis motion sensor, 3221 - acceleration sensor unit, 3222 - gyroscope unit, 3223 - geomagnetic sensor unit, 331 - state self-test circuit, 3311 - acceleration sensor detection circuit, 3312 - angular velocity detection circuit, 3313 - distance sensor detection circuit.
[0027] Embodiments of the Utility Model
[0028] The technical solutions of the utility model will now be described clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the utility model, not all of them. Based on the embodiments of the utility model, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the utility model.
[0029] 1. The system of this technical solution
[0030] Please refer to FIG. 1, which is a schematic diagram of the logical connections of a desktop spatial stereoscopic interaction system. This technical solution aims to provide a desktop spatial stereoscopic interaction system capable of tracking the interactive control device 32 with high accuracy and solving the signal drift problem of the prior art. The spatial interaction system includes a stereo interaction device 30, an infrared coordinate component 31, an interactive control device 32 and a visual aid device 33. The infrared coordinate component 31 tracks the visual aid device 33 and the interactive control device 32 and transmits their spatial coordinate data to the stereo interaction device 30; the spatial coordinate data of the visual aid device 33 is consistent with the operator's visual observation point. The visual observation point here refers to the spatial positional relationship of the human eye relative to the display screen of the stereo interaction device 30; the stereo interaction device 30 determines the visual observation point in order to display the corresponding stereoscopic image frames, so as to obtain the best stereoscopic effect and realize interaction between the operator's visual observation point and the stereo interaction device 30. The interactive control device 32 is provided with a plurality of programmable function buttons for completing the main interaction tasks between the operator and the stereo interaction device 30; because the interactive control device 32 is complicated to operate, its precise spatial attitude must be determined. The interactive control device 32 in the desktop spatial stereoscopic interaction system of this technical solution adopts a nine-axis motion sensor 322 to detect spatial attitude changes without blind spots and preprocesses the detected raw spatial attitude data to obtain Euler angle parameters and a quaternion, which are transmitted to the stereo interaction device 30; the stereo interaction device 30 then only needs to fuse the spatial coordinate data with the Euler angle parameters and the quaternion according to the spatial data fusion algorithm to obtain the precise spatial attitude and position of the interactive control device 32, which greatly reduces its processing load. The operator operates the stereoscopic objects in the virtual scene of the stereo interaction device 30 through the programmable function buttons as needed, realizing human-computer interaction.
[0031] 2. System embodiment
[0032] A desktop spatial stereoscopic interaction system is provided for realizing interaction between an operator and a stereo interaction device 30, the spatial interaction system comprising:
[0033] the stereo interaction device 30, configured to track the operator's visual observation point through the infrared coordinate component 31, acquire operation instructions from the interactive control device 32, and display virtual stereoscopic content corresponding to the visual observation point.
[0034] Please refer to FIG. 3, which is a schematic diagram of the logical composition and signal flow of the components of the interactive control device in a desktop spatial stereoscopic interaction system according to the utility model;
[0035] Further, the infrared coordinate component 31 is configured to acquire first and second spatial coordinate data and transmit them to the stereo interaction device 30, the first and second spatial coordinate data being the spatial coordinate data of the visual aid device 33 and of the interactive control device 32, respectively.
[0036] Further, the infrared coordinate component 31 includes an infrared emitting unit 311 arranged on the stereo interaction device 30, an optical capturing unit, a first optical identification point 313 arranged on the visual aid device 33, and a second optical identification point 314 arranged on the interactive control device 32. The infrared emitting unit 311 includes at least one infrared emitting device for emitting infrared light, and the optical capturing unit 312 includes at least two infrared capture cameras for acquiring target images; the infrared emitting devices and the infrared capture cameras are embedded in the stereo interaction device.
[0037] Specifically, the stereo interaction device 30 supports shutter-type stereoscopic technology. After a stereoscopic image or video is input to the stereo interaction device 30, its display, with a refresh rate of at least 120 Hz, generates left and right frames alternately in a frame-sequential format. The shutter glasses receive the synchronization signal from the stereo interaction device 30 and open or close the left and right liquid crystal lenses at the same frequency; synchronized with each refresh, the left and right eyes view the corresponding images while the frame count seen per eye remains the same as for 2D video. The operator's two eyes see rapidly switching different pictures, an illusion forms in the brain, and a stereoscopic image is perceived. The stereo interaction device 30 of this technical solution has a built-in optical processing unit and can also be used with polarized stereoscopic glasses: by changing the arrangement of the liquid crystal molecules of the liquid crystal display screen in the stereo interaction device 30, the original image is decomposed into two sets of pictures, one vertically polarized and one horizontally polarized; the left and right lenses of the stereoscopic glasses then use polarizers of different polarization directions, so that the person's left and right eyes each receive one set of pictures and the brain synthesizes the stereoscopic image.
[0038] The infrared emitting unit 311 in the infrared coordinate component 31 includes at least one infrared emitting device for emitting infrared light. The infrared emitting device emits infrared light towards the optical reflection points of the visual aid device 33 or the interactive control device 32 and is embedded in the stereo interaction device 30, so that the infrared capture cameras in the infrared coordinate component 31 can acquire images and determine the spatial coordinates. The angle and number of the infrared emitting devices therefore affect the image acquisition of the infrared capture cameras. As a preferred embodiment, the infrared emitting unit 311 preferably comprises four infrared emitting devices, installed in pairs preferably on the left and right sides of the stereo interaction device 30 to ensure that the emitted infrared light effectively covers the entire liquid crystal display screen of the stereo device; the four infrared emitting devices may also be embedded in pairs on the upper and lower sides, or on any one side, of the display screen. The four infrared emitting devices emit infrared light; after reflection by the optical identification points, the infrared capture unit 312 acquires a spatial image of the visual aid device 33 or the interactive control device 32 carrying the optical identification point, and once the stereo interaction device 30 receives the spatial image it obtains the spatial coordinates of the visual aid device 33 or the interactive control device 32 according to the spatial image coordinate algorithm. However, when the required capture distance range is undemanding, a single infrared emitting device can also track the visual aid device and the interactive control device; the position and number of infrared emitting devices are not limited here.
[0039] The optical capturing unit 312 includes at least two infrared capture cameras for acquiring target images; only then can the spatial coordinate data of the target be obtained effectively, because spatial images of the target with parallax characteristics are captured simultaneously and the spatial coordinates of the target are obtained from the positions of the infrared capture cameras and the projection principle. As a preferred embodiment, the infrared capture unit 312 preferably comprises four infrared capture cameras embedded in the stereo interaction device, with adjacent cameras compensating for each other; two cameras, or more than four, may also be used. Increasing the viewing angle of the infrared capture cameras can expand the capture distance range, but the acquired images are more distorted and the error in the acquired target spatial coordinates is larger. In this technical solution the viewing angle of the infrared capture camera is at least 70 degrees, preferably in the range of 70 to 130 degrees; within the preferred range the infrared capture camera can acquire almost distortion-free spatial images within a capture distance of 0 to 3 m and can obtain fairly accurate target spatial coordinates even when the capture distance is large.
[0040] The infrared capture cameras in this technical solution have a refresh rate greater than 60 Hz, which greatly improves the smoothness with which the cameras capture the target trajectory, improves the tracking accuracy, and makes the stereoscopic effect of the images obtained through the visual aid device 33 better.
[0041] Before installation, the infrared light emitted by the four infrared emitting devices is tested according to the size of the display screen to determine the angles between the four infrared emitting devices, the four infrared capture cameras, and the display screen. Of the four infrared emitting devices, the two adjacent devices on the same side compensate for each other, ensuring that the entire liquid crystal display screen is covered without blind spots; likewise, the two adjacent infrared capture cameras on the same side compensate for each other, ensuring that images of the visual aid device 33 or the interactive control device 32 within the infrared emission range are captured effectively.
[0042] The first optical identification point 313 is arranged on the visual aid device 33 and the second optical identification point 314 on the interactive control device 32. The first and second optical identification points may be active infrared emitting devices or passive optical reflection points. As a preferred embodiment, the first optical identification point 313 is preferably an optical reflection point and the second optical identification point 314 an infrared emitting device; the optical reflection point carries an infrared-reflective material. Placing passive optical identification points on the polarized glasses avoids the added circuit cost of infrared emitting devices on active shutter glasses.
[0043] The second optical identification point 314 is arranged on the circuit board inside the interactive control device 32; using an active infrared emitting device avoids the hand-held inconvenience and wear problems caused by passive infrared reflection points. In this technical solution the number of infrared emitting devices is preferably two, arranged at the two top positions of the internal circuit board of the interactive control device 32, so that even if one of them is blocked, the interactive control device 32 can still be tracked effectively by the stereo interaction device 30. It should be noted that the number of infrared emitting devices may be larger; this is not limited here, and the specific number should be determined according to actual needs.
[0044] The visual aid device 33 is worn on the head; the infrared reflection points reflect infrared light, the infrared capture cameras capture the images, and the coordinates of the operator's head are determined. The number of infrared reflection points is at least two, and they may be placed at any position on the visual aid device 33; preferably:
[0045] with three infrared reflection points, one is placed at the nose pad position of the visual aid device 33 and the other two, symmetric about the nose pad, are placed at the upper left corner of the left lens and the upper right corner of the right lens respectively, so that the dynamic head coordinates can be tracked completely;
[0046] with four infrared reflection points, one is placed at the nose pad position of the visual aid device 33, two of the remaining three, symmetric about the nose pad, are placed at the upper left corner of the left lens and the upper right corner of the right lens respectively, and the last one is placed at the lower left corner of the left lens or the lower right corner of the right lens;
[0047] with five infrared reflection points, one is placed at the nose pad position of the visual aid device 33 and the remaining four at the upper left corner of the left lens, the lower left corner of the left lens, the upper right corner of the right lens, and the lower right corner of the right lens; the five infrared reflection points define the frame of the polarized glasses and ensure the accuracy of the operator's head tracking.
[0048] If cost is not a concern, the number of infrared reflection points can be even larger.
[0049] 立体交互系统还包括:
[0050] 助视装置 33, 用于从立体交互设备 30中获取虚拟立体内容。
[0051] 助视装置 33可以为设有特定数量红外反射点的偏光式立体眼镜, 本技术方案中 的立体交互设备 30, 内置光学处理单元, 通过改变立体交互设备 30中液晶显示 屏幕的液晶分子的排列分解原始图像, 把原始图像分为垂直向偏振光和水平向 偏振光两组画面, 然后偏光式立体眼镜左右分别采用不同偏振方向的偏光镜片 , 这样人的左右眼就能接收两组画面, 再经过大脑合成立体影像。
[0052] 助视装置 33也可以为设置了特定数量红外反射点的主动快门式眼镜, 当立体图 像或视频输入到立体交互设备 30后, 立体交互设备 30的刷新率至少 120Hz的图像 便以帧序列的格式实现左右帧交替产生, 快门式眼镜接收立体交互设备 30的同 步信号, 以同样的频率打幵或关闭左右液晶镜片, 刷新同步实现左右眼观看对 应的图像, 并且保持与 2D视像相同的帧数, 操作人员的两只眼睛看到快速切换 的不同画面, 并且在大脑中产生错觉, 便观看到立体影像。
[0053] 请参考图 4, 图 4是本实用新型中一种桌面型空间立体交互系统中的助视装置状 态自测电路示意图。
[0054] 本技术方案中的快门式眼镜具有状态自测功能, 以便能及吋关闭电源, 减少电 耗; 具体地, 状态自测电路 331可以为加速度传感器检测电路 3311, 加速度传感 器检测电路 3311检测主动快门式眼镜状态, 加速度传感器检测电路 3311可以为 两轴或三轴, 状态发生改变吋或检测到距离参数小于一定的阈值吋, 在设定的 吋间内, 对蓝牙主控芯片的工作模式进行控制, 例如, 检测到主动快门式眼镜 的状态由静止转为运动吋, 唤醒蓝牙主控芯片的吋间为 2s, 到 2s吋, 使主动快门 式眼镜的蓝牙主控芯片进入工作状态, 用户幵始使用; 当检测到主动快门式眼 镜的状态由运动转为静止吋, 吋间设置 3s, 到 3s吋, 使主动快门式眼镜的蓝牙主 控芯片, 用户停止使用。 通过检测主动快门式眼镜的状态, 实现了对蓝牙主控 芯片工作模式的自动控制, 减少了电能的浪费, 提高了有效续航吋间及用户体 验, 提高了用户体验;
[0055] In order to control the power supply automatically, reduce wasted electrical energy and extend the effective battery life, the state self-test circuit may also be an angular velocity detection circuit 3312 for detecting changes in the motion angle of the active shutter glasses; the operating mode of the Bluetooth master control chip is controlled by detecting the angle changes of the active shutter glasses, which is not described again here.
[0056] The state self-test circuit may also be a distance sensor detection circuit 3313 for detecting the distance from the active shutter glasses to the face; when the detected distance to the face is smaller than a threshold, the Bluetooth master control chip of the active shutter glasses is switched on or off. For example, when the detected distance from the active shutter glasses to the user's face remains smaller than 20 mm for more than 2 s, the Bluetooth master control chip of the active shutter glasses enters the working state; when the detected distance from the active shutter glasses to the user's face remains greater than 40 mm for more than 3 s, the Bluetooth master control chip of the active shutter glasses enters the sleep state.
[0057] Of course, the acceleration sensor detection circuit 3311, the angular velocity detection circuit 3312 and the distance sensor detection circuit 3313 may also be provided together on the active shutter glasses and combined to control the operating mode of the Bluetooth master control chip of the active shutter glasses automatically, so as to extend the effective battery life and improve the user experience. Accordingly, any technical solution that uses the acceleration sensor detection circuit 3311, the angular velocity detection circuit 3312, the distance sensor detection circuit 3313, or a combination of the three, to detect the motion state or distance parameters of the active shutter glasses and thereby control the operating mode of their Bluetooth master control chip automatically falls within the scope of protection of the present utility model.
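(Editor's illustrative note, not part of the original disclosure.) The wake/sleep behaviour described in the preceding paragraphs can be summarised by a small state machine; the 2 s / 3 s hold times and the 20 mm / 40 mm distance thresholds come from the description, while the class name and the way the motion and distance cues are combined are assumptions, since the description leaves the combination open.

    from dataclasses import dataclass

    @dataclass
    class GlassesPowerManager:
        awake: bool = False      # True while the Bluetooth master control chip is working
        _timer_s: float = 0.0

        def update(self, dt_s: float, moving: bool, face_distance_mm: float) -> bool:
            # Wake on either cue; sleep only when both cues say the glasses are idle.
            active = moving or face_distance_mm < 20.0
            inactive = (not moving) and face_distance_mm > 40.0
            if not self.awake:
                self._timer_s = self._timer_s + dt_s if active else 0.0
                if self._timer_s >= 2.0:     # cue sustained for 2 s: wake the chip
                    self.awake, self._timer_s = True, 0.0
            else:
                self._timer_s = self._timer_s + dt_s if inactive else 0.0
                if self._timer_s >= 3.0:     # cue sustained for 3 s: sleep the chip
                    self.awake, self._timer_s = False, 0.0
            return self.awake

Calling update() once per sensor sample is enough to reproduce the hysteresis behaviour described above.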
[0058] In order to reduce the power consumption of the circuit board, as a preferred embodiment the Bluetooth master control chip of the active shutter glasses in the present technical solution is a BCM-series chip. BCM-series chips, produced by Broadcom Corporation of the United States, are Bluetooth master control chips with enhanced data transmission capability; they support Bluetooth communication technology and have low power consumption, which helps extend the effective battery life of the active shutter glasses.
[0059] Referring to FIG. 2, FIG. 2 is a schematic diagram of the logical composition of the components included in the interactive control device of a desktop spatial stereoscopic interaction system according to the present utility model.
[0060] The stereoscopic interaction system of the present technical solution further includes:
[0061] the interactive control device 32, configured to output operation instructions to the stereoscopic interaction device 30.
[0062] Further, the interactive control device 32 includes a nine-axis motion sensor 322 for acquiring raw spatial attitude data and an MCU 321 for processing the raw spatial attitude data into Euler angle parameters and quaternions, the nine-axis motion sensor 322 being connected to the MCU 321.
[0063] The MCU 321 of the interactive control device 32 offers strong processing capability in a small, low-cost package, which makes it very suitable for the interactive control device 32 of this embodiment; where the size requirement is relaxed, the processing unit may also be a DSP, an FPGA or another chip with data processing capability.
[0064] Further, the motion sensor is the nine-axis motion sensor 322. By capturing images with the infrared capture cameras, the stereoscopic interaction device 30 can only determine the spatial coordinate position of the interactive control device 32; it cannot track the complete motion attitude of the interactive control device 32 relative to its screen. The nine-axis motion sensor 322 is a combination of three sensors: the three-axis acceleration sensor unit 3221, the three-axis gyroscope unit 3222 and the three-axis geomagnetic sensor unit 3223, which cooperate with and complement one another. With the acceleration sensor unit and the gyroscope unit 3222 the complete motion state of the device can essentially be described; over long periods of motion, however, a cumulative drift builds up and the motion attitude can no longer be described accurately, for example the controlled picture tilts. The geomagnetic sensor unit 3223 measures the Earth's magnetic field and provides correction and compensation through its absolute heading function, which effectively eliminates the cumulative drift and thereby corrects the motion direction, attitude angles, motion force and speed of the interactive control device 32. Using the nine-axis motion sensor 322 improves the accuracy with which the dynamic attitude and position of the interactive control device 32 are tracked and prevents the cursor of the interactive control device 32 from "drifting" on the stereoscopic interaction device 30.
[0065] Further, the raw attitude data detected by the nine-axis motion sensor 322 comprise acceleration, angular velocity and heading in three degrees of freedom each. The nine-axis motion sensor 322 consists of the acceleration sensor unit 3221, the gyroscope unit 3222 and the geomagnetic sensor unit 3223. The absolute heading output by the nine-axis motion sensor 322 is derived from the Earth's gravitational field and the Earth's magnetic field, so the final static accuracy of the nine-axis motion sensor 322 depends on the measurement accuracy of the magnetic field and of gravity, while its dynamic performance depends on the gyroscope unit 3222. In consumer-grade nine-axis motion sensors 322 the acceleration sensor unit 3221 and the gyroscope unit 3222 are subject to considerable noise; taking a planar gyroscope as an example, integrating an ADI gyroscope unit 3222 for one minute produces a drift of about 2 degrees. Under these conditions, if the three-axis gyroscope is not corrected by the magnetic and gravitational fields, the actual attitude of the object and the measured output attitude will have diverged completely after about 3 minutes; in an architecture built on a low-cost gyroscope unit 3222 and acceleration sensor unit 3221, field vectors must therefore be used for correction. The nine-axis motion sensor 322 of the present technical solution uses the three-dimensional gyroscope unit 3222 to track the three-dimensional attitude of the interactive control device 32 rapidly; with the gyroscope unit 3222 at its core, it also measures the acceleration and the direction of the geomagnetic field to give the system a reliable reference. Specifically, it measures the absolute angular rate, acceleration and magnetic field strength of the carrier in three directions to obtain the quaternions, attitude data and so on of the interactive control device 32; a real-time integration algorithm is required to provide the system with accurate, reliable, timely and stable attitude output.
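(Editor's illustrative note, not part of the original disclosure.) The correction principle described above, fast gyroscope integration whose accumulated drift is pulled back by the gravity reference (and, in a full nine-axis filter, by the magnetic-field reference), is commonly realised as a complementary filter. The sketch below shows only the gravity term; the quaternion convention, the gain kp and the sample rate are assumptions for illustration, not the disclosed algorithm.

    import numpy as np

    def quat_mul(q, r):
        w1, x1, y1, z1 = q
        w2, x2, y2, z2 = r
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    def rotate_by(q, v):
        # Rotate vector v by the unit quaternion q = (w, x, y, z).
        q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
        return quat_mul(quat_mul(q, np.concatenate(([0.0], v))), q_conj)[1:]

    def fuse_step(q, gyro_rad_s, accel, dt_s, kp=1.0):
        q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
        g_est = rotate_by(q_conj, np.array([0.0, 0.0, 1.0]))   # expected "up" in body frame
        g_meas = accel / np.linalg.norm(accel)                 # measured "up" when roughly static
        error = np.cross(g_meas, g_est)                        # small-angle misalignment
        omega = gyro_rad_s + kp * error                        # gyro rate with drift correction
        q = q + 0.5 * dt_s * quat_mul(q, np.concatenate(([0.0], omega)))
        return q / np.linalg.norm(q)

    # A stationary pen whose gyroscope has a constant bias: with the correction
    # the estimated tilt stays bounded instead of drifting without limit.
    q = np.array([1.0, 0.0, 0.0, 0.0])
    for _ in range(600):                                       # about 10 s at 60 Hz
        q = fuse_step(q, np.array([0.01, 0.0, 0.0]), np.array([0.0, 0.0, 9.81]), 1/60)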
[0066] The refresh rate of the nine-axis motion sensor in the present technical solution is greater than 60 Hz, which keeps the spatial attitude trajectory of the interactive control device obtained by the stereoscopic interaction device 30 smooth, makes the operating cursor signal more continuous, and allows operation instructions to be responded to promptly.
[0067] Further, the housing of the interactive control device 32 has an ordinary pen-like shape, and the interactive control device 32 is provided with a plurality of function buttons for operating the virtual stereoscopic content displayed by the stereoscopic interaction device 30. The shape and size of the pen-like housing are preferably such that a user can just hold it comfortably.
[0068] The interactive control device 32 is connected to the stereoscopic interaction device 30 by a USB data cable using the HID transport protocol, and the pen-like housing is provided with a notch that accommodates the USB connector. Compared with a traditional HDMI data cable, a USB data cable is more universal, and compared with a wireless connection its data transmission is more reliable.
[0069] The interactive control device 32 is further provided with a plurality of function buttons. Before stereoscopic content is displayed they act like an ordinary mouse: the cursor is moved on the display screen of the stereoscopic interaction device 30 to select the stereoscopic content resource to be displayed, for example by clicking to enter it or to display the stereoscopic content. After the stereoscopic content has been entered, the button unit can also bring up menu shortcuts and grab and drag the stereoscopic content in any direction.
[0070] The spatial attitude data processing procedure of the present technical solution is as follows. The infrared coordinate assembly 31 acquires the spatial position images of the viewing aid 33 and of the interactive control device 32 and transmits them to the stereoscopic interaction device 30, and the stereoscopic interaction device 30 obtains the first and second spatial coordinate data according to the spatial position algorithm. The interactive control device 32 acquires raw spatial attitude data through the nine-axis sensor and transmits them to the MCU 321 for processing; according to the spatial data fusion algorithm the MCU 321 processes the raw data into Euler angle parameters and quaternions of the spatial attitude of the interactive control device 32 and transmits them to the stereoscopic interaction device 30. The stereoscopic interaction device 30 determines the spatial position and attitude of the interactive control device 32 from the second spatial coordinate data and the Euler angle parameters and quaternions. The first and second spatial coordinate data are the spatial coordinate data of the viewing aid 33 and of the interactive control device 32 respectively. The specific interaction method is as follows: the operator moves the cursor of the interactive control device 32 to a particular position in the virtual stereoscopic content displayed by the stereoscopic interaction device 30, and the stereoscopic interaction device 30 obtains the operation instruction of the interactive control device 32; the stereoscopic interaction device 30 runs a particular virtual stereoscopic display function according to the operation instruction; the stereoscopic interaction device 30 obtains the operator's visual observation point through the infrared coordinate assembly 31 and delivers the stereoscopic display content matching the visual observation point to the operator's eyes through the viewing aid 33.
[0071] The spatial data processing method of the desktop spatial stereoscopic interaction system shown in FIG. 1 is as follows.
[0072] Step 101: acquire the first and second spatial position images and obtain the first and second spatial coordinate data using the spatial position algorithm.
[0073] The first and second spatial position images are the spatial position images of the viewing aid 33 and the interactive control device 32 respectively, and the first and second spatial coordinate data are the spatial coordinate data of the viewing aid 33 and the interactive control device 32 determined by the infrared coordinate assembly 31.
[0074] The infrared coordinate assembly 31 acquires the spatial position images of the viewing aid 33 and the interactive control device 32 respectively and transmits them to the stereoscopic interaction device 30, and the stereoscopic interaction device 30 obtains the first and second spatial coordinate data according to the spatial position algorithm.
[0075] As for the infrared emission unit 311 of the infrared coordinate assembly 31, in a preferred embodiment it consists of four infrared emission devices, although two may also be used. The four infrared emission devices emit infrared light; after reflection by the optical identification points, the infrared capture unit 312 acquires spatial images of the viewing aid 33 or the interactive control device 32 provided with the optical identification points, and after obtaining the spatial images the stereoscopic interaction device 30 obtains the spatial coordinates of the viewing aid 33 or the interactive control device 32 according to the spatial image coordinate algorithm. In a preferred embodiment, the infrared capture unit 312 consists of four infrared capture cameras.
[0076] Step 102: acquire the raw spatial attitude data of the interactive control device 32 and process the raw spatial attitude data into spatial attitude Euler angle parameters and quaternions.
[0077] The interactive control device 32 detects its motion attitude through the nine-axis motion sensor 322, acquires the raw spatial attitude data and transmits them to the MCU 321 for processing; the MCU 321 processes the raw data into Euler angle parameters and quaternions of the spatial attitude of the interactive control device 32 and transmits them to the stereoscopic interaction device 30.
[0078] By capturing images with the infrared cameras, the stereoscopic interaction device 30 can only determine the spatial coordinate position of the interactive control device 32; it cannot track the complete motion attitude of the interactive control device 32 relative to its screen. The nine-axis motion sensor 322 is a combination of three sensors: the three-axis acceleration sensor unit 3221, the three-axis gyroscope unit 3222 and the three-axis geomagnetic sensor unit 3223, which cooperate with and complement one another. With the acceleration sensor unit and the gyroscope unit 3222 the complete motion state of the device can essentially be described; over long periods of motion, however, a cumulative drift builds up and the motion attitude can no longer be described accurately, for example the controlled picture tilts. The geomagnetic sensor unit 3223 measures the Earth's magnetic field and provides correction and compensation through its absolute heading function, which effectively eliminates the cumulative drift and thereby corrects the motion direction, attitude angles, motion force and speed of the interactive control device 32. Using the nine-axis motion sensor 322 improves the accuracy with which the dynamic attitude and position of the interactive control device 32 are tracked and prevents the cursor of the interactive control device 32 from "drifting" on the stereoscopic interaction device 30.
[0079] Step 103: determine the spatial position and attitude of the interactive control device 32 from the second spatial coordinate data, the Euler angle parameters and the quaternions using the spatial data fusion algorithm.
[0080] The stereoscopic interaction device 30 determines the spatial position and attitude of the interactive control device 32 from the second spatial coordinate data and the Euler angle parameters and quaternions.
[0081] The stereoscopic interaction device 30 needs to fuse the spatial coordinate data of the interactive control device 32 with its attitude data (the Euler angle parameters and quaternions) to obtain the final attitude and generate the corresponding spatial operation cursor.
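(Editor's illustrative note, not part of the original disclosure.) One straightforward way to combine the two data sources is to place the IR-triangulated position and the reported orientation into a single rigid-body pose and derive a pointing ray for the spatial cursor from it; the assumption that the stylus points along its local +z axis, and the function names, are illustrative only.

    import numpy as np

    def quat_to_matrix(q):
        # 3x3 rotation matrix from a unit quaternion (w, x, y, z).
        w, x, y, z = q
        return np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ])

    def compose_pose(position_m, quaternion):
        # Combine triangulated position and fused orientation into a 4x4 pose,
        # then derive the cursor ray assuming the pen points along its local +z.
        T = np.eye(4)
        T[:3, :3] = quat_to_matrix(quaternion)
        T[:3, 3] = position_m
        ray_origin = T[:3, 3]
        ray_direction = T[:3, :3] @ np.array([0.0, 0.0, 1.0])
        return T, ray_origin, ray_direction

    # Pen held 0.4 m in front of the screen, rotated 90 degrees about its x axis.
    pose, origin, direction = compose_pose(
        np.array([0.10, 0.05, 0.40]),
        np.array([np.cos(np.pi/4), np.sin(np.pi/4), 0.0, 0.0]))
    print(origin, direction)                     # direction ~ [0, -1, 0]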
[0082] The interaction method of the desktop spatial stereoscopic interaction system includes:
[0083] Step 201: determine the visual observation point of the operator wearing the viewing aid 33 and obtain the function-button operation instructions of the interactive control device 32; the visual observation point here is the spatial coordinate point of the viewing aid 33 relative to the virtual stereoscopic content.
[0084] The stereoscopic quality of an image frame is the result of left and right image frames with parallax characteristics being combined in the human brain; it is in fact an illusion, so the visual observation point has a very important influence on the stereoscopic effect of the virtual stereoscopic content displayed by the stereoscopic interaction device 30. During operation the operator's visual observation point changes constantly and unconsciously, or the operator deliberately changes it for a particular observing purpose. The infrared coordinate assembly 31 tracks the spatial coordinates of the operator's head movement and thereby determines the visual observation point, so that the virtual stereoscopic content displayed by the stereoscopic interaction device 30 has a better stereoscopic effect and interaction between the operator and the stereoscopic interaction device 30 is achieved.
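(Editor's illustrative note, not part of the original disclosure.) A common way to make displayed content follow a tracked observation point is to build an asymmetric (off-axis) view frustum from the eye position relative to the screen; the screen size, near and far planes, interpupillary distance and coordinate frame below are assumptions for illustration, not parameters of the disclosed system.

    import numpy as np

    def off_axis_projection(eye_m, screen_w_m, screen_h_m, near=0.05, far=10.0):
        # Frustum for a screen centred at the origin in the z = 0 plane, facing +z,
        # viewed from the tracked eye position eye_m = (ex, ey, ez) with ez > 0.
        ex, ey, ez = eye_m
        left   = (-screen_w_m / 2 - ex) * near / ez
        right  = ( screen_w_m / 2 - ex) * near / ez
        bottom = (-screen_h_m / 2 - ey) * near / ez
        top    = ( screen_h_m / 2 - ey) * near / ez
        return np.array([
            [2*near/(right-left), 0, (right+left)/(right-left), 0],
            [0, 2*near/(top-bottom), (top+bottom)/(top-bottom), 0],
            [0, 0, -(far+near)/(far-near), -2*far*near/(far-near)],
            [0, 0, -1, 0],
        ])

    # One projection per eye, offset from the tracked head by an assumed 65 mm IPD.
    head = np.array([0.0, 0.10, 0.55])
    projections = [off_axis_projection(head + [dx, 0.0, 0.0], 0.52, 0.32)
                   for dx in (-0.0325, 0.0325)]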
[0085] Step 202: display, according to the operation instructions, the virtual stereoscopic content matching the visual observation point.
[0086] The interactive control device 32 is provided with three different types of programmable function buttons for operating the virtual stereoscopic content displayed by the stereoscopic interaction device 30. Before the virtual stereoscopic content is entered, the interactive control device 32 acts like an ordinary mouse: the cursor is moved on the display screen of the stereoscopic interaction device 30 to select the stereoscopic content resource to be displayed, for example by clicking to enter it or to display the virtual stereoscopic content. After the virtual stereoscopic content has been entered, the button unit can also bring up menu shortcuts and grab and drag the virtual stereoscopic content in any direction.
[0087] The stereoscopic interaction device 30 runs a particular virtual stereoscopic display function according to the operation instruction. Step 203: the stereoscopic interaction device 30 obtains the operator's visual observation point through the infrared coordinate assembly 31 and delivers the stereoscopic display content matching the visual observation point to the operator's eyes through the viewing aid 33.
[0088] It should be noted that when the spatial attitude of the interactive control device 32 is determined, the spatial attitude data detected by the nine-axis motion sensor correct the spatial coordinate data detected by the infrared coordinate assembly 31, which effectively improves the accuracy with which the spatial position of the interactive control device 32 is tracked.
[0089] The present technical solution aims to provide a desktop spatial stereoscopic interaction system capable of tracking the interactive control device 32 with high accuracy, thereby solving the signal drift problem of the prior art. The spatial interaction system includes the stereoscopic interaction device 30, the infrared coordinate assembly 31, the interactive control device 32 and the viewing aid 33. The infrared coordinate assembly 31 tracks the viewing aid 33 and the interactive control device 32 and transmits their spatial coordinate data to the stereoscopic interaction device 30. The spatial coordinate data of the viewing aid 33 coincide with the operator's visual observation point; the visual observation point here refers to the spatial position of the human eyes relative to the display screen of the stereoscopic interaction device 30, and the stereoscopic interaction device 30 determines the visual observation point in order to display the corresponding stereoscopic image frames, obtain the best stereoscopic effect, and achieve interaction between the operator's visual observation point and the stereoscopic interaction device 30. The interactive control device 32 is provided with a plurality of programmable function buttons, involves complex operations, and carries out the main interaction tasks between the operator and the stereoscopic interaction device 30, so its precise spatial attitude must be determined. The interactive control device 32 of the desktop spatial stereoscopic interaction system of the present technical solution uses a nine-axis sensor to detect spatial attitude changes without blind spots and preprocesses the detected raw spatial attitude data into Euler angle parameters and quaternions, which are transmitted to the stereoscopic interaction device 30; the stereoscopic interaction device 30 only needs to fuse the spatial coordinate data with the Euler angle parameters and quaternions according to the spatial data fusion algorithm to obtain the precise spatial attitude and position of the interactive control device 32, which greatly reduces its processing burden. The operator operates the stereoscopic objects in the virtual scene of the stereoscopic interaction device 30 through the function buttons as required, achieving human-computer interaction.
[0090] The above are only preferred embodiments of the present utility model and are not intended to limit it; any modification, equivalent replacement, improvement or the like made within the spirit and principles of the present utility model shall fall within the scope of protection of the present utility model.

Claims

[Claim 1] A desktop spatial stereoscopic interaction system, characterized in that it is used to enable an operator to interact with a stereoscopic interaction device, the spatial interaction system comprising: the stereoscopic interaction device, configured to track the operator's visual observation point through an infrared coordinate assembly, obtain operation instructions of an interactive control device, and display virtual stereoscopic content corresponding to the visual observation point;
the infrared coordinate assembly, configured to obtain first and second spatial coordinate data and transmit them to the stereoscopic interaction device;
a viewing aid, configured to obtain the virtual stereoscopic content from the stereoscopic interaction device; and the interactive control device, configured to output operation instructions to the stereoscopic interaction device;
wherein the visual observation point is the spatial coordinate point of the viewing aid relative to the virtual stereoscopic content; the interactive control device comprises a nine-axis motion sensor for acquiring raw spatial attitude data and an MCU for processing the raw spatial attitude data into Euler angle parameters and quaternions, the nine-axis motion sensor being connected to the MCU.
[Claim 2] The desktop spatial stereoscopic interaction system according to claim 1, characterized in that the infrared coordinate assembly comprises an infrared emission unit and an optical capture unit provided on the stereoscopic interaction device, a first optical identification point provided on the viewing aid, and a second optical identification point provided on the interactive control device.
[Claim 3] The desktop spatial stereoscopic interaction system according to claim 2, characterized in that the infrared emission unit comprises at least one infrared emission device for emitting infrared light, and the optical capture unit comprises at least two infrared capture cameras for acquiring images of a target; the infrared emission device and the infrared capture cameras are embedded in the stereoscopic interaction device.
[Claim 4] The desktop spatial stereoscopic interaction system according to claim 1, characterized in that the nine-axis motion sensor comprises an acceleration sensor unit, a gyroscope unit and a geomagnetic sensor unit.
[Claim 5] The desktop spatial stereoscopic interaction system according to claim 2, characterized in that the first and second optical identification points are active infrared emitting components or passive optical reflection points.
[Claim 6] The desktop spatial stereoscopic interaction system according to claim 2 or 5, characterized in that the first optical identification point is a passive optical reflection point, the number of the optical reflection points being at least two, and the second optical identification point is an active infrared emitting component, the infrared emitting component being provided at a top end position of the interactive control device.
[Claim 7] The desktop spatial stereoscopic interaction system according to claim 1, characterized in that the interactive control device is provided with programmable function buttons for operating the virtual stereoscopic content displayed by the stereoscopic interaction device.
[Claim 8] The desktop spatial stereoscopic interaction system according to claim 1, characterized in that the viewing aid is polarized stereoscopic glasses or shutter-type stereoscopic glasses.
[Claim 9] The desktop spatial stereoscopic interaction system according to claim 3, characterized in that the viewing angle of the lenses of the infrared capture cameras is at least 70 degrees.
[Claim 10] The desktop spatial stereoscopic interaction system according to claim 1, characterized in that the capture distance of the infrared coordinate assembly of the stereoscopic interaction device is 0 to 3 m.
PCT/CN2017/095272 2017-05-09 2017-07-31 Desktop spatial stereoscopic interaction system WO2018205426A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/610,908 US20200159339A1 (en) 2017-05-09 2017-07-31 Desktop spatial stereoscopic interaction system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2017205119779 2017-05-09
CN201720511977.9U CN206741431U (zh) 2017-05-09 Desktop spatial stereoscopic interaction system

Publications (1)

Publication Number Publication Date
WO2018205426A1 true WO2018205426A1 (zh) 2018-11-15

Family

ID=60564545

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/095272 WO2018205426A1 (zh) 2017-05-09 2017-07-31 桌面型空间立体交互系统

Country Status (3)

Country Link
US (1) US20200159339A1 (zh)
CN (1) CN206741431U (zh)
WO (1) WO2018205426A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206741431U (zh) * 2017-05-09 2017-12-12 深圳未来立体教育科技有限公司 Desktop spatial stereoscopic interaction system
US11769396B2 (en) * 2021-02-05 2023-09-26 Honeywell International Inc. Initiating and monitoring self-test for an alarm system using a mobile device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104811641A (zh) * 2015-04-24 2015-07-29 段然 Head-mounted video recording system with a gimbal and control method therefor
CN106200985A (zh) * 2016-08-10 2016-12-07 北京天远景润科技有限公司 Desktop personal immersive virtual reality interaction device
CN206741431U (zh) * 2017-05-09 2017-12-12 深圳未来立体教育科技有限公司 Desktop spatial stereoscopic interaction system

Also Published As

Publication number Publication date
CN206741431U (zh) 2017-12-12
US20200159339A1 (en) 2020-05-21

Similar Documents

Publication Publication Date Title
EP3469458B1 (en) Six dof mixed reality input by fusing inertial handheld controller with hand tracking
CN109313500B (zh) 纤细形状因子的无源光学和惯性跟踪
US10019831B2 (en) Integrating real world conditions into virtual imagery
US9904056B2 (en) Display
TWI722280B (zh) 用於多個自由度之控制器追蹤
US11287905B2 (en) Trackability enhancement of a passive stylus
EP2979127B1 (en) Display method and system
US9380295B2 (en) Non-linear navigation of a three dimensional stereoscopic display
WO2016203792A1 (ja) 情報処理装置、情報処理方法及びプログラム
JP2013258614A (ja) 画像生成装置および画像生成方法
KR20150093831A (ko) 혼합 현실 환경에 대한 직접 상호작용 시스템
US11284061B2 (en) User input device camera
CN103517061B (zh) 一种终端设备的显示控制方法及装置
US11151804B2 (en) Information processing device, information processing method, and program
CN112655202A (zh) 用于头戴式显示器的鱼眼镜头的减小带宽立体失真校正
JP2021060627A (ja) 情報処理装置、情報処理方法、およびプログラム
WO2018146922A1 (ja) 情報処理装置、情報処理方法、及びプログラム
WO2018205426A1 (zh) 桌面型空间立体交互系统
US20210084197A1 (en) Data processing
JP6467039B2 (ja) 情報処理装置
WO2023278132A1 (en) Augmented reality eyewear with x-ray effect
CN106970713A (zh) Desktop spatial stereoscopic interaction system and method
CN116866541A (zh) Virtual-real combined real-time video interaction system and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17909122

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: 1205A 24.04.2020

122 Ep: pct application non-entry in european phase

Ref document number: 17909122

Country of ref document: EP

Kind code of ref document: A1