CN101995943B - three-dimensional image interactive system - Google Patents

Three-dimensional image interactive system

Info

Publication number
CN101995943B
CN101995943B (application CN200910167142A)
Authority
CN
China
Prior art keywords
stereoscopic image
processing unit
interaction system
central processing unit
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 200910167142
Other languages
Chinese (zh)
Other versions
CN101995943A (en)
Inventor
叶裕洲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
E Touch Corp
Original Assignee
E Touch Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by E Touch Corp
Priority to CN200910167142A
Publication of CN101995943A
Application granted
Publication of CN101995943B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a three-dimensional image interactive system comprising a three-dimensional image reading module, a three-dimensional image processing unit, a system host and a three-dimensional image display module. While the display module is showing a three-dimensional image, the reading module captures an action image of an operating body; the processing unit extracts an action feature from that image and transmits it to the central processing unit, which calculates the immediate action of the three-dimensional image under the feature, so that the image shown by the display module changes correspondingly. A virtual three-dimensional image is thus displayed in real space, and the operating body can interact with it directly and in real time. In addition, the displayed three-dimensional image may come from a first three-dimensional image captured by the reading module or from a second three-dimensional image prestored in a storage unit of the system host.

Description

Stereoscopic image interaction system
Technical field
The present invention relates to a stereoscopic image interaction system, and more particularly to a stereoscopic image interaction system that lets a user manipulate a stereoscopic image in real time.
Background art
Known three-dimensional display techniques include the polarization division method, the time division method, the wavelength division method and the spatial division method (these display methods are disclosed art and not the subject of this application, so they are not detailed here). The four methods above are collectively called glasses-type 3D systems: the stereoscopic image they show must be viewed through matching stereo glasses, which makes them difficult for the general public to accept. Glasses-free viewing with an unrestricted viewing space is therefore the current direction of stereoscopic display development. Existing glasses-free 3D display methods are realized by means of a lenticular sheet, the parallax barrier method, the binocular parallax method or the slit source method.
For real-time stereoscopic display, U.S. Patent No. 6,404,913, "Image synthesizing apparatus and method, position detecting apparatus and method, and supply medium", proposes a method of producing a continuous stereoscopic image: several cameras capture the surface of an object, the result is played back immediately on a display such as a liquid crystal panel, and coordinate prediction and coordinate calculation make the displayed stereoscopic image more lifelike. Although that patent can display a stereoscopic image, the user cannot edit the image directly, for example enlarge, shrink or rotate it, and still has to rely on devices such as a keyboard or mouse to edit the stereoscopic image.
Besides displaying a stereoscopic image in real space as above, the technique of showing three-dimensional objects inside a flat image has existed for many years; all kinds of games and film animation, for example, show three-dimensional objects on a plane. When the three-dimensional object is inanimate its motion is fairly easy to design, but when it represents a living thing the design of its motion features becomes quite complex. Taking the human body as an example, limb motion can be simulated by program calculation, but the result looks rather stiff and unnatural.
Vendors therefore developed motion capture (Mocap) technology. Common motion capture can be classified by principle into mechanical, acoustic, electromagnetic and optical types. From a technical point of view, the essence of motion capture is to measure, track and record the trajectory of an object in three-dimensional space; only the most widely used optical motion capture is outlined here.
Optical motion capture works by monitoring and tracking specific light points on the target. Most current optical motion capture is based on the principles of computer vision: in theory, as long as a point in space can be seen by two cameras at the same time, its position in space at that instant can be determined from the two images captured at the same moment and the camera parameters. When the cameras shoot continuously at a high rate, the trajectory of the point can be obtained from the image sequence.
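As a rough illustration of the two-camera principle just described (a minimal sketch, not part of the patent), the following code triangulates a single 3D point from its pixel coordinates in two calibrated cameras using the standard linear (DLT) method; the projection matrices are assumed to be known from camera calibration.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Recover a 3D point from its projections in two calibrated cameras.

    P1, P2 : 3x4 camera projection matrices (from calibration).
    uv1, uv2 : (u, v) pixel coordinates of the same point, captured
               by both cameras at the same instant.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]          # back to inhomogeneous coordinates

# Tracking the same marker over a sequence of synchronized frames then yields
# its motion trajectory:
# trajectory = [triangulate_point(P1, P2, uv1, uv2) for uv1, uv2 in frame_pairs]
```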
The advantages of optical motion capture are that the performer has a large range of activity, is not restricted by cables or mechanical devices, can perform freely, and the system is easy to use. Its sampling rate is also high, which satisfies the needs of measuring most high-speed motions.
A patent of this type is Taiwan Patent Publication No. I274295, "Method and device for real-time motion capture", which provides a real-time motion capture method for controlling a video game character. It starts by defining a model of a control object, then determines the positions of markers on the model, captures the motion of the control object, and interprets that motion to change the state of the model; the change in the model state in turn controls the action of the character shown on the screen. Although this patent can capture the user's motion in real time and control the image shown on the screen, it is limited to flat images and neither displays a three-dimensional image nor allows interaction with one.
U.S. Patent No. 7,340,077, "Gesture recognition system using depth perceptive sensors", is a gesture recognition system that uses depth-perceiving sensors: several sensors recognize gesture motions and the image shown on the display produces a corresponding action according to the gesture. Although this patent can detect the user's motion, it is limited to hand motion and cannot sense the motion of all the limbs of the human body.
U.S. Patent No. 6,788,809, "System and method for gesture recognition in three dimensions using stereo imaging and color vision", provides a gesture recognition method that builds image files of the hand and fingers in advance and creates a database of joint nodes from the positions of human joints, so that gestures can be recognized more accurately when they occur. Again, although this patent can detect the user's motion, it is limited to hand motion and cannot sense the motion of all the limbs of the human body.
U.S. Patent No. 7,274,803, "Method and system for detecting conscious hand movement patterns and computer-generated visual feedback for facilitating human-computer interaction", is a system and detection method for analyzing a person's movement patterns in front of a computer: several cameras acquire the user's motion data and operating state, which is played back immediately on a display. This patent likewise cannot interact directly with the user; it only captures the user's motion in real time and cannot interact with applications such as games.
U.S. Patent Publication No. 2006/0288313, "Bounding box gesture recognition on a touch detecting interactive display", uses several sensors with adjustable sensing regions to provide a rectangular three-dimensional sensing zone. The user's gestures are all completed inside this zone; because all coordinates of the zone are defined in advance, gestures are easier to recognize.
U.S. Patent Publication No. 2008/0013793, "Gesture recognition simulation system and method", adds simulation control to gesture recognition: before recognition, more accurate gesture images are built and gesture motions are simulated in advance, so that recognition becomes accurate. The main purpose of both of the above patents, however, is merely to make gesture recognition more accurate; they are incremental improvements on plain gesture recognition and offer no interaction function or application.
U.S. Patent No. 7,139,685, "Video-supported planning of equipment installation and/or room design", describes an indoor planning and design system comprising a virtual space, which includes a virtual physical space; a virtual object database whose objects include virtual equipment, machines and items that can be placed in the space; and a user interface with a first and a second user interface element. The first element selects virtual objects from the database and places them in the virtual space, and the second manipulates the position and orientation of those objects within the space. Although this patent lets the user arrange virtual articles in a virtual space, every operation still relies on external input devices such as a touch panel, keyboard or mouse, and the result is still presented on a flat display; all interaction therefore takes place in the image shown on the display, and direct interaction between a stereoscopic image and the user is not possible.
U.S. Patent Publication No. 2006/0033713, "Interactive video based games using objects sensed by TV cameras", mainly applies motion capture to a television set: several stereoscopic cameras acquire the user's motion so that the user can interact with the flat image shown on the display. Related publications include U.S. Patent Publication No. 2006/0202953, "Novel man machine interfaces and applications". These patents are limited to interaction with flat images; when the displayed image is a stereoscopic image they cannot produce interaction with it.
A three-dimensional object shown on a flat display can thus be manipulated in real time by means of motion capture, but as noted above stereoscopic display is the important display technique of the future: after a stereoscopic image is projected by a holographic display device, it appears in real space. In other words, from the user's point of view the stereoscopic image floats between the display device and the naked eye, so when operating the image, for example rotating, stretching, decomposing, combining, deforming or moving it, the user subconsciously reaches out with the limbs to touch the virtual stereoscopic image, which is a very intuitive reaction.
Existing operating techniques, however, still require physically contacted input devices such as a keyboard, mouse, trackball or touch panel to operate the virtual stereoscopic image, so the stereoscopic display technique described above must still be operated through contact-type input devices.
In addition, although the aforementioned patents touch on real-time motion capture, they remain limited to three-dimensional objects shown on flat displays and cannot operate on a stereoscopic image displayed in real space. How to make a stereoscopic image respond to limb motion with corresponding changes is therefore the conception behind the present invention.
Summary of the invention
In view of the above need, the inventor, drawing on many years of experience in the field, has studied the problem intensively and finally designed a brand-new stereoscopic image interaction system.
An object of the present invention is to provide a stereoscopic image interaction system that acquires images of an object and displays them as a stereoscopic image.
Another object of the present invention is to provide a stereoscopic image interaction system in which the stereoscopic image can be operated in real time.
Another object of the present invention is to provide a stereoscopic image interaction system in which the stereoscopic image can be touched directly for interaction.
Another object of the present invention is to provide a stereoscopic image interaction system that produces a three-dimensional scenario.
Another object of the present invention is to provide a stereoscopic image interaction system that calculates the volume utilization of a solid space.
Another object of the present invention is to provide a stereoscopic image interaction system in which several operating bodies can operate the stereoscopic image in real time.
To achieve the above objects, the stereoscopic image interaction system of the present invention comprises:
a stereoscopic image reading module, provided with one or more stereoscopic image reading units, for acquiring a plurality of object images of a given object and for acquiring a motion image of an operating body;
a stereoscopic image processing unit, electrically connected with the stereoscopic image reading units, for integrating the object images into a first stereoscopic image and for extracting a motion feature from the motion image;
a system host, including a central processing unit electrically connected with the stereoscopic image processing unit and a storage unit electrically connected with the central processing unit, wherein
a second stereoscopic image is preset in the storage unit, which also stores the first stereoscopic image;
the central processing unit, electrically connected with the stereoscopic image processing unit, calculates the immediate action of the stereoscopic image under the motion feature; and
a stereoscopic image display module, electrically connected with the central processing unit, for displaying the immediate action of the stereoscopic image.
While a stereoscopic image is being displayed, the stereoscopic image reading module acquires a motion image of the operating body, the stereoscopic image processing unit extracts the motion feature from that motion image and transmits it to the central processing unit, and the central processing unit calculates the immediate action of the stereoscopic image under the motion feature; the stereoscopic image shown by the display module then changes correspondingly according to the motion feature. In this way a virtual stereoscopic image is displayed in real space, and the operating body can interact with it directly and in real time.
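The paragraph above can be summarized as a simple processing loop. The sketch below is an illustrative outline only; the module objects and method names are assumptions introduced for explanation and are not defined by the patent.

```python
def interaction_loop(reading_module, processing_unit, cpu, display_module, stereo_image):
    """One pass of the real-time interaction described above (illustrative sketch)."""
    while True:
        # 1. Capture the operating body (finger, hand, limb, game prop...).
        motion_image = reading_module.capture_motion_image()
        if motion_image is None:
            continue
        # 2. Derive a motion feature (direction, contact point, displacement...).
        motion_feature = processing_unit.extract_motion_feature(motion_image)
        # 3. Compute the immediate action of the stereoscopic image under that feature.
        stereo_image = cpu.compute_instant_action(stereo_image, motion_feature)
        # 4. Show the updated stereoscopic image in real space.
        display_module.show(stereo_image)
```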
It should be noted that the stereoscopic image reading module can acquire the motion images of several operating bodies at the same time; the stereoscopic image processing unit then produces a motion feature for each motion image, and the central processing unit calculates the immediate action of the stereoscopic image under all of those motion features.
In this specification the operating body is defined broadly: besides the stylus and finger of common understanding, it also includes the head, face and limbs of the human body, game props and the like; anything whose motion image and motion feature can be acquired by the stereoscopic image reading module belongs to the category of the operating body. Likewise, the object is not limited to any particular type: anything whose image can be read by the stereoscopic image reading module and synthesized into a stereoscopic image, for example the human body, game props or everyday utensils, belongs to the category of the object mentioned in this specification.
Description of drawings
Fig. 1 is a block diagram of a preferred embodiment of the present invention;
Fig. 2 is a first flowchart of the preferred embodiment;
Fig. 3 is a second flowchart of the preferred embodiment;
Fig. 4 is a third flowchart of the preferred embodiment;
Fig. 5 is a fourth flowchart of the preferred embodiment;
Fig. 6 is a fifth flowchart of the preferred embodiment;
Fig. 7 is a sixth flowchart of the preferred embodiment;
Fig. 8 is a first schematic diagram of the preferred embodiment;
Fig. 9 is a second schematic diagram of the preferred embodiment;
Fig. 10 is a third schematic diagram of the preferred embodiment;
Fig. 11 is a fourth schematic diagram of the preferred embodiment;
Fig. 12 is a fifth schematic diagram of the preferred embodiment;
Fig. 13 is a sixth schematic diagram of the preferred embodiment;
Fig. 14 is a seventh flowchart of the preferred embodiment.
Description of reference numerals: 1 - stereoscopic image reading module; 11 - stereoscopic image reading unit; 2 - stereoscopic image processing unit; 3 - system host; 31 - central processing unit; 32 - storage unit; 33 - application program; 4 - stereoscopic image display module; 41 - display unit; 42 - stereoscopic imaging unit; 5 - stereoscopic image; 51 - third stereoscopic image; 52 - fourth stereoscopic image; 6 - solid space; 7 - operating body.
Embodiment
Referring to Fig. 1, a block diagram of a preferred embodiment of the present invention, the stereoscopic image interaction system of the present invention comprises a stereoscopic image reading module 1, a stereoscopic image processing unit 2, a system host 3 and a stereoscopic image display module 4, wherein:
The stereoscopic image reading module 1 is provided with one or more stereoscopic image reading units 11 for acquiring a plurality of object images of a given object and for acquiring a motion image of an operating body. Each stereoscopic image reading unit 11 is a photosensitive element built from either a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) sensor; when the reading unit 11 is a CCD, it may be selected from the group consisting of a linear CCD, an interline-transfer CCD, a full-frame CCD and a frame-transfer CCD.
The stereoscopic image processing unit 2 is electrically connected with the stereoscopic image reading units 11; it integrates the object images into a first stereoscopic image and extracts a motion feature from the motion image.
The system host 3 includes a central processing unit 31 electrically connected with the stereoscopic image processing unit 2, and a storage unit 32 electrically connected with the central processing unit 31, wherein:
A second stereoscopic image is preset in the storage unit 32, which also stores the first stereoscopic image. One or more application programs 33 are preset in the storage unit 32; an application program 33 may be CAD stereoscopic drawing software, video editing software or the like, and the storage unit 32 may be a hard disk, an optical disc, a memory card, a memory or the like.
The central processing unit 31 is electrically connected with the stereoscopic image processing unit 2 and calculates the immediate action of the stereoscopic image under the motion feature. For example, if the stereoscopic image shows a triangular cone with its tip facing the user's line of sight and the motion feature is a top-to-bottom movement, the cone rotates accordingly until its flat base faces the user's eyes. This passage only illustrates the interactive relationship between the stereoscopic image and the motion feature; other responses, such as rotation at any angle, enlargement, horizontal or vertical movement, or deformations of the stereoscopic image produced by the motion feature, such as stretching, denting or twisting, also fall within the scope protected by the present invention. When the stereoscopic image processing unit 2 calculates the immediate action of the stereoscopic image it can do so by computing coordinates, using any common coordinate system, for example relative coordinates, polar coordinates, spherical coordinates or another system suited to spatial coordinate calculation.
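As a concrete illustration of the coordinate-based calculation mentioned above (a minimal sketch under assumed feature names, not the patent's own implementation), the code below rotates the vertices of a displayed model about the X axis when a top-to-bottom motion feature is detected, so that a cone whose tip faces the viewer ends up showing its flat base.

```python
import numpy as np

def apply_motion_feature(vertices, feature):
    """Rotate model vertices (N x 3 array) according to a simple motion feature.

    feature: a dict such as {"direction": "top_to_bottom", "magnitude": np.pi / 2};
    these keys and labels are illustrative assumptions, not defined by the patent.
    """
    angle = feature.get("magnitude", 0.0)
    if feature.get("direction") == "top_to_bottom":
        # A downward swipe tips the model forward: rotation about the X axis.
        c, s = np.cos(angle), np.sin(angle)
        rx = np.array([[1, 0, 0],
                       [0, c, -s],
                       [0, s,  c]])
        return vertices @ rx.T
    return vertices
```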
When there are several operating bodies, the stereoscopic image reading module 1 acquires their motion images simultaneously and transmits them to the stereoscopic image processing unit 2, which produces a motion feature for each motion image, so that the central processing unit 31 calculates the immediate action of the stereoscopic image from all of those motion features.
Furthermore, the system host 3 may be a personal computer, a notebook computer, a game console or the like; its form is not limited. In actual use, the stereoscopic image processing unit 2 may take the form of an integrated circuit electrically connected with the central processing unit 31, firmware burned into the central processing unit 31, software read and executed by the central processing unit 31, or an electronic circuit composed of active and passive components.
The stereoscopic image display module 4 is provided with a display unit 41 and a stereoscopic imaging unit 42 electrically connected with the display unit 41, and the central processing unit 31 is electrically connected to the display unit 41.
The display unit 41 displays the immediate action of the stereoscopic image. A cathode ray tube (CRT) display, a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, a vacuum fluorescent display (VFD), a plasma display panel (PDP), a surface-conduction electron-emitter display (SED), a field emission display (FED) or electronic paper (e-paper), among others, all fall within the definition of the display unit 41; its type is not limited. When the display unit 41 is an LCD, it may be one of the group consisting of twisted nematic (TN), vertical alignment (VA), multi-domain vertical alignment (MVA), patterned vertical alignment (PVA), in-plane switching (IPS), continuous pinwheel alignment (CPA) and optically compensated bend (OCB) LCDs; when the display unit 41 is an OLED display, it may be one of the group consisting of active-matrix OLED (AMOLED) and passive-matrix OLED (PMOLED) displays.
After the display unit 41 produces the stereoscopic image, the stereoscopic imaging unit 42 converts it into a multi-view image: according to the way human eyes receive images, the image is divided into views received separately by the left eye and the right eye, and after reception the parallax of the naked eyes reassembles it into the stereoscopic image. The stereoscopic imaging unit 42 has a light grating structure, a lenticular sheet or the like, which splits the stereoscopic image produced by the display unit 41 into the multi-view image.
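The splitting of one rendered image into separate left-eye and right-eye views can be pictured with the column-interleaving sketch below, which is one common way a lenticular sheet or parallax barrier presents two views to the naked eye; it is an illustrative assumption, not a description of the unit's actual structure.

```python
import numpy as np

def interleave_views(left, right):
    """Interleave left- and right-eye images column by column.

    left, right: H x W x 3 arrays rendered for the two eyes.
    Even pixel columns carry the left view, odd columns the right view;
    the lenticular sheet (or light grating) then steers each set of
    columns toward the corresponding eye.
    """
    assert left.shape == right.shape
    out = left.copy()
    out[:, 1::2] = right[:, 1::2]
    return out
```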
Referring to Figs. 1 and 2, the block diagram and the first flowchart of the preferred embodiment, the stereoscopic image interaction system obtains the first stereoscopic image by the following steps:
Step 100: the stereoscopic image reading module scans the appearance of an object and obtains a plurality of object images;
In this step the stereoscopic image reading module 1 photographs the external appearance of the object; during shooting the object can be rotated horizontally, vertically or at multiple angles, so that the reading module 1 obtains images of the object from different angles and produces a plurality of object images.
Step 101: the stereoscopic image processing unit checks the completeness of the object images;
Step 102: the stereoscopic image processing unit synthesizes the object images into the first stereoscopic image;
In these steps the stereoscopic image processing unit 2 checks the object images and synthesizes them into the first stereoscopic image by techniques such as depth-of-field calculation and image stitching.
Moreover, for later convenience the stereoscopic image processing unit 2 also calculates the volume of the first stereoscopic image: after the reading module has obtained the object images, the volume is calculated by mapping the actual proportions of the object onto the proportions of the first stereoscopic image.
Step 103: the first stereoscopic image is transmitted through the central processing unit to the stereoscopic image display module;
Step 104: the stereoscopic image display module displays the first stereoscopic image, which can then be operated interactively.
In these steps, after receiving the first stereoscopic image the central processing unit 31 transmits it to the display unit 41; the stereoscopic imaging unit 42 converts it into a multi-view image which, according to the way human eyes receive images, is divided into views received separately by the left eye and the right eye, and after reception the naked-eye parallax reassembles it into the first stereoscopic image.
At this point the first stereoscopic image, together with data such as its volume, can also be stored in the storage unit 32 for later playback, in addition to being available for operations such as enlarging and shrinking by the user.
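The five steps of Fig. 2 can be read as the small pipeline below. This is an illustrative sketch only: the helper objects and method names are assumptions, and the volume bookkeeping simply maps the model's volume back to real-world scale as described for step 102.

```python
def acquire_first_stereo_image(reading_module, processing_unit, cpu, display_module, storage):
    """Steps 100-104 of Fig. 2 as one linear pipeline (illustrative sketch)."""
    object_images = reading_module.scan_object()            # step 100: multi-angle views
    processing_unit.check_integrity(object_images)          # step 101: enough coverage?
    stereo = processing_unit.synthesize(object_images)      # step 102: depth + stitching
    # Volume bookkeeping: scale = model size / real size, so real volume
    # is the model volume divided by scale cubed.
    scale = processing_unit.real_to_model_scale()
    stereo.volume = processing_unit.model_volume(stereo) / scale ** 3
    storage.save(stereo)                                     # keep image + volume for later
    cpu.send_to_display(stereo, display_module)              # steps 103-104
    return stereo
```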
Referring to Figs. 1 and 3, the block diagram and the second flowchart of the preferred embodiment, the stereoscopic image interaction system obtains the second stereoscopic image by the following steps:
Step 200: the stereoscopic image reading module scans the appearance of an object and obtains a plurality of object images;
Step 201: the stereoscopic image processing unit searches the storage unit and finds the second stereoscopic image that matches the object images;
Step 202: the second stereoscopic image is transmitted through the central processing unit to the stereoscopic image display module;
Step 203: the stereoscopic image display module displays the second stereoscopic image, which can then be operated interactively.
In these steps, after the stereoscopic image reading module 1 has scanned the image of the object, the stereoscopic image processing unit 2 compares characteristics of the object image, for example its outline and color, directly against the storage unit 32. When the object's image is already stored in the storage unit 32 and the comparison is confirmed, the central processing unit 31 sends the second stereoscopic image (i.e. the stereoscopic image of the object that was stored in the storage unit 32 in advance, here called the second stereoscopic image) to the stereoscopic image display module 4, where the display unit 41 and the stereoscopic imaging unit 42 show it as a second stereoscopic image viewable with the naked eye.
The flows of Figs. 2 and 3 above mainly explain how a stereoscopic image is obtained and displayed; the interaction between the stereoscopic image (i.e. the first or second stereoscopic image) and the operating body is explained in the flow of Fig. 4.
Referring to Figs. 1 and 4, the block diagram and the third flowchart of the preferred embodiment, the stereoscopic image interaction system carries out stereoscopic image interaction by the following steps:
Step 300: the stereoscopic image display module displays a third stereoscopic image;
The display steps were explained in the descriptions of Figs. 2 and 3 and are not repeated here; the third stereoscopic image in fact also covers the first or second stereoscopic image described above.
Step 301: the operating body performs an action on the stereoscopic image;
In this step the operating body includes, besides the stylus and finger of common understanding, the head, face and limbs of the human body, game props and the like.
Step 302: the stereoscopic image reading module acquires a motion image of the operating body;
Step 303: the stereoscopic image processing unit obtains a motion feature from the motion image;
In these steps the stereoscopic image reading module 1 acquires the motion image while the operating body is moving; since the operating body moves continuously, the stereoscopic image processing unit 2 can calculate the operating body's motion feature from the motion image. For example, if the motion image shows the operating body moving from bottom to top, the processing unit calculates this motion feature and informs the central processing unit 31 that the current motion feature of the operating body is an upward movement; other movements, such as moving left or right or rotating, likewise produce corresponding motion features.
Step 304: the central processing unit makes the third stereoscopic image change correspondingly according to the motion feature.
In this step, after the central processing unit 31 receives the motion feature it matches the feature to a predetermined action so that the third stereoscopic image changes correspondingly, and the third stereoscopic image shown by the stereoscopic image display module 4 moves along with it. For example, a top-to-bottom motion feature makes the third stereoscopic image rotate up and down, and a motion feature in which the distance between two contact points gradually grows makes the third stereoscopic image enlarge, and so on.
It should be noted that in this embodiment the action corresponding to a motion feature, i.e. the action the central processing unit 31 applies to the third stereoscopic image according to the motion feature, depends on the characteristics of the application program 33; in other words, the same motion feature and the same third stereoscopic image produce different action changes under different application programs 33.
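One way the motion-feature extraction of step 303 can be pictured is the sketch below: the operating body's position is tracked over successive motion images and the dominant displacement direction is reported. The direction labels and axis convention (Y up, Z toward the display) are illustrative assumptions, not definitions from the patent.

```python
import numpy as np

def motion_feature_from_track(positions):
    """Derive a coarse motion feature from successive 3D positions of the operating body.

    positions: list of (x, y, z) samples, one per captured motion image.
    Returns a direction label plus the total displacement vector.
    """
    track = np.asarray(positions, dtype=float)
    displacement = track[-1] - track[0]
    axis = int(np.argmax(np.abs(displacement)))        # dominant axis of motion
    names = {0: ("right_to_left", "left_to_right"),
             1: ("top_to_bottom", "bottom_to_top"),
             2: ("away_from_display", "toward_display")}
    label = names[axis][int(displacement[axis] > 0)]
    return label, displacement
```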
The differences are described below with several examples:
1. Referring also to Fig. 8, the first schematic diagram of the preferred embodiment: when the application program 33 is stereoscopic image browsing software, the stereoscopic image display module 4 displays the third stereoscopic image 51, shown in the figure as a single-lens reflex camera by way of example, and the operating body 7 is illustrated as a human hand.
When the operating body 7 moves in a single direction, for example from top to bottom, the application program 33 receives a top-to-bottom motion feature and makes the third stereoscopic image 51 rotate;
When the operating body 7 makes a two-point separating action, for example the index finger and thumb moving apart after touching, the application program 33 receives a two-point separation motion feature and makes the third stereoscopic image 51 enlarge;
When the operating body 7 makes a single-point contact followed by a displacement, the stereoscopic image reading module 1 first calculates the contact point between the operating body and the third stereoscopic image 51 and the subsequent displacement, and the application program 33 then makes the third stereoscopic image 51 come apart. In the figure, the operating body 7 touches the lens of the third stereoscopic image 51 and then moves, so the application program 33 makes the third stereoscopic image 51 separate into parts.
2. Referring also to Fig. 9, the second schematic diagram of the preferred embodiment: when the application program 33 is three-dimensional scenario simulation software, the fourth stereoscopic image 52 (note that the third stereoscopic image 51 and the fourth stereoscopic image 52 differ only in the image being displayed; they are distinguished merely for ease of explanation) can simulate a situation from real life. In the figure the driver's seat of a car interior is taken as the example: the user touches the stereoscopic image of the steering wheel with a hand or palm and operates it according to the displayed state. The stereoscopic image reading module 1 acquires the hand's actions, for example turning left, turning right or shifting gears, calculates the motion features of these actions and sends them to the central processing unit 31, and the fourth stereoscopic image 52 then changes correspondingly according to the user's actions: for example, if the user's hand turns the wheel to the right, the displayed steering wheel rotates to the right, achieving interaction between the user and the fourth stereoscopic image 52.
3. Referring also to Figs. 10 and 11, the third and fourth schematic diagrams of the preferred embodiment: when the application program 33 is 3D stereoscopic image processing software such as CAD, the action changes corresponding to the motion features mostly belong to the CAD application domain. For example, when the motion feature of the operating body 7 approaches the third stereoscopic image, the contacted surface of the third stereoscopic image 51 gradually dents, corresponding to CAD design instructions such as shelling, penetration and indentation.
4. When the application program 33 is an electronic game, the operating body can be any limb of the human body or a prop corresponding to the game. For example, if the application program 33 is a fighting game and the user throws a punch at the third stereoscopic image (a boxer, a monster or the like), the stereoscopic image reading module 1 acquires the user's motion image, the stereoscopic image processing unit 2 calculates the motion feature, which can be regarded as a circular motion produced by the operating body, and the third stereoscopic image then performs the reaction of being hit (leaning back, falling back, limb rotation and so on) according to that motion feature.
5. Referring also to Fig. 11, to allow several people to operate at the same time, the stereoscopic image reading module 1 can scan or photograph several operating bodies 7 simultaneously, acquire the motion image of each operating body 7 and transmit those motion images to the stereoscopic image processing unit 2, which produces a motion feature for each motion image so that the central processing unit 31 calculates the immediate action of the third stereoscopic image 51 from all of those motion features. Applied to games or to application programs 33 such as CAD, this achieves simultaneous operation by several people.
It should be added that when the third stereoscopic image 51 or the fourth stereoscopic image 52 is displayed, it is presented in real space by the stereoscopic image display module 4; the user therefore sees the stereoscopic image of an object in front of him, but when touching it with an operating body 7 such as a finger there is no sensation of physical contact, although with the settings of the application program 33 the image produces a corresponding change of form. The application program 33 is therefore not limited to any particular type: CAD software, image browsing software, spatial design software, electronic games of every kind and the like, as long as they can interact with the stereoscopic image, all belong to the category defined by the present invention.
Referring to Figs. 1 and 5, the block diagram and the fourth flowchart of the preferred embodiment, the stereoscopic image interaction system carries out three-dimensional scenario interaction by the following steps:
Step 400: the stereoscopic image display module displays a fourth stereoscopic image;
The display steps were explained in the descriptions of Figs. 2 and 3 and are not repeated here. In this embodiment the fourth stereoscopic image can be set to a virtual space of any type according to the actual use: for example, in Fig. 9, the second schematic diagram of the preferred embodiment, the fourth stereoscopic image 52 shows the driver's seat of a car interior, and in Fig. 13, the sixth schematic diagram, the fourth stereoscopic image 52 shows the cockpit of an aircraft; the scenario simulation is not limited to vehicle interiors.
Step 401: the operating body performs an action in the stereoscopic image;
In this step the user can operate directly inside the fourth stereoscopic image 52: in the example of Fig. 9, which shows the driver's seat of a car interior, the user sees a virtual driver's seat with the naked eye and touches the steering wheel shown by the fourth stereoscopic image 52 with the hand.
Step 402: the stereoscopic image reading module acquires a motion image of the user;
Step 403: the stereoscopic image processing unit obtains a motion feature from the motion image;
Step 404: the stereoscopic image reading module calculates the user's operating position on the fourth stereoscopic image;
These steps are still explained with Fig. 9. When the user acts on the fourth stereoscopic image 52, the stereoscopic image reading module 1 acquires the motion image of the user's hand, the stereoscopic image processing unit 2 calculates the hand's action and produces the corresponding motion features, and after these motion features are sent to the central processing unit 31, the settings of the application program 33 make the fourth stereoscopic image 52 change correspondingly: for example, if the user grips and turns the steering wheel, the steering wheel shown by the fourth stereoscopic image 52 rotates along with it, and if the user grips the gear lever and shifts, the displayed gear lever moves along with it.
Step 405: the central processing unit produces a corresponding change according to the motion feature and the scenario state corresponding to the fourth stereoscopic image.
In this step, not only does the part of the fourth stereoscopic image 52 touched by the user change, the entire simulated scenario of the fourth stereoscopic image changes as well. For example, if the fourth stereoscopic image 52 is a moving vehicle, then when the user turns the steering wheel, besides the image of the wheel rotating, the scenery shown by the fourth stereoscopic image 52 changes along with the turning of the wheel, achieving the character of virtual driving.
In addition, as mentioned above the application program 33 can also be interior space design software. Referring to Figs. 1 and 12, the block diagram and the fifth schematic diagram of the preferred embodiment, utensils of daily life such as household appliances are scanned by the stereoscopic image reading module 1 and stored in the storage unit 32, or are stored in the storage unit 32 in advance.
When the application program 33 is interior space design software, the stereoscopic image display module 4 displays a solid space 6 and the user can freely move the stereoscopic images 5, i.e. the stereoscopically displayed images of living utensils such as household appliances. Alternatively, when the solid space 6 is displayed as an object such as a trunk or a packing box, the application program 33 can calculate the interior space and obtain the space's utilization rate.
Referring to Figs. 1, 7 and 12, the block diagram, the sixth flowchart and the fifth schematic diagram of the preferred embodiment, the stereoscopic image interaction system carries out space calculation by the following steps:
Step 600: the stereoscopic image display module produces a solid space image and defines its volume;
In this step the image of the solid space 6 can be shown, according to the user's needs, as a trunk, the interior of a room, or the interior of a container and so on; in Fig. 12 the solid space 6 is shown as an indoor space.
Step 601: at least one stereoscopic image is placed into the solid space;
Step 602: the operating body arranges the stereoscopic images within the solid space;
The stereoscopic images here follow the step descriptions of Fig. 2 or Fig. 3 and are not repeated; the stereoscopic image 5 can therefore be the first or the second stereoscopic image. In this embodiment the stereoscopic images 5 are shown as household articles such as a desk or a bookcase, so the operating body 7 can move the stereoscopic images 5 at will.
At the same time, the size, shape or display angle of the stereoscopic images 5 can be changed according to the needs of the arrangement.
Step 603: the stereoscopic image processing unit calculates the volume of the stereoscopic images in real time according to the actions of the operating body;
In this step, because the positions and sizes of the stereoscopic images 5 change as the user changes them, the stereoscopic image reading module 1 follows the flow of Fig. 4 and updates, at any time, the positions at which the stereoscopic images 5 are displayed in the solid space 6.
Step 604: the stereoscopic image processing unit obtains the volumes of the stereoscopic images and calculates the volume utilization rate of the solid space.
In this step the central processing unit 31 obtains the volume data of the stereoscopic images 5: for example, the first stereoscopic image produces its volume data after scanning, and the second stereoscopic image has its volume data stored in advance. The central processing unit 31 can therefore calculate the spatial volume utilization rate from the volume of the solid space 6 together with the volumes of the stereoscopic images 5.
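The volume-utilization figure of step 604 reduces to a ratio of summed object volumes to the space volume. The sketch below shows that bookkeeping under the assumption that each placed stereoscopic image already carries a volume value (obtained when it was scanned, or preset in storage); it is an illustration, not the patent's implementation.

```python
def space_utilization(space_volume, placed_images):
    """Percentage of a bounded space (room, trunk, container...) occupied
    by the stereoscopic images placed inside it.

    placed_images: iterable of objects whose .volume was obtained when the
    image was scanned (first stereoscopic image) or preset (second one).
    """
    used = sum(img.volume for img in placed_images)
    return 100.0 * used / space_volume

# Example: a 2.0 m^3 trunk holding items of 0.3, 0.45 and 0.25 m^3
# gives a utilization of 50 %.
```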
Referring to Figs. 1, 6 and 10, the block diagram, the fifth flowchart and the third schematic diagram of the preferred embodiment, the stereoscopic image interaction system carries out spatial coordinate calculation by the following steps:
Step 500: the stereoscopic image display module defines a solid space and displays the stereoscopic image in that solid space;
In this step the stereoscopic image display module 4 produces, according to spatial variables such as its projection distance and area, the solid space 6 in which the stereoscopic image is shown. The solid space 6 can be a spherical region, a rectangular region, a sector region and so on, and a corresponding coordinate system can be used: a spherical region uses spherical coordinates, a rectangular region can use polar or rectangular coordinates, and a sector region can use spherical or relative coordinates, and so on.
Step 501: the stereoscopic image reading module senses that an operating body has entered the solid space;
Step 502: the stereoscopic image reading module acquires the motion image of the operating body and calculates a motion feature;
In these steps the stereoscopic image reading module 1 senses whether an operating body 7 is present in the solid space 6, records the motion image of the operating body 7, and obtains the operating body's motion feature from that motion image.
Step 503: the central processing unit obtains the coordinates of the motion feature from the coordinates predefined for the solid space;
Step 504: the central processing unit compares all coordinates contained in the stereoscopic image with the coordinates of the motion feature and produces the corresponding coordinate changes.
In these steps the central processing unit 31 compares the path, displacement direction, angle and so on of the motion feature with the coordinate system of the solid space 6 and can thereby calculate the interactive relationship between the motion feature and the stereoscopic image 5: for example, the operating body 7 penetrates the stereoscopic image 5, moves away from it, or produces a corresponding deformation around its periphery.
The stereoscopic image 5 is then made to change correspondingly according to the characteristics of the application program 33. Referring to Fig. 10, a rectangular solid space 6 with rectangular coordinates displays a stereoscopic image 5, which can be the first or second stereoscopic image described for Fig. 2 or Fig. 3.
It can also be seen that the stereoscopic image 5 shows a depression: the central processing unit 31 senses that a motion feature occurs on the stereoscopic image 5 in the Y-axis direction, and since the application program 33 maps that motion feature to a depression instruction, the XY plane of the stereoscopic image 5 is dented along the Y axis; the depressed region is shown hatched in the figure.
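The coordinate comparison of steps 503-504 can be pictured as follows: vertices of the displayed model that fall inside the region touched by the operating body are pushed along the negative Y axis by the depth of the motion feature, producing a depression like the one in Fig. 10. This is an illustrative sketch with assumed parameters, not the patent's implementation.

```python
import numpy as np

def depress_surface(vertices, contact_center, radius, depth):
    """Push model vertices near the contact point along -Y to form a depression.

    vertices: N x 3 array of model coordinates in the solid space's frame.
    contact_center: (x, y, z) where the motion feature meets the model.
    radius, depth: extent and depth of the depression.
    """
    v = vertices.copy()
    # Distance in the XZ plane from the contact point selects the affected area.
    d = np.linalg.norm(v[:, [0, 2]] - np.asarray(contact_center)[[0, 2]], axis=1)
    inside = d < radius
    # Smooth falloff: deepest at the centre, zero at the rim.
    v[inside, 1] -= depth * (1.0 - d[inside] / radius)
    return v
```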
Referring to Figs. 1 and 14, the block diagram and the seventh flowchart of the preferred embodiment, this flowchart mainly integrates the foregoing flows to illustrate the application of the present invention to space calculation or volume calculation.
Step 700: the system starts;
After the stereoscopic image interaction system of the present invention starts, the subsequent flow may first perform step 701, or first perform steps 702 to 703.
Step 701: produce the solid space image and determine its internal volume and coordinate positions;
In step 701 the user selects from the storage unit 32 the solid space image to be shown, which is then displayed by the stereoscopic image display module 4, and at the same time defines the size and the coordinate positions of the solid space.
Step 702: obtain the second stereoscopic image from the storage unit;
Step 703: scan the appearance of an object and produce the first stereoscopic image;
For steps 702 and 703, refer to the flow descriptions of Figs. 2 and 3; note that steps 702 and 703 can be performed in either order, or only one of them may be performed.
Step 704: calculate the volumes and reference positions of the stereoscopic images (the first and second stereoscopic images);
In step 704, once the user has selected the required stereoscopic images, the central processing unit 31 defines the positions at which they are placed in the solid space according to the predefined size, position and coordinates of the solid space; this step can refer to part of the flow of Fig. 6.
Step 705: displace and rotate the stereoscopic images;
Step 706: adjust the scale of the stereoscopic images;
Step 707: deform the stereoscopic images in real time;
Step 708: transform the view of the stereoscopic images;
For steps 705 to 708 the user can select the required functional steps according to the needs of use; for the details of steps 705 to 708, refer to the description given for Fig. 8.
Step 709: suggest an optimized object volume calculation to fit the solid space layout;
In step 709 the user has completely arranged the required stereoscopic images inside the solid space, including their sizes, positions, arrangement angles, heights and so on; at this point the central processing unit 31 can calculate the spatial volume utilization rate from the volume of the solid space together with the volumes of the stereoscopic images.
Step 710: fine-tune and optimize the solid space layout;
In step 710 the central processing unit 31 has completed the volume calculation of the solid space; the user can now fine-tune the arrangement of the stereoscopic images or, if the spatial arrangement is unsatisfactory for the intended use, arrange and compute it again.
Step 711: produce space-utilization information.
In step 711 the stereoscopic image interaction system of the present invention has completed the calculation of the space utilization rate or volume and can now produce data related to the solid space, for example whether the spacing between the stereoscopic images in the arranged solid space is appropriate (applicable to interior space design) and whether the stereoscopic images completely fill the solid space (applicable to warehouses, containers, trucks, trunks and the like).
The above description is merely illustrative of the present invention and is not restrictive; those of ordinary skill in the art will understand that many modifications, changes or equivalents may be made without departing from the spirit and scope defined by the following claims, all of which fall within the scope of protection of the present invention.

Claims (28)

1. A stereopsis interaction system, being a stereopsis interaction system that displays a stereopsis and allows a touch-control body to control the stereopsis, characterized in that it comprises:
a stereopsis read module, provided with one or more stereopsis reading units, for obtaining a plurality of object images of a preset object and for obtaining a motion image of an operating body;
a stereopsis processing unit, electrically connected with the stereopsis reading units, for integrating the plurality of object images into a first stereopsis and for obtaining a motion characteristic from the motion image;
a system host, including a CPU (central processing unit) electrically connected with the stereopsis processing unit, and a storage element electrically connected with the CPU (central processing unit), wherein:
the storage element is for storing the first stereopsis;
the CPU (central processing unit) is electrically connected with the stereopsis processing unit and is for calculating the instant action of the stereopsis under the motion characteristic;
a stereopsis display module, electrically connected with the CPU (central processing unit), for displaying the instant action of the stereopsis.
2. The stereopsis interaction system according to claim 1, characterized in that, when a plurality of operating bodies are provided, the stereopsis read module obtains a plurality of motion images of the plurality of operating bodies, and the stereopsis processing unit produces a corresponding number of motion characteristics according to the plurality of motion images.
3. The stereopsis interaction system according to claim 1, characterized in that the stereopsis reading unit is one of a charge-coupled device and a photosensitive element constituted by a complementary metal-oxide semiconductor.
4. The stereopsis interaction system according to claim 3, characterized in that the stereopsis reading unit is selected from the group of charge-coupled devices consisting of a linear type, a scanning type, a full-frame type and a full-transfer type photosensitive coupling element.
5. The stereopsis interaction system according to claim 1, characterized in that the stereopsis processing unit is in the form of an integrated circuit electrically connected with the CPU (central processing unit), or in the form of firmware burned into the CPU (central processing unit), or in the form of software read and computed by the CPU (central processing unit), or in the form of an electronic circuit composed of active and passive devices for the CPU (central processing unit).
6. The stereopsis interaction system according to claim 1, characterized in that the stereopsis display module has a three-dimensional imaging unit and a display unit, wherein:
the display unit is electrically connected with the CPU (central processing unit) and is for displaying the plurality of stereopsis;
the three-dimensional imaging unit is arranged on the surface of the display unit and is for converting the plurality of stereopsis into a multiple video, the multiple video being the plurality of stereopsis as received by the naked eye.
7. The stereopsis interaction system according to claim 6, characterized in that the three-dimensional imaging unit is one of an active lens structure, a lenticular lens structure, a liquid crystal lens structure, a liquid crystal barrier structure, a grating structure and a fly's-eye lens panel.
8. The stereopsis interaction system according to claim 6, characterized in that the display unit is one of a cathode-ray tube display, a liquid crystal display, a vacuum fluorescent display, a plasma display panel, a surface-conduction electron-emitter display, a field emission display and electronic paper.
9. The stereopsis interaction system according to claim 8, characterized in that the display unit is one of the group consisting of a twisted nematic liquid crystal display, a vertical alignment liquid crystal display, a multi-domain vertical alignment liquid crystal display, a patterned vertical alignment liquid crystal display, an in-plane (transverse electric field) switching liquid crystal display, a continuous pinwheel alignment liquid crystal display and an optically compensated bend liquid crystal display.
10. The stereopsis interaction system according to claim 8, characterized in that the display unit is one of the group consisting of an active-matrix organic electroluminescent display and a passive-matrix organic electroluminescent display.
11. The stereopsis interaction system according to claim 1, characterized in that the stereopsis processing unit calculates the instant action of the stereopsis by means of coordinate calculation, using one of relative coordinates, polar coordinates, spherical coordinates and another manner of spatial coordinate calculation.
12. The stereopsis interaction system according to claim 1, characterized in that the stereopsis interaction system obtains the first stereopsis according to the following steps:
the stereopsis read module scans the object appearance and obtains a plurality of object images;
the stereopsis processing unit checks the integrity of the plurality of object images;
the plurality of object images are synthesized into the first stereopsis by the stereopsis processing unit;
the first stereopsis is transferred to the stereopsis display module through the CPU (central processing unit);
the stereopsis display module displays the first stereopsis and the stereopsis interaction operation is carried out.
13. The stereopsis interaction system according to claim 12, characterized in that the step in which the stereopsis read module scans the object appearance and obtains a plurality of object images includes vertical scanning or horizontal scanning.
14. The stereopsis interaction system according to claim 12, characterized in that the step in which the plurality of object images are synthesized into the first stereopsis by the stereopsis processing unit further comprises:
the stereopsis processing unit calculating the volume of the first stereopsis.
15. The stereopsis interaction system according to claim 14, characterized in that, when the stereopsis processing unit calculates the volume of the first stereopsis, the volume calculation is carried out according to the proportions of the plurality of object images obtained by the stereopsis read module.
16. The stereopsis interaction system according to claim 1, characterized in that a second stereopsis is preset in the storage element.
17. The stereopsis interaction system according to claim 16, characterized in that the stereopsis interaction system obtains the second stereopsis according to the following steps:
the stereopsis read module scans the object appearance and obtains a plurality of object images;
the stereopsis processing unit searches the storage element and finds a second stereopsis identical to the plurality of object images;
the second stereopsis is transferred to the stereopsis display module through the CPU (central processing unit);
the stereopsis display module displays the second stereopsis and the stereopsis interaction operation can be carried out.
18. The stereopsis interaction system according to claim 1, characterized in that the stereopsis interaction system carries out the stereopsis interaction according to the following steps:
the stereopsis display module displays a third stereopsis;
the operating body performs an action on the stereopsis;
the stereopsis read module obtains a motion image of the operating body;
the stereopsis processing unit obtains a motion characteristic according to the motion image;
the CPU (central processing unit) makes the third stereopsis produce a corresponding change according to the motion characteristic.
19. The stereopsis interaction system according to claim 18, characterized in that the third stereopsis is the first stereopsis or the second stereopsis preset in the storage element.
20. The stereopsis interaction system according to claim 1, characterized in that the stereopsis interaction system carries out three-dimensional situation interaction according to the following steps:
the stereopsis display module displays a fourth stereopsis;
the operating body performs an action on the stereopsis;
the stereopsis read module obtains a motion image of the user;
the stereopsis processing unit obtains a motion characteristic according to the motion image;
the stereopsis read module calculates the operating position of the user on the fourth stereopsis;
the CPU (central processing unit) produces a corresponding change according to the motion characteristic and the contextual status corresponding to the fourth stereopsis.
21. The stereopsis interaction system according to claim 1, characterized in that the stereopsis interaction system carries out spatial coordinate calculation according to the following steps:
the stereopsis display module defines a solid space and displays the stereopsis in the solid space;
the operating body makes contact within the solid space sensed by the stereopsis read module;
the stereopsis read module obtains the motion image of the operating body and calculates the motion characteristic;
the CPU (central processing unit) obtains the coordinates of the motion characteristic from the predefined coordinates of the solid space;
the CPU (central processing unit) compares all coordinates comprised in the stereopsis with the coordinates of the motion characteristic and produces a corresponding coordinate change.
22. The stereopsis interaction system according to claim 21, characterized in that the step in which the CPU (central processing unit) compares all coordinates comprised in the stereopsis with the coordinates of the motion characteristic and produces a corresponding coordinate change further comprises the following steps:
the coordinates of the motion characteristic pass through the coordinates comprised in the stereopsis;
the body of the stereopsis undergoes a retracting change.
23. The stereopsis interaction system according to claim 21, characterized in that the step in which the CPU (central processing unit) compares all coordinates comprised in the stereopsis with the coordinates of the motion characteristic and produces a corresponding coordinate change further comprises the following steps:
the coordinates of the motion characteristic move away from the coordinates comprised in the stereopsis;
the body of the stereopsis undergoes a stretching change.
24. The stereopsis interaction system according to claim 1, characterized in that the stereopsis interaction system carries out spatial calculation according to the following steps:
the stereopsis display module produces a solid space image and defines its volume size;
at least one stereopsis is inserted into the solid space;
the operating body configures the plurality of stereopsis within the stereopsis space;
the stereopsis processing unit obtains the volumes of the plurality of stereopsis and calculates the volume utilization rate of the solid space.
25. The stereopsis interaction system according to claim 24, characterized in that the stereopsis is the first stereopsis or a second stereopsis preset in the storage element.
26. The stereopsis interaction system according to claim 25, characterized in that, when the stereopsis is the first stereopsis, the volume of the first stereopsis is calculated by the CPU (central processing unit) when the stereopsis processing unit produces the first stereopsis.
27. The stereopsis interaction system according to claim 25, characterized in that, when the stereopsis is the second stereopsis, the CPU (central processing unit) obtains its volume in advance when the second stereopsis is stored.
28. The stereopsis interaction system according to claim 24, characterized in that, in the step in which the operating body configures the plurality of stereopsis within the stereopsis space, the stereopsis processing unit calculates the volumes of the plurality of stereopsis in real time according to the action of the operating body.
CN 200910167142 2009-08-26 2009-08-26 three-dimensional image interactive system Expired - Fee Related CN101995943B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200910167142 CN101995943B (en) 2009-08-26 2009-08-26 three-dimensional image interactive system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 200910167142 CN101995943B (en) 2009-08-26 2009-08-26 three-dimensional image interactive system

Publications (2)

Publication Number Publication Date
CN101995943A CN101995943A (en) 2011-03-30
CN101995943B true CN101995943B (en) 2011-12-14

Family

ID=43786185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200910167142 Expired - Fee Related CN101995943B (en) 2009-08-26 2009-08-26 three-dimensional image interactive system

Country Status (1)

Country Link
CN (1) CN101995943B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI496028B (en) * 2013-05-24 2015-08-11 Univ Central Taiwan Sci & Tech Cell phone with contact free controllable function
TWI563818B (en) * 2013-05-24 2016-12-21 Univ Central Taiwan Sci & Tech Three dimension contactless controllable glasses-like cell phone

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102736728A (en) * 2011-04-11 2012-10-17 宏碁股份有限公司 Control method and system for three-dimensional virtual object and processing device for three-dimensional virtual object
US8878780B2 (en) 2011-07-10 2014-11-04 Industrial Technology Research Institute Display apparatus
US20130222363A1 (en) * 2012-02-23 2013-08-29 Htc Corporation Stereoscopic imaging system and method thereof
TWI471665B (en) * 2012-04-11 2015-02-01 Au Optronics Corp 2d and 3d switchable display device
CN103677243B (en) * 2012-09-25 2017-03-01 联想(北京)有限公司 A kind of control method, device and multimedia input-output system
CN103177245B (en) * 2013-03-25 2017-02-22 深圳泰山在线科技有限公司 gesture recognition method and device
CN103714322A (en) * 2013-12-26 2014-04-09 四川虹欧显示器件有限公司 Real-time gesture recognition method and device
CN105446623A (en) * 2015-11-20 2016-03-30 广景视睿科技(深圳)有限公司 Multi-interaction projection method and system
CN107393146A (en) * 2017-03-23 2017-11-24 张建中 Commodity are shown using stereopsis and the method and vending machine of interactive operation advertising results are provided
TWI716186B (en) * 2019-11-12 2021-01-11 揚明光學股份有限公司 Projection system
CN111832987B (en) * 2020-06-23 2021-04-02 江苏臻云技术有限公司 Big data processing platform and method based on three-dimensional content

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1455339A (en) * 2002-04-30 2003-11-12 矽统科技股份有限公司 Control method and device of three-dimensional image display for non-stereo video source
CN101400002A (en) * 2007-09-24 2009-04-01 鸿富锦精密工业(深圳)有限公司 Stereo video apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP特開2008-210359A 2008.09.11

Also Published As

Publication number Publication date
CN101995943A (en) 2011-03-30

Similar Documents

Publication Publication Date Title
CN101995943B (en) three-dimensional image interactive system
KR101074940B1 (en) Image system
JP6893868B2 (en) Force sensation effect generation for space-dependent content
CN103180893B (en) For providing the method and system of three-dimensional user interface
RU2524834C2 (en) Autostereoscopic rendering and display apparatus
CN103858074B (en) The system and method interacted with device via 3D display device
CN104471511B (en) Identify device, user interface and the method for pointing gesture
US10739936B2 (en) Zero parallax drawing within a three dimensional display
US20100128112A1 (en) Immersive display system for interacting with three-dimensional content
CN103069821B (en) Image display device, method for displaying image and image correcting method
US20110164032A1 (en) Three-Dimensional User Interface
EP3106963B1 (en) Mediated reality
KR20110082636A (en) Spatially correlated rendering of three-dimensional content on display components having arbitrary positions
CN103443746A (en) Three-dimensional tracking of a user control device in a volume
WO2007019443A1 (en) Interactive video display system
CN116719176B (en) Intelligent display system of intelligent exhibition hall
US20110149042A1 (en) Method and apparatus for generating a stereoscopic image
CN104349157A (en) 3D displaying apparatus and method thereof
Wang et al. Transitioning360: Content-aware nfov virtual camera paths for 360 video playback
US9122346B2 (en) Methods for input-output calibration and image rendering
Jáuregui et al. Design and evaluation of 3D cursors and motion parallax for the exploration of desktop virtual environments
CN113168228A (en) Systems and/or methods for parallax correction in large area transparent touch interfaces
Ma et al. A real‐time interactive rendering method for 360° tabletop integral imaging 3D display
Huang et al. Three-dimensional virtual touch display system for multi-user applications
US9465483B2 (en) Methods for input-output calibration and image rendering

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20111214

Termination date: 20160826