CN101727175B - Computer system and control method thereof - Google Patents

Computer system and control method thereof

Info

Publication number
CN101727175B
CN101727175B (application CN200810172803XA)
Authority
CN
China
Prior art keywords
virtual objects
computer system
entity object
display
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN200810172803XA
Other languages
Chinese (zh)
Other versions
CN101727175A (en)
Inventor
张耀元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Priority to CN200810172803XA priority Critical patent/CN101727175B/en
Publication of CN101727175A publication Critical patent/CN101727175A/en
Application granted granted Critical
Publication of CN101727175B publication Critical patent/CN101727175B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses a computer system and a control method thereof. The computer system comprises an accommodating body and a display unit. The accommodating body has an opening through which an entity object enters the accommodating body; the display unit displays a picture. The control method of the computer system comprises the following steps: identifying a first virtual role corresponding to the entity object; detecting a spatial parameter of the entity object relative to the accommodating body; obtaining, according to the first virtual role and the spatial parameter, a first display parameter of the entity object corresponding to the picture; and drawing, according to the first display parameter, a first virtual object in the picture.

Description

Computer system and control method thereof
Technical field
The present invention relates to a computer system and a control method thereof, and more particularly to a computer system and a control method that manipulate a virtual picture according to an entity object.
Background
Screen interaction is generally performed with a joystick, a mouse, or a keyboard: the operator issues commands through the software's menu interface to accomplish the desired action. The operator must click through complicated steps with the mouse, or memorize keyboard shortcuts.
This mode of operation is unfamiliar to most operators and presents a certain barrier to entry. As a result, current interactive screen applications are limited to interactive games for young people and are seldom applied to interactive teaching materials for children or to entertainment for the elderly.
Summary of the invention
The present invention relates to a computer system and a control method thereof. Through detection and control procedures, the computer system can fully reflect the position, size, and angle of an entity object in a virtual picture, increasing the realism of the virtual reality. The entity object can also interact with virtual objects illustrated in the picture, adding considerable interest.
According to an aspect of the present invention, a control method of a computer system is proposed. The computer system comprises an accommodating body and a display unit. The accommodating body has an opening through which an entity object enters the accommodating body. The display unit displays a picture. The control method of the computer system comprises the following steps: identifying a first virtual role corresponding to the entity object; detecting a spatial parameter of the entity object relative to the accommodating body; obtaining, according to the first virtual role and the spatial parameter, a first display parameter of the entity object corresponding to the picture; and illustrating, according to the first display parameter, a first virtual object in the picture.
According to a further aspect of the invention, a computer system is proposed. The computer system comprises an accommodating body, a display unit, a detecting unit, and a control module. The accommodating body has an opening through which an entity object enters the accommodating body. The display unit displays a picture. The detecting unit identifies a first virtual role corresponding to the entity object and detects a spatial parameter of the entity object relative to the accommodating body. The control module obtains, according to the first virtual role and the spatial parameter, a first display parameter of the entity object corresponding to the picture, and illustrates, according to the first display parameter, a first virtual object in the picture.
To make the foregoing content of the present invention more comprehensible, preferred embodiments are described in detail below with reference to the accompanying drawings:
Brief description of the drawings
FIG. 1A is a block diagram of the computer system according to the first embodiment of the invention;
FIG. 1B is a schematic diagram of the entity object, the accommodating body, and the display unit;
FIG. 1C is a schematic diagram of the detecting unit of FIG. 1A;
FIG. 2 is a flowchart of the control method of the computer system of the invention;
FIG. 3 is a detailed flowchart of step S110 of FIG. 2;
FIG. 4 is a detailed flowchart of step S120 of FIG. 2;
FIG. 5 is a detailed flowchart of step S160 of FIG. 2;
FIGS. 6A–6C are schematic diagrams of the corresponding action of the second virtual object;
FIG. 7 is a schematic diagram of the computer system according to the second embodiment of the invention;
FIG. 8 is a detailed flowchart of step S110 of the second embodiment of the invention;
FIG. 9 is a schematic diagram of the computer system according to the third embodiment of the invention;
FIG. 10 is a detailed flowchart of step S110 of the third embodiment of the invention;
FIG. 11A is a schematic diagram of the computer system according to the fourth embodiment of the invention;
FIG. 11B is a top view of the accommodating body and the infrared generators;
FIG. 12 is a detailed flowchart of step S120 of the fourth embodiment of the invention;
FIG. 13 is a schematic diagram of the computer system according to the fifth embodiment of the invention;
FIG. 14 is a detailed flowchart of step S120 of the fifth embodiment of the invention;
FIG. 15 is a detailed flowchart of step S160 of the sixth embodiment of the invention;
FIG. 16 is a detailed flowchart of step S661 of FIG. 15;
FIG. 17 is a detailed flowchart of step S160 of the seventh embodiment of the invention;
FIG. 18 is a detailed flowchart of step S160 of the eighth embodiment of the invention.
[Description of main component symbols]
100, 200, 300, 400, 500: computer system
110: accommodating body
110a: opening
110b: bottom
120: display unit
130,230,330,430,530: detecting unit
131: video capture device
132: image analyzer
140: control module
150: storage element
231: RFID scanner
232: RFID analyzer
331, 332: electrical contacts
333: resistance analyzer
431: infrared generator
432: infrared analyzer
531: ultrasonic generator
532: ultrasonic analyzer
900: entity object
920: RFID tag
930: identification resistor
S110~S160, S111~S112, S121~S125, S161~S162, S211~S212, S311~S312, S421~S422, S521~S522, S661~S662, S6611~S6614, S761~S763, S861~S862: process steps
Embodiment
First embodiment
Please refer to FIGS. 1A–1C. FIG. 1A illustrates a block diagram of the computer system 100 of the first embodiment of the invention; FIG. 1B illustrates a schematic diagram of the entity object 900, the accommodating body 110, and the display unit 120; and FIG. 1C illustrates a schematic diagram of the detecting unit 130 of FIG. 1A. The computer system 100 comprises an accommodating body 110, a display unit 120, a detecting unit 130, a control module 140, and a storage element 150. The accommodating body 110 is, for example, a box-shaped, plate-shaped, or column-shaped hollow structure. In the present embodiment, the accommodating body 110 is a box-shaped hollow structure. The accommodating body 110 has an opening 110a and a bottom 110b, the opening 110a being opposite the bottom 110b. The opening 110a allows an entity object 900 to enter the accommodating body 110.
The display unit 120 displays a picture. The display unit 120 is, for example, a liquid crystal display panel or a cathode ray tube display. In the present embodiment, the display unit 120 and the accommodating body 110 are combined into an integral structure.
The detecting unit 130 identifies a first virtual role corresponding to the entity object 900. The first virtual role is, for example, a feeding bottle, a baby bear, a fishhook, a puppy, or a palm. The detecting unit 130 also detects a spatial parameter of the entity object 900 relative to the accommodating body 110. The spatial parameter comprises, for example, a spatial position and a rotation angle. The spatial position is, for example, the height or horizontal position of the entity object 900, and the rotation angle is, for example, the rotational angle of the entity object 900. In the present embodiment, the detecting unit 130 comprises a video capture device 131 (illustrated in FIG. 1C) and an image analyzer 132 (illustrated in FIG. 1C).
The control module 140 obtains, according to the first virtual role and the spatial parameter, a first display parameter of the entity object 900 corresponding to the picture. The first display parameter comprises, for example, a first display position, a first display size, a first moving direction, and a first moving speed. The control module 140 also illustrates a first virtual object in the picture according to the first display parameter.
The control method of the computer system 100 of the present embodiment is described in detail below with reference to the flowcharts. Please refer to FIGS. 1A–2; FIG. 2 illustrates a flowchart of the control method of the computer system 100 of the invention. First, in step S110, the detecting unit 130 identifies the first virtual role corresponding to the entity object 900. In the present embodiment, the first virtual role is identified by means of image recognition.
Please refer to FIG. 3, which illustrates a detailed flowchart of step S110 of FIG. 2. Step S110 of the present embodiment comprises steps S111–S112. In step S111, the video capture device 131 captures an object image of the entity object 900. Then, in step S112, the image analyzer 132 obtains the first virtual role of the entity object 900 according to the object image. For example, the storage element 150 stores a comparison table of image data and virtual roles in advance, and the image analyzer 132 can determine directly from the comparison table which virtual role the object image belongs to. If no suitable virtual role is found in the comparison table, a new virtual role can also be defined directly.
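The table lookup of steps S111 and S112 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the feature vectors, the distance metric, the threshold, and all names (`ROLE_TABLE`, `identify_first_virtual_role`) are assumptions, since the patent states only that the object image is compared against a pre-stored table and that a new virtual role may be defined when no match is found.

```python
# Hypothetical sketch of the comparison-table role identification (steps S111-S112).

ROLE_TABLE = {
    "feeding_bottle": (0.9, 0.2, 0.1),   # pre-stored image feature vectors (illustrative)
    "baby_bear":      (0.3, 0.8, 0.5),
    "fishhook":       (0.1, 0.1, 0.9),
}

def identify_first_virtual_role(object_features, threshold=0.25):
    """Return the closest stored role, or define a new role if none is close enough."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    best_role = min(ROLE_TABLE, key=lambda r: distance(ROLE_TABLE[r], object_features))
    if distance(ROLE_TABLE[best_role], object_features) <= threshold:
        return best_role
    # No suitable role in the table: register a new virtual role directly.
    new_role = f"role_{len(ROLE_TABLE) + 1}"
    ROLE_TABLE[new_role] = object_features
    return new_role
```

The same lookup pattern applies to the RFID and resistance variants of step S110 in the later embodiments, with identity data or a resistance value in place of the feature vector.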
Then, in step S120, the detecting unit 130 detects the spatial parameter of the entity object 900 relative to the accommodating body 110. In the present embodiment, the spatial parameter detected by the detecting unit 130 comprises the height and the horizontal position of the entity object 900 relative to the bottom 110b.
Please refer to FIG. 4, which illustrates a detailed flowchart of step S120 of FIG. 2. Step S120 of the present embodiment comprises steps S121–S125. In step S121, the video capture device 131 captures, from the bottom 110b of the accommodating body 110, an object image of the entity object 900 and a background image of the opening 110a. Alternatively, the entity object 900 may, for example, comprise an acceleration sensor (G-sensor) and a wireless transmitter, in which case step S120 may comprise detecting the variation of the acceleration sensor and obtaining the rotation angle of the entity object 900 according to that variation. The detecting unit 130 may then, for example, comprise a wireless receiver and an acceleration analyzer: the wireless receiver receives the variation of the acceleration sensor, and the acceleration analyzer obtains the rotation angle of the entity object 900 according to it.
Then, in step S122, the image analyzer 132 obtains the height of the entity object 900 relative to the bottom 110b according to the size of the object image. For instance, the larger the object image, the lower the entity object 900 is relative to the bottom 110b; the smaller the object image, the higher the entity object 900 is relative to the bottom 110b.
Then, in step S123, the image analyzer 132 determines the horizontal position of the entity object 900 relative to the bottom 110b according to the position of the object image relative to the background image. For instance, when the object image lies near the upper-left corner of the background image, the horizontal position of the entity object 900 relative to the bottom 110b is toward the rear left.
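The mappings of steps S122 and S123 can be sketched under a simple pinhole-camera assumption. The constant, the coordinate convention, and the function names are illustrative; the patent states only the inverse relation between image size and height and the correspondence between image position and horizontal position.

```python
# Hypothetical sketch of steps S122-S123 (pinhole-camera assumption).

FOCAL_SCALE = 50.0  # camera constant (real width x focal length), illustrative units

def height_from_image_size(apparent_width):
    # Larger object image => entity object closer to the bottom camera => lower height.
    return FOCAL_SCALE / apparent_width

def horizontal_position(obj_center, bg_size):
    # Normalize the object image's position within the background image of the
    # opening to [-1, 1] coordinates, where (-1, -1) is the rear-left corner.
    cx, cy = obj_center
    w, h = bg_size
    return (2 * cx / w - 1, 2 * cy / h - 1)
```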
Then, in step S124, the storage element 150 provides several reference images of the entity object 900, captured from different angles, for example six views: top, bottom, left, right, front, and rear.
Then, in step S125, the image analyzer 132 obtains the rotation angle of the entity object 900 according to the comparison result of the object image against these reference images. The rotation angle may be any angle over 360 degrees in three-dimensional space. For instance, when the object image contains a greater proportion of the front view and a smaller proportion of the left view, the rotation angle of the entity object 900 is deflected toward the left.
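The reference-image comparison of step S125 can be sketched as a nearest-match search over the stored views. The angle set, the tiny stand-in "images", and the absolute-difference metric are assumptions for illustration only; the patent does not specify the comparison method.

```python
# Hypothetical sketch of step S125: best-matching reference view gives the angle.

REFERENCE_VIEWS = {           # angle (degrees) -> tiny stand-in reference "image"
    0:   (10, 10, 200),       # front view
    90:  (200, 10, 10),       # left view
    180: (10, 200, 10),       # rear view
    270: (120, 120, 120),     # right view
}

def rotation_angle(object_image):
    """Return the rotation angle whose reference image best matches the object image."""
    def diff(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(REFERENCE_VIEWS, key=lambda ang: diff(REFERENCE_VIEWS[ang], object_image))
```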
Then, in step S130, the control module 140 obtains the first display parameter of the entity object 900 corresponding to the picture according to the first virtual role and the spatial parameter of the entity object 900. For instance, referring to Table 1 below, the control module 140 obtains each first display parameter according to the first virtual role and the spatial parameter of the entity object 900.
Table 1 (rendered as an image in the original document)
Then, in step S140, the control module 140 illustrates the first virtual object in the picture according to the first display parameter. Taking Table 1 and FIG. 1B as an example, the control module 140 displays a shrunken, front-facing feeding-bottle pattern on the picture, illustrated at a higher position toward the right of the picture.
Then, in step S150, the storage element 150 provides the second virtual role and the second display parameter of a second virtual object illustrated in the picture. As shown in FIG. 1B, the second virtual object is, for example, a baby pattern.
Then, in step S160, the first display parameter or the second display parameter is changed according to the first virtual role, the second virtual role, the first display parameter, or the second display parameter.
Please refer to FIG. 5, which illustrates a detailed flowchart of step S160 of FIG. 2. In the present embodiment, step S160 comprises steps S161–S162, and the second virtual object is a baby. First, in step S161, the control module 140 obtains a corresponding action of the second virtual object according to the first virtual role, the second virtual role, and the first display parameter.
Then, in step S162, the control module 140 changes the second display parameter according to the corresponding action of the second virtual object. For instance, referring to Table 2 and FIGS. 6A–6C, which illustrate schematic diagrams of the corresponding action of the second virtual object: when the first virtual role is a feeding bottle and the first virtual object is located toward the right of the picture, the control module 140 changes the second display parameter of the second virtual object so that the baby pattern crawls to the right and chases the feeding bottle to be fed.
Table 2 (rendered as an image in the original document)
With the computer system 100 and its control method described above, while the user manipulates the entity object 900, the first virtual object on the picture operates in synchronization with the entity object 900, and the entity object 900 can also interact directly with the second virtual object on the picture, which is considerably realistic. Applied to products such as games or advertisements, this can further increase interest.
Second embodiment
Please refer to FIGS. 7–8. FIG. 7 illustrates a schematic diagram of the computer system 200 of the second embodiment of the invention, and FIG. 8 illustrates a detailed flowchart of step S110 of the second embodiment. The computer system 200 of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit 230 and the detailed process of step S110; the remaining similarities are not repeated.
In the present embodiment, the entity object 900 has a radio frequency identification (RFID) tag 920, which is directly embedded in the entity object 900. The detecting unit 230 comprises an RFID scanner 231 and an RFID analyzer 232. Step S110 comprises steps S211–S212. In step S211, when the entity object 900 passes through the opening 110a, the RFID scanner 231 scans the RFID tag 920 of the entity object 900.
Then, in step S212, the RFID analyzer 232 obtains the first virtual role of the entity object 900 according to the identity data of the RFID tag 920. For instance, the storage element 150 stores a comparison table of identity data and virtual roles in advance, and the RFID analyzer 232 can determine directly from the comparison table which virtual role the identity data belongs to. If no suitable virtual role is found in the comparison table, a new virtual role can also be defined directly.
Third embodiment
Please refer to FIGS. 9–10. FIG. 9 illustrates a schematic diagram of the computer system 300 of the third embodiment of the invention, and FIG. 10 illustrates a detailed flowchart of step S110 of the third embodiment. The computer system 300 of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit 330 and the detailed process of step S110; the remaining similarities are not repeated.
In the present embodiment, the entity object 900 comprises an identification resistor 930, and the detecting unit 330 comprises two electrical contacts 331, 332 and a resistance analyzer 333. Different resistance values of the identification resistor 930 represent different first virtual roles. Step S110 comprises steps S311–S312. In step S311, when the entity object 900 passes through the opening 110a, the two electrical contacts 331, 332 contact and measure a resistance value of the identification resistor 930.
Then, in step S312, the resistance analyzer 333 obtains the first virtual role of the entity object 900 according to the resistance value. For instance, the storage element 150 stores a comparison table of resistance values and virtual roles in advance, and the resistance analyzer 333 can determine directly from the comparison table which virtual role the measured resistance value belongs to. If no suitable virtual role is found in the comparison table, a new virtual role can also be defined directly.
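Steps S311 and S312 amount to a resistance-to-role lookup. A sketch with a tolerance band is shown below; the nominal values, the 5% tolerance, and the role names are illustrative assumptions, since real resistors deviate from their nominal values but the patent does not describe how the comparison table handles that.

```python
# Hypothetical sketch of steps S311-S312: tolerance-band resistance lookup.

RESISTANCE_TABLE = {          # nominal ohms -> first virtual role (illustrative)
    1_000:  "feeding_bottle",
    4_700:  "fishhook",
    10_000: "puppy",
}

def role_from_resistance(measured_ohms, tolerance=0.05):
    for nominal, role in RESISTANCE_TABLE.items():
        if abs(measured_ohms - nominal) <= nominal * tolerance:
            return role
    return None   # no suitable role found; the system may define a new one
```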
Fourth embodiment
Please refer to FIGS. 11A–12. FIG. 11A illustrates a schematic diagram of the computer system 400 of the fourth embodiment of the invention; FIG. 11B illustrates a top view of the accommodating body 110 and the infrared generators 431; and FIG. 12 illustrates a detailed flowchart of step S120 of the fourth embodiment. The computer system 400 of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit 430 and the detailed process of step S120; the remaining similarities are not repeated.
In the present embodiment, the detecting unit 430 comprises a plurality of infrared generators 431 and an infrared analyzer 432. The infrared generators 431 are distributed inside the accommodating body 110. Steps S121–S123 of the first embodiment are replaced by steps S421–S422 of the present embodiment. First, in step S421, after the entity object 900 enters the accommodating body 110, the infrared generators 431 provide several infrared rays. As shown in FIG. 11A, the infrared generators 431 are distributed at different heights. As shown in FIG. 11B, the infrared generators 431 at the same height are distributed in a matrix.
Then, in step S422, the infrared analyzer 432 obtains the spatial position of the entity object 900 according to which of these infrared rays are blocked.
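One plausible reading of step S422, assumed here for illustration, is to take the centroid of the blocked beams in the layered matrix layout of FIGS. 11A and 11B. The `(layer, row, col)` indexing scheme and the centroid rule are assumptions; the patent says only that the position is obtained from the blocked rays.

```python
# Hypothetical sketch of step S422: locate the object from interrupted IR beams.

def locate_from_blocked_beams(blocked):
    """blocked: set of (layer, row, col) tuples for interrupted beams.
    Returns the centroid (layer, row, col) of the blocked beams, approximating
    the entity object's spatial position, or None if nothing is blocked."""
    if not blocked:
        return None
    n = len(blocked)
    layer = sum(b[0] for b in blocked) / n
    row   = sum(b[1] for b in blocked) / n
    col   = sum(b[2] for b in blocked) / n
    return (layer, row, col)
```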
Fifth embodiment
Please refer to FIGS. 13–14. FIG. 13 illustrates a schematic diagram of the computer system 500 of the fifth embodiment of the invention, and FIG. 14 illustrates a detailed flowchart of step S120 of the fifth embodiment. The computer system 500 of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit 530 and the detailed process of step S120; the remaining similarities are not repeated.
In the present embodiment, the detecting unit 530 comprises an ultrasonic generator 531 and an ultrasonic analyzer 532. Steps S121–S123 of the first embodiment are replaced by steps S521–S522 of the present embodiment. First, in step S521, after the entity object 900 enters the accommodating body 110, the ultrasonic generator 531 provides an ultrasonic wave.
Then, in step S522, the ultrasonic analyzer 532 obtains the spatial position of the entity object 900 according to the reflection of the ultrasonic wave.
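Step S522 can be sketched as a standard time-of-flight calculation. The patent gives no formulas, so the speed of sound and the simplified single-echo geometry are assumptions.

```python
# Hypothetical sketch of step S522: distance from a single ultrasonic echo.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def distance_from_echo(echo_delay_s):
    # The pulse travels to the entity object and back, hence the factor of 2.
    return SPEED_OF_SOUND * echo_delay_s / 2
```

With several generators or receivers, repeating this calculation per echo would let the analyzer triangulate a full spatial position rather than a single distance.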
Sixth embodiment
Please refer to FIG. 15, which illustrates a detailed flowchart of step S160 of the sixth embodiment of the invention. The computer system of the present embodiment differs from the computer system 100 of the first embodiment in the detailed process of step S160; the remaining similarities are not repeated.
In the present embodiment, the first display parameter comprises a first display position, a first display size, a first moving direction, and a first moving speed; the second display parameter comprises a second display position, a second display size, a second moving direction, and a second moving speed; and step S160 comprises steps S661–S662. In step S661, the control module 140 judges whether the first virtual object and the second virtual object collide according to the first display position, the first display size, the second display position, and the second display size. If the first virtual object and the second virtual object collide, the method proceeds to step S662; if not, it returns to step S661.
Please refer to FIG. 16, which illustrates a detailed flowchart of step S661 of FIG. 15. In the present embodiment, step S661 further comprises steps S6611–S6614. First, in step S6611, the control module 140 obtains a first bounding circle according to the first display position and the first display size. The first bounding circle is a circle that can completely contain the first virtual object.
Then, in step S6612, the control module 140 obtains a second bounding circle according to the second display position and the second display size. The second bounding circle is a circle that can completely contain the second virtual object.
Then, in step S6613, the control module 140 judges whether the first bounding circle and the second bounding circle intersect. If they intersect, the method proceeds to step S6614; if not, it returns to step S6613.
Then, in step S6614, the control module 140 defines that the first virtual object and the second virtual object collide.
In step S662, the control module 140 changes the first moving direction, the first moving speed, the second moving direction, and the second moving speed according to those same parameters. For instance, if the first moving direction and the second moving direction are nearly parallel and in the same direction, the struck object (the first virtual object or the second virtual object) is accelerated. Alternatively, when the first moving direction and the second moving direction are not parallel, the two mutually colliding objects (the first virtual object and the second virtual object) change their moving directions and moving speeds according to the operation result.
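Steps S6611 through S6614 and S662 can be sketched as a bounding-circle intersection test plus a response rule. The radius convention (half the display size) and the concrete responses (the speed-up factor, reversing the second object's direction) are assumptions; the patent says only that the directions and speeds are changed according to an operation result.

```python
# Hypothetical sketch of the bounding-circle collision test and response.
import math

def bounding_circles_collide(pos1, size1, pos2, size2):
    # Each bounding circle completely contains its virtual object; here its
    # radius is taken as half the object's display size for brevity.
    r1, r2 = size1 / 2, size2 / 2
    return math.dist(pos1, pos2) <= r1 + r2

def respond(dir1, speed1, dir2, speed2):
    # Nearly parallel and same direction: accelerate the struck object.
    dot = dir1[0] * dir2[0] + dir1[1] * dir2[1]
    if dot > 0.9:               # unit direction vectors assumed
        return dir2, speed2 * 1.5
    # Otherwise, both objects would recompute direction and speed; reversing
    # the second object's direction stands in for that operation here.
    return (-dir2[0], -dir2[1]), speed2
```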
Seventh embodiment
Please refer to FIG. 17, which illustrates a detailed flowchart of step S160 of the seventh embodiment of the invention. The computer system of the present embodiment differs from the computer system of the sixth embodiment in the detailed process of step S160; the remaining similarities are not repeated.
In the present embodiment, steps S161–S162 of the first embodiment are replaced by steps S761–S763; the first virtual object is, for example, a combination of a fishing line and bait, and the second virtual object is, for example, a goldfish. First, in step S761, the control module 140 drives the second virtual object to move toward the first virtual object according to the first display position and the second display position.
Then, in step S762, the control module 140 judges whether the first virtual object and the second virtual object collide. If they collide, the method proceeds to step S763; if not, it returns to step S762.
Then, in step S763, if the first virtual object and the second virtual object collide, a predetermined event is triggered. The predetermined event is, for example, that when the goldfish touches the bait, the goldfish is hooked.
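Steps S761 through S763 can be sketched as a per-frame chase with a contact test; the step size, the contact radius, and the function name are illustrative assumptions.

```python
# Hypothetical sketch of steps S761-S763: goldfish chases the bait each frame.

def step_chase(fish_pos, bait_pos, step=1.0, contact_radius=0.5):
    """Move the fish one step toward the bait; return (new_pos, hooked)."""
    dx, dy = bait_pos[0] - fish_pos[0], bait_pos[1] - fish_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= contact_radius:
        return fish_pos, True          # collision: trigger the predetermined event
    scale = min(step, dist) / dist
    new_pos = (fish_pos[0] + dx * scale, fish_pos[1] + dy * scale)
    return new_pos, False
```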
Eighth embodiment
Please refer to FIG. 18, which illustrates a detailed flowchart of step S160 of the eighth embodiment of the invention. The computer system of the present embodiment differs from the computer system of the first embodiment in the detailed process of step S160; the remaining similarities are not repeated.
In the present embodiment, steps S161–S162 of the first embodiment are replaced by steps S861–S862, and the second virtual object is a baby. First, in step S861, the control module 140 obtains a change reaction of the second display parameter of the second virtual object according to the first virtual role and the second virtual role.
Then, in step S862, the control module 140 changes the second display parameter according to the change reaction of the second display parameter. For instance, referring to Table 3, when the first virtual role is a puppy, the moving speed of the baby pattern of the second virtual object is increased to 1.5 times.
Table 3 (rendered as an image in the original document)
Through the detection and control procedures, the computer system and control method disclosed in the above embodiments of the invention can fully reflect the position, size, and angle of the entity object in the virtual picture, increasing the realism of the virtual reality. The entity object can also interact with the virtual objects illustrated in the picture, adding considerable interest.
In summary, although the invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Those skilled in the art may make various changes and modifications without departing from the spirit and scope of the invention. Therefore, the protection scope of the invention shall be defined by the appended claims.

Claims (24)

1. A control method of a computer system, the computer system comprising an accommodating body and a display unit, the accommodating body having an opening through which an entity object enters the accommodating body, the display unit being for displaying a picture, the picture further illustrating a second virtual object corresponding to a second virtual role, the control method of the computer system comprising:
identifying a first virtual role corresponding to the entity object;
detecting a spatial parameter of the entity object relative to the accommodating body;
obtaining, according to the first virtual role and the spatial parameter, a first display parameter of the entity object corresponding to the picture;
illustrating, according to the first display parameter, a first virtual object in the picture;
providing a second display parameter of the second virtual object; and
changing the first display parameter or the second display parameter according to the first virtual role, the second virtual role, the first display parameter, and the second display parameter.
2. The control method of the computer system according to claim 1, wherein the step of identifying the first virtual role comprises:
capturing an object image of the entity object; and
obtaining the first virtual role of the entity object according to the object image.
3. The control method of the computer system according to claim 1, wherein the spatial parameter comprises a spatial position of the entity object within the accommodating body.
4. The control method of the computer system according to claim 3, wherein the step of detecting the spatial parameter comprises:
capturing an object image of the entity object from a bottom of the accommodating body; and
obtaining the height of the entity object relative to the bottom according to the size of the object image.
5. The control method of the computer system according to claim 3, wherein the step of detecting the spatial parameter comprises:
capturing, from the bottom of the accommodating body, an object image of the entity object and a background image of the opening; and
obtaining the horizontal position of the entity object relative to the bottom according to the position of the object image relative to the background image.
6. The control method of the computer system according to claim 1, wherein the spatial parameter comprises a rotation angle of the entity object.
7. The control method of the computer system as claimed in claim 6, wherein the step of detecting the spatial parameter comprises:
providing a plurality of reference images of the entity object, the reference images being captured from different angles;
capturing an object image of the entity object; and
according to a result of comparing the object image with the reference images, obtaining the rotation angle of the entity object.
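The comparison in claim 7 is a template-matching step: the rotation angle reported is the one whose reference image best matches the captured object image. The claim does not specify the comparison metric; a sum of squared differences over equal-sized grayscale images is one plausible instance, sketched here with hypothetical names:

```python
def rotation_angle(object_image, reference_images):
    """Pick the rotation angle whose reference image best matches the
    captured object image (sum of squared differences; smaller = better).

    `reference_images` maps an angle in degrees to a grayscale image,
    here represented as nested lists of equal dimensions.
    """
    def ssd(a, b):
        return sum((pa - pb) ** 2 for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))
    return min(reference_images, key=lambda ang: ssd(object_image, reference_images[ang]))

# Three tiny 2x2 references taken at 0, 90, and 180 degrees.
refs = {0: [[0, 0], [9, 9]], 90: [[9, 0], [9, 0]], 180: [[9, 9], [0, 0]]}
print(rotation_angle([[8, 9], [0, 1]], refs))  # → 180
```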
8. The control method of the computer system as claimed in claim 6, wherein the entity object comprises an acceleration sensor (G-sensor), and the step of detecting the spatial parameter comprises:
detecting a variation of the acceleration sensor; and
according to the variation of the acceleration sensor, obtaining the rotation angle of the entity object.
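One common way to turn a G-sensor reading into a rotation angle, as claim 8 requires, is to read the direction of the gravity vector while the object is at rest. The axis conventions and function name below are assumptions for illustration:

```python
import math

def tilt_angles(ax, ay, az):
    """Derive pitch and roll (degrees) from a static G-sensor reading.

    At rest the sensor measures only gravity, so the direction of the
    acceleration vector gives the rotation relative to the vertical.
    """
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Flat and level: gravity lies entirely on the z axis, so both angles are zero.
print(tilt_angles(0.0, 0.0, 1.0))
```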
9. The control method of the computer system as claimed in claim 1, wherein the first display parameter comprises a first display position, a first display size, a first moving direction, and a first moving speed; the second display parameter comprises a second display position, a second display size, a second moving direction, and a second moving speed; and the step of changing the first display parameter or the second display parameter comprises:
according to the first display position, the first display size, the second display position, and the second display size, judging whether the first virtual object and the second virtual object collide; and
if the first virtual object and the second virtual object collide, changing the first moving direction, the first moving speed, the second moving direction, and the second moving speed according to the first moving direction, the first moving speed, the second moving direction, and the second moving speed.
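Claim 9 leaves the exact velocity-update rule open. One plausible instance is a 2-D equal-mass elastic collision, where the velocity components along the line of centers are swapped and the tangential components are left unchanged; the names and the equal-mass assumption are illustrative, not part of the claim:

```python
import math

def respond_to_collision(p1, v1, p2, v2):
    """Update the two virtual objects' velocities after a collision,
    treating them as equal-mass discs (2-D elastic collision)."""
    nx, ny = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(nx, ny) or 1.0           # avoid division by zero
    nx, ny = nx / d, ny / d                  # unit normal between centers
    # Normal components of each velocity.
    a1 = v1[0] * nx + v1[1] * ny
    a2 = v2[0] * nx + v2[1] * ny
    # Swap the normal components; tangential parts are untouched.
    new_v1 = (v1[0] + (a2 - a1) * nx, v1[1] + (a2 - a1) * ny)
    new_v2 = (v2[0] + (a1 - a2) * nx, v2[1] + (a1 - a2) * ny)
    return new_v1, new_v2

# Head-on along the x axis: the two objects exchange velocities.
print(respond_to_collision((0, 0), (1, 0), (2, 0), (-1, 0)))
# → ((-1.0, 0.0), (1.0, 0.0))
```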
10. The control method of the computer system as claimed in claim 9, wherein the step of judging whether the first virtual object and the second virtual object collide comprises:
according to the first display position and the first display size, obtaining a first bounding circle;
according to the second display position and the second display size, obtaining a second bounding circle;
judging whether the first bounding circle and the second bounding circle intersect; and
if the first bounding circle and the second bounding circle intersect, determining that the first virtual object and the second virtual object collide.
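The bounding-circle test in claim 10 takes only a few lines: two circles intersect when the distance between their centers does not exceed the sum of their radii. The radius convention (half the larger dimension of the display size) and the function names are illustrative assumptions:

```python
import math

def bounding_circle(display_position, display_size):
    """Derive a bounding circle from a display position (center) and a
    display size (width, height): radius = half the larger dimension."""
    return display_position, max(display_size) / 2.0

def circles_intersect(c1, c2):
    (x1, y1), r1 = c1
    (x2, y2), r2 = c2
    # Intersection iff center distance <= sum of radii.
    return math.hypot(x2 - x1, y2 - y1) <= r1 + r2

a = bounding_circle((0, 0), (4, 2))   # radius 2
b = bounding_circle((3, 0), (2, 2))   # radius 1
print(circles_intersect(a, b))  # → True  (distance 3 == 2 + 1)
```

Bounding circles trade some precision for a constant-time test, which suits per-frame collision checks between drawn virtual objects.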
11. The control method of the computer system as claimed in claim 1, wherein the step of changing the first display parameter or the second display parameter comprises:
according to the first virtual role, the second virtual role, and the first display parameter, obtaining a responsive action of the second virtual object; and
according to the responsive action of the second virtual object, changing the second display parameter.
12. The control method of the computer system as claimed in claim 1, wherein the first display parameter comprises a first display position and a first display size, the second display parameter comprises a second display position and a second display size, and the step of changing the first display parameter or the second display parameter comprises:
according to the first display position and the second display position, driving the second virtual object to move toward the first virtual object;
according to the first display position, the first display size, the second display position, and the second display size, judging whether the first virtual object and the second virtual object collide; and
if the first virtual object and the second virtual object collide, triggering a scheduled event.
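The chase-then-trigger behavior of claim 12 can be sketched as a per-frame update: move the second virtual object toward the first, and fire the scheduled event once they come within a trigger distance. Everything here (names, the distance-based collision proxy, the callback) is a hypothetical reading of the claim:

```python
import math

def step_chase(p1, p2, speed, trigger_distance, on_collide):
    """Advance the second virtual object one step toward the first;
    when the two are within `trigger_distance`, fire the event."""
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    dist = math.hypot(dx, dy)
    if dist <= trigger_distance:
        on_collide()
        return p2
    step = min(speed, dist)
    return (p2[0] + dx / dist * step, p2[1] + dy / dist * step)

events = []
p2 = (10.0, 0.0)
for _ in range(6):
    p2 = step_chase((0.0, 0.0), p2, 2.0, 1.0, lambda: events.append("hit"))
print(p2, events)  # → (0.0, 0.0) ['hit']
```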
13. A computer system, comprising:
a containing body having an opening, the opening enabling an entity object to enter the containing body;
a display unit for displaying a picture;
a detecting unit for recognizing a first virtual role corresponding to the entity object and for detecting a spatial parameter of the entity object corresponding to the containing body; and
a control module which, according to the first virtual role and the spatial parameter, obtains a first display parameter of the entity object corresponding to the picture, and, according to the first display parameter and a second display parameter, draws a first virtual object and a second virtual object respectively on the picture, the second virtual object corresponding to a second virtual role; wherein the control module changes the first display parameter or the second display parameter according to the first virtual role, the second virtual role, the first display parameter, and the second display parameter.
14. The computer system as claimed in claim 13, wherein the detecting unit comprises:
an image capture device for capturing an object image of the entity object; and
an image analyzer for obtaining the first virtual role of the entity object according to the object image.
15. The computer system as claimed in claim 13, wherein the spatial parameter comprises a spatial position of the entity object within the containing body.
16. The computer system as claimed in claim 15, wherein the detecting unit comprises:
an image capture device for capturing an object image of the entity object from a bottom of the containing body; and
an image analyzer for obtaining a height of the entity object relative to the bottom according to the size of the object image.
17. The computer system as claimed in claim 15, wherein the detecting unit comprises:
an image capture device for capturing, from the bottom of the containing body, an object image of the entity object and a background image of the opening; and
an image analyzer for obtaining a horizontal position of the entity object relative to the bottom according to a position of the object image relative to the background image.
18. The computer system as claimed in claim 13, wherein the spatial parameter comprises a rotation angle of the entity object.
19. The computer system as claimed in claim 18, further comprising a storage unit for storing a plurality of reference images of the entity object, the reference images being captured from different angles, wherein the detecting unit comprises:
an image capture device for capturing an object image of the entity object; and
an image analyzer for obtaining the rotation angle of the entity object according to a result of comparing the object image with the reference images.
20. The computer system as claimed in claim 18, wherein the entity object comprises an acceleration sensor (G-sensor) and a wireless transmitter, and the detecting unit comprises:
a wireless receiver for receiving a variation of the acceleration sensor; and
an acceleration analyzer for obtaining the rotation angle of the entity object according to the variation of the acceleration sensor.
21. The computer system as claimed in claim 13, wherein the first display parameter comprises a first display position, a first display size, a first moving direction, and a first moving speed; the second display parameter comprises a second display position, a second display size, a second moving direction, and a second moving speed; the control module judges, according to the first display position, the first display size, the second display position, and the second display size, whether the first virtual object and the second virtual object collide; and if the first virtual object and the second virtual object collide, the control module changes the first moving direction, the first moving speed, the second moving direction, and the second moving speed according to the first moving direction, the first moving speed, the second moving direction, and the second moving speed.
22. The computer system as claimed in claim 21, wherein the control module obtains a first bounding circle according to the first display position and the first display size, obtains a second bounding circle according to the second display position and the second display size, and judges whether the first bounding circle and the second bounding circle intersect; if the first bounding circle and the second bounding circle intersect, the control module determines that the first virtual object and the second virtual object collide.
23. The computer system as claimed in claim 13, wherein the control module obtains a responsive action of the second virtual object according to the first virtual role, the second virtual role, and the first display parameter, and changes the second display parameter according to the responsive action of the second virtual object.
24. The computer system as claimed in claim 13, wherein the first display parameter comprises a first display position and a first display size, and the second display parameter comprises a second display position and a second display size; the control module drives the second virtual object to move toward the first virtual object according to the first display position and the second display position, and judges, according to the first display position, the first display size, the second display position, and the second display size, whether the first virtual object and the second virtual object collide; if the first virtual object and the second virtual object collide, the control module triggers a scheduled event.
CN200810172803XA 2008-10-29 2008-10-29 Computer system and control method thereof Expired - Fee Related CN101727175B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810172803XA CN101727175B (en) 2008-10-29 2008-10-29 Computer system and control method thereof

Publications (2)

Publication Number Publication Date
CN101727175A CN101727175A (en) 2010-06-09
CN101727175B true CN101727175B (en) 2012-07-04

Family

ID=42448156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810172803XA Expired - Fee Related CN101727175B (en) 2008-10-29 2008-10-29 Computer system and control method thereof

Country Status (1)

Country Link
CN (1) CN101727175B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108744512A (en) * 2018-06-01 2018-11-06 Tencent Technology (Shenzhen) Co., Ltd. Information prompting method and device, storage medium and electronic device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1858677A (en) * 2006-05-23 2006-11-08 郭超逸 Photoelectric keyboard
CN1293519C (en) * 2002-12-19 2007-01-03 Sony Corporation Apparatus, method and program for processing information

Also Published As

Publication number Publication date
CN101727175A (en) 2010-06-09

Similar Documents

Publication Publication Date Title
US8803801B2 (en) Three-dimensional interface system and method
US20060139314A1 (en) Interactive video display system
US20060132432A1 (en) Interactive video display system
CN102141885B (en) Image processing device and image processing method
JP4927148B2 (en) Computer system and control method thereof
US7492362B2 (en) Virtual space rendering/display apparatus and virtual space rendering/display method
KR101481880B1 (en) A system for portable tangible interaction
EP1368788B1 (en) Object tracking system using multiple cameras
CN101995943B (en) Three-dimensional image interactive system
KR20180029995A (en) Physical model based gesture recognition
US9253468B2 (en) Three-dimensional (3D) user interface method and system
CN103197861A (en) Display control device
WO2015034973A1 (en) Dynamic displays based on user interaction states
WO2006127466A2 (en) Bounding box gesture recognition on a touch detecting interactive display
CN105320265B (en) Control method of electronic device
CN108021227B (en) Method for rapidly moving in virtual reality and virtual reality device
EP3370134B1 (en) Display device and user interface displaying method thereof
CN101727175B (en) Computer system and control method thereof
CN106293329A (en) Method, device and terminal for presenting an interface element array in a terminal
EP3454304A1 (en) Image processing device
CN113168228A (en) Systems and/or methods for parallax correction in large area transparent touch interfaces
US11776205B2 (en) Determination of interactions with predefined volumes of space based on automated analysis of volumetric video
KR101360322B1 (en) Apparatus and method for controlling electric boards using multiple hand shape detection and tracking
US20200302643A1 (en) Systems and methods for tracking
CN102411426A (en) Operating method of electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120704

Termination date: 20201029
