CN101727175A - Computer system and control method thereof - Google Patents


Info

Publication number
CN101727175A
Authority
CN
China
Prior art keywords
entity object
computer system
virtual objects
display
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN200810172803A
Other languages
Chinese (zh)
Other versions
CN101727175B (en)
Inventor
张耀元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI
Priority to CN200810172803XA
Publication of CN101727175A
Application granted
Publication of CN101727175B
Expired - Fee Related

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a computer system and a control method thereof. The computer system comprises a containing body and a display unit. The containing body has an opening through which an entity object enters the containing body; the display unit is used for displaying a picture. The control method of the computer system comprises the following steps: recognizing a first virtual role corresponding to the entity object; detecting a spatial parameter of the entity object corresponding to the containing body; obtaining a first display parameter of the entity object corresponding to the picture according to the first virtual role and the spatial parameter; and drawing a first virtual object in the picture according to the first display parameter.

Description

Computer system and control method thereof
Technical field
The present invention relates to a computer system and a control method thereof, and more particularly to a computer system and a control method thereof that manipulate a virtual picture according to an entity object.
Background technology
Conventional on-screen interaction is performed with a joystick, a mouse or a keyboard, and the operator usually completes a desired action through the menu interface of software. The operator must click through complicated steps with the mouse, or memorize keyboard shortcuts to operate.
This mode of operation is unfamiliar to most operators and presents a certain barrier to entry. As a result, current picture interaction is largely limited to interactive games for young people and is seldom applied to interactive teaching materials for children or entertainment for the elderly.
Summary of the invention
The present invention relates to a computer system and a control method thereof. Through a detection and control procedure, the computer system can fully reflect the position, size and angle of an entity object in a virtual picture, increasing the realism of the virtual reality. Moreover, the entity object can interact with the virtual objects illustrated in the picture, adding considerable interest.
According to an aspect of the present invention, a control method of a computer system is provided. The computer system comprises a containing body and a display unit. The containing body has an opening through which an entity object enters the containing body. The display unit displays a picture. The control method of the computer system comprises the following steps. A first virtual role corresponding to the entity object is recognized. A spatial parameter of the entity object corresponding to the containing body is detected. A first display parameter of the entity object corresponding to the picture is obtained according to the first virtual role and the spatial parameter. A first virtual object is illustrated in the picture according to the first display parameter.
According to a further aspect of the invention, a computer system is provided. The computer system comprises a containing body, a display unit, a detecting unit and a control module. The containing body has an opening through which an entity object enters the containing body. The display unit displays a picture. The detecting unit recognizes a first virtual role corresponding to the entity object and detects a spatial parameter of the entity object corresponding to the containing body. The control module obtains a first display parameter of the entity object corresponding to the picture according to the first virtual role and the spatial parameter, and illustrates a first virtual object in the picture according to the first display parameter.
In order that the foregoing content of the present invention may become more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings:
Description of drawings
FIG. 1A illustrates a block diagram of the computer system of the first embodiment of the invention;
FIG. 1B illustrates a schematic diagram of the entity object, the containing body and the display unit;
FIG. 1C illustrates a schematic diagram of the detecting unit of FIG. 1A;
FIG. 2 illustrates a flowchart of the control method of the computer system of the invention;
FIG. 3 illustrates a detailed flowchart of step S110 of FIG. 2;
FIG. 4 illustrates a detailed flowchart of step S120 of FIG. 2;
FIG. 5 illustrates a detailed flowchart of step S160 of FIG. 2;
FIGS. 6A-6C illustrate schematic diagrams of the corresponding action of the second virtual object;
FIG. 7 illustrates a schematic diagram of the computer system of the second embodiment of the invention;
FIG. 8 illustrates a detailed flowchart of step S110 of the second embodiment of the invention;
FIG. 9 illustrates a schematic diagram of the computer system of the third embodiment of the invention;
FIG. 10 illustrates a detailed flowchart of step S110 of the third embodiment of the invention;
FIG. 11A illustrates a schematic diagram of the computer system of the fourth embodiment of the invention;
FIG. 11B illustrates a top view of the containing body and the infrared generators;
FIG. 12 illustrates a detailed flowchart of step S120 of the fourth embodiment of the invention;
FIG. 13 illustrates a schematic diagram of the computer system of the fifth embodiment of the invention;
FIG. 14 illustrates a detailed flowchart of step S120 of the fifth embodiment of the invention;
FIG. 15 illustrates a detailed flowchart of step S160 of the sixth embodiment of the invention;
FIG. 16 illustrates a detailed flowchart of step S661 of FIG. 15;
FIG. 17 illustrates a detailed flowchart of step S160 of the seventh embodiment of the invention;
FIG. 18 illustrates a detailed flowchart of step S160 of the eighth embodiment of the invention.
[Description of main element symbols]
100, 200, 300, 400, 500: computer system
110: containing body
110a: opening
110b: bottom
120: display unit
130, 230, 330, 430, 530: detecting unit
131: video capture device
132: image analyzer
140: control module
150: storage unit
231: RFID scanner
232: RFID analyzer
331, 332: electrical contactors
333: resistance analyzer
431: infrared generator
432: infrared analyzer
531: ultrasonic generator
532: ultrasonic analyzer
900: entity object
920: RFID tag
930: identification resistor
S110~S160, S111~S112, S121~S125, S161~S162, S211~S212, S311~S312, S421~S422, S521~S522, S661~S662, S6611~S6614, S761~S763, S861~S862: process steps
Embodiment
First embodiment
Referring to FIGS. 1A-1C, FIG. 1A illustrates a block diagram of the computer system 100 of the first embodiment of the invention, FIG. 1B illustrates a schematic diagram of the entity object 900, the containing body 110 and the display unit 120, and FIG. 1C illustrates a schematic diagram of the detecting unit 130 of FIG. 1A. The computer system 100 comprises a containing body 110, a display unit 120, a detecting unit 130, a control module 140 and a storage unit 150. The containing body 110 is, for example, a box-shaped, plate-shaped or column-shaped hollow structure. In the present embodiment, the containing body 110 is a box-shaped hollow structure. The containing body 110 has an opening 110a and a bottom 110b. The opening 110a is opposite to the bottom 110b. The opening 110a allows an entity object 900 to enter the containing body 110.
The display unit 120 displays a picture. The display unit 120 is, for example, a liquid crystal display screen or a cathode ray tube display screen. In the present embodiment, the display unit 120 and the containing body 110 are combined into an integrally formed structure.
The detecting unit 130 recognizes the first virtual role corresponding to the entity object 900. The first virtual role is, for example, a role such as a feeding bottle, a bear baby, a fishhook, a doggie or a palm. The detecting unit 130 also detects the spatial parameter of the entity object 900 corresponding to the containing body 110. The spatial parameter is, for example, a spatial position and a rotation angle. The spatial position is, for example, the height or the horizontal position of the entity object 900, and the rotation angle is, for example, the rotational angle of the entity object 900. In the present embodiment, the detecting unit 130 comprises a video capture device 131 (illustrated in FIG. 1C) and an image analyzer 132 (illustrated in FIG. 1C).
The control module 140 obtains the first display parameter of the entity object 900 corresponding to the picture according to the first virtual role and the spatial parameter. The first display parameter comprises, for example, a first display position, a first display size, a first moving direction and a first moving speed. The control module 140 also illustrates a first virtual object in the picture according to the first display parameter.
The control method of the computer system 100 of the present embodiment is described in detail below with reference to the flowcharts. Referring to FIGS. 1A-2, FIG. 2 illustrates a flowchart of the control method of the computer system 100 of the invention. First, in step S110, the detecting unit 130 recognizes the first virtual role corresponding to the entity object 900. In the present embodiment, the first virtual role is recognized by way of image recognition.
Referring to FIG. 3, which illustrates a detailed flowchart of step S110 of FIG. 2, step S110 of the present embodiment comprises steps S111-S112. In step S111, the video capture device 131 captures an object image of the entity object 900. Then, in step S112, the image analyzer 132 obtains the first virtual role of the entity object 900 according to the object image. For example, the storage unit 150 stores a comparison table of image data and virtual roles in advance. The image analyzer 132 can directly determine from the comparison table which virtual role the object image belongs to. If no suitable virtual role can be found in the comparison table, a new virtual role can also be defined directly.
Then, in step S120, the detecting unit 130 detects the spatial parameter of the entity object 900 corresponding to the containing body 110. In the present embodiment, the spatial parameter of the entity object 900 detected by the detecting unit 130 comprises the height and the horizontal position of the entity object 900 relative to the bottom 110b.
Referring to FIG. 4, which illustrates a detailed flowchart of step S120 of FIG. 2, step S120 of the present embodiment comprises steps S121-S125. In step S121, the video capture device 131 captures an object image of the entity object 900 and a background image of the opening 110a from the bottom 110b of the containing body 110.
Then, in step S122, the image analyzer 132 obtains the height of the entity object 900 relative to the bottom 110b according to the size of the object image. For instance, the larger the object image, the lower the entity object 900 is relative to the bottom 110b; the smaller the object image, the higher the entity object 900 is relative to the bottom 110b.
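The inverse relation between image size and height in step S122 can be sketched with a simple pinhole-camera model. Everything below is illustrative: the parameter names, focal length and the object's real width are assumptions, not values from the patent.

```python
def estimate_height(apparent_width_px, real_width_cm, focal_length_px, opening_height_cm):
    """Estimate the object's height above the container bottom.

    A camera at the bottom looks up toward the opening, so a larger
    object image means the object is closer to the bottom (i.e. lower).
    By similar triangles, distance = focal_length * real_width / apparent_width.
    """
    distance_cm = focal_length_px * real_width_cm / apparent_width_px
    # the distance from the bottom camera IS the height above the bottom;
    # clamp it to the container interior
    return max(0.0, min(opening_height_cm, distance_cm))
```

With these hypothetical numbers, doubling the apparent size halves the estimated height, matching the text's inverse relation.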
Then, in step S123, the image analyzer 132 determines the horizontal position of the entity object 900 relative to the bottom 110b according to the position of the object image relative to the background image. For instance, when the position of the object image relative to the background image is near the upper-left corner, the horizontal position of the entity object 900 relative to the bottom 110b is toward the rear left.
Then, in step S124, the storage unit 150 provides several reference images of the entity object 900. These reference images are captured from different angles, for example six views: top, bottom, left, right, front and rear.
Then, in step S125, the image analyzer 132 obtains the rotation angle of the entity object 900 according to the comparison result of the object image with these reference images. The rotation angle can be any angle of the 360 degrees of three-dimensional space. For instance, when the object image contains a larger proportion of the front view and a smaller proportion of the left view, the rotation angle of the entity object 900 is deflected toward the left.
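Step S125 can be sketched as a nearest-neighbour comparison against the stored reference views. This is only an illustrative reduction of the matching described in the text; the pixelwise mean-squared-error metric and the function names are assumptions.

```python
import numpy as np

def estimate_rotation(obj_img, references):
    """Pick the stored reference view whose pixels best match the captured
    object image; the reference's known angle is the rotation estimate.

    `references` maps an angle (degrees) to a reference image array.
    """
    best_angle, best_err = None, float("inf")
    for angle, ref_img in references.items():
        err = float(np.mean((obj_img - ref_img) ** 2))  # mean squared error
        if err < best_err:
            best_angle, best_err = angle, err
    return best_angle
```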
Then, in step S130, the control module 140 obtains the first display parameter of the entity object 900 corresponding to the picture according to the first virtual role and the spatial parameter of the entity object 900. For instance, referring to Table 1 below, the control module 140 obtains each item of the first display parameter according to the first virtual role and the spatial parameter of the entity object 900.
Table 1 (reproduced as an image in the original publication)
Then, in step S140, the control module 140 illustrates the first virtual object in the picture according to the first display parameter. Taking Table 1 and FIG. 1B as an example, the control module 140 shows a reduced, front-view feeding-bottle pattern on the picture, and the feeding-bottle pattern is illustrated at a higher position toward the right of the picture.
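Table 1 itself is published only as an image, so the concrete mapping is not recoverable. The sketch below shows one hypothetical mapping for steps S130-S140 that is consistent with the surrounding text (a higher-held object is drawn higher and smaller); all names and coefficients are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class DisplayParams:
    x: float      # first display position, horizontal, as a fraction of screen width
    y: float      # first display position, vertical, as a fraction of screen height
    scale: float  # first display size factor

def to_display_params(role, height_cm, horiz, opening_height_cm=30.0):
    """Map detected spatial parameters to on-screen drawing parameters.

    Hypothetical rule: the higher the entity object is held, the higher
    and smaller its virtual object is drawn; `role` would select a sprite,
    only the geometry is computed here.
    """
    h = min(max(height_cm / opening_height_cm, 0.0), 1.0)
    return DisplayParams(x=horiz, y=h, scale=1.0 - 0.5 * h)
```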
Then, in step S150, the storage unit 150 provides a second virtual role and a second display parameter of a second virtual object illustrated in the picture. As shown in FIG. 1B, the second virtual object is, for example, a baby pattern.
Then, in step S160, the first display parameter or the second display parameter is changed according to the first virtual role, the second virtual role, the first display parameter or the second display parameter.
Referring to FIG. 5, which illustrates a detailed flowchart of step S160 of FIG. 2, step S160 of the present embodiment comprises steps S161-S162, and the second virtual object is a baby. First, in step S161, the control module 140 obtains a corresponding action of the second virtual object according to the first virtual role, the second virtual role and the first display parameter.
Then, in step S162, the control module 140 changes the second display parameter according to the corresponding action of the second virtual object. For instance, referring to Table 2 and FIGS. 6A-6C, which illustrate schematic diagrams of the corresponding action of the second virtual object: when the first virtual role is a feeding bottle and the first virtual object is located toward the right of the picture, the control module 140 changes the second display parameter of the second virtual object so that the baby pattern crawls toward the right and chases the feeding bottle to be fed.
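The "chase" behaviour of steps S161-S162 amounts to stepping the second virtual object toward the first each frame. A minimal sketch, with invented names:

```python
def chase_step(chaser_pos, target_pos, speed):
    """Move the second virtual object one step toward the first
    (the 'baby crawls toward the bottle' action)."""
    dx, dy = target_pos[0] - chaser_pos[0], target_pos[1] - chaser_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed:          # close enough: land exactly on the target
        return target_pos
    return (chaser_pos[0] + speed * dx / dist,
            chaser_pos[1] + speed * dy / dist)
```

Calling this once per frame makes the chaser converge on the target regardless of where either object is drawn.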
Table 2 (reproduced as an image in the original publication)
With the computer system 100 and control method thereof of the present embodiment described above, while the user manipulates the entity object 900, the first virtual object on the picture operates in synchronization with the entity object 900, and the entity object 900 can also interact directly with the second virtual object on the picture, which is considerably realistic. Interest is promoted especially when the method is applied to products such as games or advertisements.
Second embodiment
Referring to FIGS. 7-8, FIG. 7 illustrates a schematic diagram of the computer system 200 of the second embodiment of the invention, and FIG. 8 illustrates a detailed flowchart of step S110 of the second embodiment of the invention. The computer system 200 of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit 230 and the detailed process of step S110; the remaining similarities are not repeated.
In the present embodiment, the entity object 900 has a radio frequency identification (RFID) tag 920, and the RFID tag 920 is directly embedded in the entity object 900. The detecting unit 230 comprises an RFID scanner 231 and an RFID analyzer 232. Step S110 comprises steps S211-S212. In step S211, when the entity object 900 passes through the opening 110a, the RFID scanner 231 scans the RFID tag 920 of the entity object 900.
Then, in step S212, the RFID analyzer 232 obtains the first virtual role of the entity object 900 according to the identity data of the RFID tag 920. For instance, the storage unit 150 stores a comparison table of identity data and virtual roles in advance. The RFID analyzer 232 can directly determine from the comparison table which virtual role the identity data belongs to. If no suitable virtual role can be found in the comparison table, a new virtual role can also be defined directly.
Third embodiment
Referring to FIGS. 9-10, FIG. 9 illustrates a schematic diagram of the computer system 300 of the third embodiment of the invention, and FIG. 10 illustrates a detailed flowchart of step S110 of the third embodiment of the invention. The computer system 300 of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit 330 and the detailed process of step S110; the remaining similarities are not repeated.
In the present embodiment, the entity object 900 comprises an identification resistor 930, and the detecting unit 330 comprises two electrical contactors 331, 332 and a resistance analyzer 333. Different resistance values of the identification resistor 930 represent different first virtual roles. Step S110 comprises steps S311-S312. In step S311, when the entity object 900 passes through the opening 110a, the two electrical contactors 331, 332 contact and measure a resistance value of the identification resistor 930.
Then, in step S312, the resistance analyzer 333 obtains the first virtual role of the entity object 900 according to the resistance value. For instance, the storage unit 150 stores a comparison table of resistance values and virtual roles in advance. The resistance analyzer 333 can directly determine from the comparison table which virtual role the measured resistance value belongs to. If no suitable virtual role can be found in the comparison table, a new virtual role can also be defined directly.
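Step S312's comparison table can be sketched as a lookup with a measurement tolerance. The resistance values, role names and tolerance below are hypothetical; only the lookup-or-define-new-role logic comes from the text.

```python
ROLE_TABLE = [            # hypothetical resistance-to-role table (ohms)
    (1_000, "feeding bottle"),
    (2_200, "bear baby"),
    (4_700, "fishhook"),
]

def role_from_resistance(measured_ohms, tolerance=0.05):
    """Match a measured resistance to a role within a relative tolerance;
    an unmatched value defines a new role, as the text allows."""
    for nominal, role in ROLE_TABLE:
        if abs(measured_ohms - nominal) <= tolerance * nominal:
            return role
    return "new role"
```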
Fourth embodiment
Referring to FIGS. 11A-12, FIG. 11A illustrates a schematic diagram of the computer system 400 of the fourth embodiment of the invention, FIG. 11B illustrates a top view of the containing body 110 and the infrared generators 431, and FIG. 12 illustrates a detailed flowchart of step S120 of the fourth embodiment of the invention. The computer system 400 of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit 430 and the detailed process of step S120; the remaining similarities are not repeated.
In the present embodiment, the detecting unit 430 comprises a plurality of infrared generators 431 and an infrared analyzer 432. The infrared generators 431 are distributed inside the containing body 110. Steps S121-S123 of the first embodiment are replaced by steps S421-S422 of the present embodiment. First, in step S421, after the entity object 900 enters the containing body 110, the infrared generators 431 provide several infrared rays. As shown in FIG. 11A, the infrared generators 431 are distributed at different heights. As shown in FIG. 11B, the infrared generators 431 at the same height are distributed in a matrix.
Then, in step S422, the infrared analyzer 432 obtains the spatial position of the entity object 900 according to which of these infrared rays are interrupted.
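Step S422 can be sketched by taking the centroid of the interrupted beams in each direction and the highest interrupted layer. The grid geometry and spacing below are assumptions for illustration.

```python
def locate_from_blocked_beams(blocked_rows, blocked_cols, blocked_layers,
                              cell_cm=2.0, layer_cm=5.0):
    """Infer the object's position from which IR beams are interrupted.

    Assumes a row/column beam grid at each height layer: the centroids of
    the blocked columns/rows give x and y, and the highest blocked layer
    gives the height.  Returns (x_cm, y_cm, z_cm), or None if no beam is
    interrupted (nothing inside the containing body).
    """
    if not (blocked_rows and blocked_cols and blocked_layers):
        return None
    x = cell_cm * sum(blocked_cols) / len(blocked_cols)
    y = cell_cm * sum(blocked_rows) / len(blocked_rows)
    z = layer_cm * max(blocked_layers)
    return (x, y, z)
```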
Fifth embodiment
Referring to FIGS. 13-14, FIG. 13 illustrates a schematic diagram of the computer system 500 of the fifth embodiment of the invention, and FIG. 14 illustrates a detailed flowchart of step S120 of the fifth embodiment of the invention. The computer system 500 of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit 530 and the detailed process of step S120; the remaining similarities are not repeated.
In the present embodiment, the detecting unit 530 comprises an ultrasonic generator 531 and an ultrasonic analyzer 532. Steps S121-S123 of the first embodiment are replaced by steps S521-S522 of the present embodiment. First, in step S521, after the entity object 900 enters the containing body 110, the ultrasonic generator 531 provides an ultrasonic wave.
Then, in step S522, the ultrasonic analyzer 532 obtains the spatial position of the entity object 900 according to the reflection of the ultrasonic wave.
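Step S522 is essentially a time-of-flight measurement: half the echo's round-trip delay, multiplied by the speed of sound, gives the distance to the object. The transducer placement and parameter names below are assumptions.

```python
SPEED_OF_SOUND_CM_PER_S = 34_300  # ~343 m/s in air at about 20 degrees C

def height_from_echo(echo_delay_s, transducer_height_cm):
    """Assume a transducer at the opening pings downward: half the
    round-trip delay times the speed of sound is the distance to the
    object, and the object's height above the bottom follows from the
    transducer's mounting height."""
    distance_cm = SPEED_OF_SOUND_CM_PER_S * echo_delay_s / 2.0
    return max(0.0, transducer_height_cm - distance_cm)
```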
Sixth embodiment
Referring to FIG. 15, which illustrates a detailed flowchart of step S160 of the sixth embodiment of the invention, the computer system of the present embodiment differs from the computer system 100 of the first embodiment in the detecting unit and the detailed process of step S160; the remaining similarities are not repeated.
In the present embodiment, the first display parameter comprises a first display position, a first display size, a first moving direction and a first moving speed; the second display parameter comprises a second display position, a second display size, a second moving direction and a second moving speed; and step S160 comprises steps S661-S662, wherein step S661 comprises steps S6611-S6614. In step S661, the control module 140 judges whether the first virtual object and the second virtual object collide according to the first display position, the first display size, the second display position and the second display size. If the first virtual object and the second virtual object collide, the method proceeds to step S662; if not, the method returns to step S661.
Referring to FIG. 16, which illustrates a detailed flowchart of step S661 of FIG. 15, step S661 further comprises steps S6611-S6614. First, in step S6611, the control module 140 obtains a first bounding circle according to the first display position and the first display size. The first bounding circle is a circle that can completely contain the first virtual object.
Then, in step S6612, the control module 140 obtains a second bounding circle according to the second display position and the second display size. The second bounding circle is a circle that can completely contain the second virtual object.
Then, in step S6613, the control module 140 judges whether the first bounding circle and the second bounding circle intersect. If they intersect, the method proceeds to step S6614; if not, the method returns to step S6613.
Then, in step S6614, the control module 140 determines that the first virtual object and the second virtual object collide.
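Steps S6611-S6614 can be sketched directly: build a bounding circle for each virtual object and test whether the circles intersect. Treating positions as (x, y) centres and sizes as (width, height) pairs is an assumption about the display parameters' form.

```python
def bounding_circle(pos, size):
    """Smallest circle fully containing a sprite of the given size
    centred at pos: returns (cx, cy, radius) -- steps S6611/S6612."""
    w, h = size
    return (pos[0], pos[1], ((w / 2) ** 2 + (h / 2) ** 2) ** 0.5)

def circles_intersect(c1, c2):
    """Circles intersect when the centre distance is at most the sum of
    the radii (step S6613); compared squared to avoid a square root."""
    x1, y1, r1 = c1
    x2, y2, r2 = c2
    return (x2 - x1) ** 2 + (y2 - y1) ** 2 <= (r1 + r2) ** 2

def objects_collide(pos1, size1, pos2, size2):
    """Steps S6611-S6614: build both bounding circles and test them."""
    return circles_intersect(bounding_circle(pos1, size1),
                             bounding_circle(pos2, size2))
```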
In step S662, the control module 140 changes the first moving direction, the first moving speed, the second moving direction and the second moving speed according to the first moving direction, the first moving speed, the second moving direction and the second moving speed. For instance, if the first moving direction and the second moving direction are nearly parallel and in the same sense, the struck object (the first virtual object or the second virtual object) is accelerated. Alternatively, when the first moving direction and the second moving direction are not parallel, the two mutually struck objects (the first virtual object and the second virtual object) change their moving directions and moving speeds according to the computed result.
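A hedged sketch of step S662's two cases. The text gives no exact formulas, so the choices below (speeding up the second object by an arbitrary factor in the near-parallel case, and swapping velocities as in an equal-mass elastic collision otherwise) are illustrative assumptions.

```python
import math

def resolve_collision(v1, v2, parallel_threshold=0.95):
    """Return the updated (v1, v2) velocity pair after a collision.

    Near-parallel, same-sense directions: accelerate the struck object
    (assumed here to be the second).  Otherwise: exchange velocities,
    as in a head-on elastic collision between equal masses.
    """
    n1 = math.hypot(*v1) or 1.0
    n2 = math.hypot(*v2) or 1.0
    cos = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    if cos >= parallel_threshold:              # near-parallel, same sense
        return v1, (v2[0] * 1.5, v2[1] * 1.5)  # speed up the struck object
    return v2, v1                              # otherwise swap velocities
```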
Seventh embodiment
Referring to FIG. 17, which illustrates a detailed flowchart of step S160 of the seventh embodiment of the invention, the computer system of the present embodiment differs from the computer system of the sixth embodiment in the detailed process of step S160; the remaining similarities are not repeated.
In the present embodiment, steps S161-S162 of the first embodiment are replaced by steps S761-S763, the first virtual object is, for example, a combination of a fishing line and bait, and the second virtual object is, for example, a goldfish. First, in step S761, the control module 140 drives the second virtual object to move toward the first virtual object according to the first display position and the second display position.
Then, in step S762, the control module 140 judges whether the first virtual object and the second virtual object collide. If they collide, the method proceeds to step S763; if not, the method returns to step S762.
Then, in step S763, if the first virtual object and the second virtual object collide, a predetermined event is triggered. The predetermined event is, for example, that when the goldfish touches the bait, the goldfish is hooked.
Eighth embodiment
Referring to FIG. 18, which illustrates a detailed flowchart of step S160 of the eighth embodiment of the invention, the computer system of the present embodiment differs from the computer system of the first embodiment in the detailed process of step S160; the remaining similarities are not repeated.
In the present embodiment, steps S161-S162 of the first embodiment are replaced by steps S861-S862, and the second virtual object is a baby. First, in step S861, the control module 140 obtains a change reaction of the second display parameter of the second virtual object according to the first virtual role and the second virtual role.
Then, in step S862, the control module 140 changes the second display parameter according to the change reaction of the second display parameter. For instance, referring to Table 3, when the first virtual role is a doggie, the moving speed of the baby pattern of the second virtual object is increased by a factor of 1.5.
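Table 3 is published only as an image, so the sketch below invents a small reaction table containing the one rule the text states (a doggie makes the baby pattern move 1.5 times faster); all other entries and names are hypothetical.

```python
REACTIONS = {   # hypothetical version of Table 3
    ("doggie", "baby"): {"speed_factor": 1.5},         # baby speeds up
    ("feeding bottle", "baby"): {"speed_factor": 1.0},  # no speed change
}

def apply_reaction(first_role, second_role, second_params):
    """Scale the second object's moving speed per the reaction table;
    role pairs with no table entry leave the speed unchanged."""
    rule = REACTIONS.get((first_role, second_role), {"speed_factor": 1.0})
    out = dict(second_params)
    out["speed"] = out["speed"] * rule["speed_factor"]
    return out
```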
Table 3 (reproduced as an image in the original publication)
Through the detection and control procedure, the computer system and control method thereof disclosed in the above embodiments of the invention enable the computer system to fully reflect the position, size and angle of the entity object in the virtual picture, increasing the realism of the virtual reality. Moreover, the entity object can interact with the virtual objects illustrated in the picture, adding considerable interest.
In summary, although the present invention has been disclosed above with preferred embodiments, they are not intended to limit the invention. Those skilled in the art of the invention may make various modifications and variations without departing from the spirit and scope of the invention. Therefore, the protection scope of the invention shall be defined by the appended claims.

Claims (34)

1. A control method of a computer system, the computer system comprising a containing body and a display unit, the containing body having an opening through which an entity object enters the containing body, the display unit being used for displaying a picture, the control method comprising:
recognizing a first virtual role corresponding to the entity object;
detecting a spatial parameter of the entity object corresponding to the containing body;
obtaining a first display parameter of the entity object corresponding to the picture according to the first virtual role and the spatial parameter; and
illustrating a first virtual object in the picture according to the first display parameter.
2. The control method of the computer system as claimed in claim 1, wherein the step of recognizing the first virtual role comprises:
capturing an object image of the entity object; and
obtaining the first virtual role of the entity object according to the object image.
3. The control method of the computer system as claimed in claim 1, wherein the entity object has a radio frequency identification (RFID) tag, and the step of recognizing the first virtual role comprises:
scanning the RFID tag of the entity object; and
obtaining the first virtual role of the entity object according to the RFID tag.
4. The control method of the computer system as claimed in claim 1, wherein the entity object comprises an identification resistor, and the step of recognizing the first virtual role comprises:
measuring a resistance value of the identification resistor; and
obtaining the first virtual role of the entity object according to the resistance value.
5. The control method of the computer system as claimed in claim 1, wherein the spatial parameter comprises a spatial position of the entity object within the containing body.
6. The control method of the computer system as claimed in claim 5, wherein the step of detecting the spatial parameter comprises:
capturing an object image of the entity object from a bottom of the containing body; and
obtaining a height of the entity object relative to the bottom according to a size of the object image.
7. The control method of the computer system as claimed in claim 5, wherein the step of detecting the spatial parameter comprises:
capturing an object image of the entity object and a background image of the opening from a bottom of the containing body; and
obtaining a horizontal position of the entity object relative to the bottom according to a position of the object image relative to the background image.
8. The control method of the computer system as claimed in claim 5, wherein the step of detecting the spatial parameter comprises:
providing a plurality of infrared rays; and
obtaining the spatial position of the entity object according to which of the infrared rays are interrupted.
9. The control method of the computer system as claimed in claim 5, wherein the step of detecting the spatial parameter comprises:
providing an ultrasonic wave; and
obtaining the spatial position of the entity object according to a reflection of the ultrasonic wave.
10. The control method of the computer system as claimed in claim 1, wherein the spatial parameter comprises a rotation angle of the entity object.
11. the control method of computer system as claimed in claim 10, the step that wherein detects this spatial parameter comprises:
Many that this entity object is provided with reference to image, and those are captured by different angles with reference to image;
Capture an object image of this entity object; And
According to this object image and those comparison results, obtain this anglec of rotation of this entity object with reference to image.
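Claim 11 matches the live image against pre-captured reference views. A toy sketch of that comparison, assuming equally sized grayscale images represented as 2-D lists and a simple sum-of-absolute-differences metric (the patent does not name a metric):

```python
# Sketch of claim 11: the rotation angle is taken from whichever
# pre-captured reference image differs least from the live object image.
def best_angle(object_img, references):
    """references: dict mapping angle (degrees) -> reference image."""
    def diff(a, b):  # sum of absolute pixel differences
        return sum(abs(pa - pb) for row_a, row_b in zip(a, b)
                   for pa, pb in zip(row_a, row_b))
    return min(references, key=lambda ang: diff(object_img, references[ang]))

# Tiny 2x2 "images" standing in for real reference captures:
refs = {0: [[0, 0], [9, 9]], 90: [[9, 0], [9, 0]], 180: [[9, 9], [0, 0]]}
print(best_angle([[8, 8], [1, 0]], refs))  # closest to the 180-degree reference
```

Angular resolution here is limited by how many reference angles were captured; a real system would interpolate or use a rotation-invariant feature match instead.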
12. The control method of the computer system as claimed in claim 10, wherein the entity object comprises an acceleration sensor (G-sensor), and the step of detecting the spatial parameter comprises:
Detecting a variation of the acceleration sensor; and
Obtaining the rotation angle of the entity object according to the variation of the acceleration sensor.
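One standard way to turn G-sensor readings into a rotation angle, consistent with claim 12, is to measure tilt against gravity while the object is at rest. A sketch; the axis assignment and sign convention are assumptions, not from the patent:

```python
import math

# Sketch of claim 12: at rest the accelerometer senses only gravity,
# so tilt about one axis follows from the ratio of sensed components.
def tilt_deg(ax, az):
    """Rotation angle about the Y axis, from X and Z acceleration in g."""
    return math.degrees(math.atan2(ax, az))

print(tilt_deg(0.0, 1.0))  # lying flat -> 0.0
print(tilt_deg(1.0, 0.0))  # tipped onto its side -> about 90 degrees
```

This static-tilt approach cannot sense rotation about the gravity axis itself; that is presumably why claim 11 offers the image-comparison alternative.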
13. The control method of the computer system as claimed in claim 1, wherein the picture further displays a second virtual object corresponding to a second virtual role, and the control method further comprises:
Providing second display parameters of the second virtual object; and
Changing the first display parameters or the second display parameters according to the first virtual role, the second virtual role, the first display parameters, or the second display parameters.
14. The control method of the computer system as claimed in claim 13, wherein the first display parameters comprise a first display position, a first display size, a first moving direction, and a first moving speed; the second display parameters comprise a second display position, a second display size, a second moving direction, and a second moving speed; and the step of changing the first display parameters or the second display parameters comprises:
Judging whether the first virtual object and the second virtual object collide according to the first display position, the first display size, the second display position, and the second display size; and
If the first virtual object and the second virtual object collide, changing the first moving direction, the first moving speed, the second moving direction, and the second moving speed according to the first moving direction, the first moving speed, the second moving direction, and the second moving speed.
15. The control method of the computer system as claimed in claim 14, wherein the step of judging whether the first virtual object and the second virtual object collide comprises:
Obtaining a first bounding circle according to the first display position and the first display size;
Obtaining a second bounding circle according to the second display position and the second display size;
Judging whether the first bounding circle and the second bounding circle intersect; and
If the first bounding circle and the second bounding circle intersect, determining that the first virtual object and the second virtual object collide.
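The collision test of claims 14 and 15 reduces to a circle-intersection check: two circles intersect when the distance between their centres does not exceed the sum of their radii. A sketch, assuming the display position is the object's top-left corner and the radius is taken from the larger display dimension (the patent does not fix these details):

```python
import math

# Sketch of claims 14-15: derive a bounding circle from each object's
# display position and size, then declare a collision when the circles
# intersect (centre distance <= sum of radii).
def bounding_circle(pos, size):
    """pos: (x, y) top-left display position; size: (w, h) display size.
    Returns (cx, cy, radius)."""
    x, y = pos
    w, h = size
    return (x + w / 2, y + h / 2, max(w, h) / 2)

def collide(pos1, size1, pos2, size2):
    cx1, cy1, r1 = bounding_circle(pos1, size1)
    cx2, cy2, r2 = bounding_circle(pos2, size2)
    return math.hypot(cx1 - cx2, cy1 - cy2) <= r1 + r2

print(collide((0, 0), (10, 10), (6, 0), (10, 10)))   # True: circles overlap
print(collide((0, 0), (10, 10), (40, 0), (10, 10)))  # False: far apart
```

Bounding circles over-approximate non-circular sprites, trading occasional false positives for a constant-time test per object pair.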
16. The control method of the computer system as claimed in claim 13, wherein the step of changing the first display parameters or the second display parameters comprises:
Obtaining a responsive action of the second virtual object according to the first virtual role, the second virtual role, and the first display parameters; and
Changing the second display parameters according to the responsive action of the second virtual object.
17. The control method of the computer system as claimed in claim 13, wherein the first display parameters comprise a first display position and a first display size, the second display parameters comprise a second display position and a second display size, and the step of changing the first display parameters or the second display parameters comprises:
Driving the second virtual object to move toward the first virtual object according to the first display position and the second display position;
Judging whether the first virtual object and the second virtual object collide according to the first display position, the first display size, the second display position, and the second display size; and
If the first virtual object and the second virtual object collide, triggering a predetermined event.
18. A computer system, comprising:
A containing body having an opening through which an entity object enters the containing body;
A display unit for displaying a picture;
A detecting unit for recognizing a first virtual role corresponding to the entity object and for detecting a spatial parameter of the entity object corresponding to the containing body; and
A control unit for obtaining first display parameters of the entity object corresponding to the picture according to the first virtual role and the spatial parameter, and for drawing a first virtual object on the picture according to the first display parameters.
19. The computer system as claimed in claim 18, wherein the detecting unit comprises:
An image capture device for capturing an object image of the entity object; and
An image analyzer for obtaining the first virtual role of the entity object according to the object image.
20. The computer system as claimed in claim 18, wherein the entity object has a radio frequency identification (RFID) tag, and the detecting unit comprises:
An RFID scanner for scanning the RFID tag of the entity object; and
An RFID analyzer for obtaining the first virtual role of the entity object according to the RFID tag.
21. The computer system as claimed in claim 18, wherein the entity object comprises an identification resistor, and the detecting unit comprises:
Two electrical contacts for contacting the identification resistor and measuring a resistance value thereof; and
A resistance analyzer for obtaining the first virtual role of the entity object according to the resistance value.
22. The computer system as claimed in claim 18, wherein the spatial parameter comprises a spatial position of the entity object within the containing body.
23. The computer system as claimed in claim 22, wherein the detecting unit comprises:
An image capture device for capturing an object image of the entity object from a bottom of the containing body; and
An image analyzer for obtaining a height of the entity object relative to the bottom according to a size of the object image.
24. The computer system as claimed in claim 22, wherein the detecting unit comprises:
An image capture device for capturing, from the bottom of the containing body, an object image of the entity object and a background image of the opening; and
An image analyzer for obtaining a horizontal position of the entity object relative to the bottom according to a position of the object image relative to the background image.
25. The computer system as claimed in claim 22, wherein the detecting unit comprises:
A plurality of infrared generators for providing a plurality of infrared beams; and
An infrared analyzer for obtaining the spatial position of the entity object according to which of the infrared beams are blocked.
26. The computer system as claimed in claim 22, wherein the detecting unit comprises:
An ultrasonic generator for providing an ultrasonic wave; and
An ultrasonic analyzer for obtaining the spatial position of the entity object according to a reflection of the ultrasonic wave.
27. The computer system as claimed in claim 18, wherein the spatial parameter comprises a rotation angle of the entity object.
28. The computer system as claimed in claim 27, further comprising a storage unit for storing a plurality of reference images of the entity object, the reference images being captured from different angles, wherein the detecting unit comprises:
An image capture device for capturing an object image of the entity object; and
An image analyzer for obtaining the rotation angle of the entity object according to a comparison result between the object image and the reference images.
29. The computer system as claimed in claim 27, wherein the entity object comprises an acceleration sensor (G-sensor) and a wireless transmitter, and the detecting unit comprises:
A wireless receiver for receiving a variation of the acceleration sensor; and
An acceleration analyzer for obtaining the rotation angle of the entity object according to the variation of the acceleration sensor.
30. The computer system as claimed in claim 18, wherein the picture further displays a second virtual object corresponding to a second virtual role, the second virtual object has second display parameters, and the control unit changes the first display parameters or the second display parameters according to the first virtual role, the second virtual role, the first display parameters, or the second display parameters.
31. The computer system as claimed in claim 30, wherein the first display parameters comprise a first display position, a first display size, a first moving direction, and a first moving speed; the second display parameters comprise a second display position, a second display size, a second moving direction, and a second moving speed; the control unit judges whether the first virtual object and the second virtual object collide according to the first display position, the first display size, the second display position, and the second display size; and if the first virtual object and the second virtual object collide, the control unit changes the first moving direction, the first moving speed, the second moving direction, and the second moving speed according to the first moving direction, the first moving speed, the second moving direction, and the second moving speed.
32. The computer system as claimed in claim 31, wherein the control unit obtains a first bounding circle according to the first display position and the first display size, obtains a second bounding circle according to the second display position and the second display size, and judges whether the first bounding circle and the second bounding circle intersect; if the first bounding circle and the second bounding circle intersect, the control unit determines that the first virtual object and the second virtual object collide.
33. The computer system as claimed in claim 30, wherein the control unit obtains a responsive action of the second virtual object according to the first virtual role, the second virtual role, and the first display parameters, and changes the second display parameters according to the responsive action of the second virtual object.
34. The computer system as claimed in claim 30, wherein the first display parameters comprise a first display position and a first display size, the second display parameters comprise a second display position and a second display size, the control unit drives the second virtual object to move toward the first virtual object according to the first display position and the second display position, and judges whether the first virtual object and the second virtual object collide according to the first display position, the first display size, the second display position, and the second display size; if the first virtual object and the second virtual object collide, the control unit triggers a predetermined event.
CN200810172803XA 2008-10-29 2008-10-29 Computer system and control method thereof Expired - Fee Related CN101727175B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810172803XA CN101727175B (en) 2008-10-29 2008-10-29 Computer system and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200810172803XA CN101727175B (en) 2008-10-29 2008-10-29 Computer system and control method thereof

Publications (2)

Publication Number Publication Date
CN101727175A true CN101727175A (en) 2010-06-09
CN101727175B CN101727175B (en) 2012-07-04

Family

ID=42448156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810172803XA Expired - Fee Related CN101727175B (en) 2008-10-29 2008-10-29 Computer system and control method thereof

Country Status (1)

Country Link
CN (1) CN101727175B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004199496A (en) * 2002-12-19 2004-07-15 Sony Corp Information processor and method, and program
CN100476701C (en) * 2006-05-23 2009-04-08 郭超逸 Photoelectric keyboard

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019228030A1 (en) * 2018-06-01 2019-12-05 腾讯科技(深圳)有限公司 Information prompting method and device, storage medium and electronic device
US11439906B2 (en) 2018-06-01 2022-09-13 Tencent Technology (Shenzhen) Company Limited Information prompting method and apparatus, storage medium, and electronic device

Also Published As

Publication number Publication date
CN101727175B (en) 2012-07-04

Similar Documents

Publication Publication Date Title
US20220382379A1 (en) Touch Free User Interface
US8803801B2 (en) Three-dimensional interface system and method
US7170492B2 (en) Interactive video display system
CN102141885B (en) Image processing device and image processing method
US20060139314A1 (en) Interactive video display system
JP4927148B2 (en) Computer system and control method thereof
CN103246351A (en) User interaction system and method
CN103999018B (en) The user of response three-dimensional display object selects the method and system of posture
WO2005091651A2 (en) Interactive video display system
EP1517228A2 (en) Gesture recognition method and touch system incorporating the same
JP4323180B2 (en) Interface method, apparatus, and program using self-image display
CN103455212A (en) Intelligent mirror cum display solution
CN103858074A (en) System and method for interfacing with a device via a 3d display
US9253468B2 (en) Three-dimensional (3D) user interface method and system
NZ525717A (en) A method of tracking an object of interest using multiple cameras
WO2015034973A1 (en) Dynamic displays based on user interaction states
CN102893293A (en) Position capture input apparatus, system, and method therefor
CN105320265B (en) Control method of electronic device
EP3370134B1 (en) Display device and user interface displaying method thereof
CN103197861A (en) Display control device
CN108021227B (en) Method for rapidly moving in virtual reality and virtual reality device
CN113238705A (en) Virtual keyboard interaction method and system
CN101727175B (en) Computer system and control method thereof
CN106293329A (en) A kind of in terminal, present the method for interface element array, device and terminal
Pinhanez et al. Applications of steerable projector-camera systems

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120704

Termination date: 20201029

CF01 Termination of patent right due to non-payment of annual fee