CN101388138B - Interaction image system, interaction apparatus and operation method thereof - Google Patents

Interaction image system, interaction apparatus and operation method thereof Download PDF

Info

Publication number
CN101388138B
CN101388138B (application CN2007101538261A / CN200710153826A)
Authority
CN
China
Prior art keywords
image
module
interactive device
reference point
main frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2007101538261A
Other languages
Chinese (zh)
Other versions
CN101388138A (en)
Inventor
林卓毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN2007101538261A priority Critical patent/CN101388138B/en
Publication of CN101388138A publication Critical patent/CN101388138A/en
Application granted granted Critical
Publication of CN101388138B publication Critical patent/CN101388138B/en

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The invention relates to an interactive image system comprising a host, at least one reference point and an interactive device. The host has a first wireless module and a storage device for accessing software, and the reference point generates light of a predetermined spectrum. The interactive device comprises a second wireless module, an image module, a modulation module and a processing unit. The image module captures images of the reference point at a sampling frequency to form a first image and a second image, and calculates and outputs the motion vector of the reference-point image between the first image and the second image. The processing unit controls the second wireless module to transmit the motion vector to the first wireless module, and controls the modulation module to modulate the sampling frequency of the image module in real time. The invention further provides an interactive device and an operating method thereof.

Description

Interactive image system, interactive device and operating method thereof
Technical field
The present invention relates to an interactive image system, an interactive device and an operating method thereof, and particularly to an interactive image system, interactive device and operating method thereof that save electric energy by changing the sampling frequency of the image module in the interactive device.
Background technology
An existing game remote device, such as the device of Taiwan, China patent I267754, entitled "Pointer positioning device with camera", can be applied to light-gun games. The pointer positioning device has a control circuit arranged in the camera-based pointing device, and the control circuit is connected to a camera, a computing unit and a communication interface. The communication interface connects to a host, a filter is disposed in front of the camera, and a plurality of light-emitting elements that can be captured by the camera are arranged on the screen. When the user operates the pointing device to control an operation performed by the host, the camera captures the display screen; because the filter blocks light outside the spectrum emitted by the light-emitting elements, only the light sources of the light-emitting elements appear in the captured picture. The picture is transferred to the computing unit, which calculates the coordinates of the camera's aiming point and sends them to the host, so that the host can perform the corresponding control with these coordinates.
In practical use, however, to make the remote device convenient to operate, the remote device communicates with the host wirelessly and a battery module supplies all of the electric energy the remote device needs. Because the remote device contains several power-consuming elements, the power consumption of each element must be reduced as much as possible to extend the life of the battery module. To improve the precision with which the computing unit calculates the aiming-point coordinates, the camera preferably acquires images at a higher frame rate; but a high acquisition rate increases the computational load of the computing unit, which in turn increases the overall power consumption of the remote device and shortens the service life of the battery module.
For these reasons, it is necessary to further improve the way the above remote device operates, so as to reduce the overall power consumption of the remote device and extend the service life of the battery module.
Summary of the invention
The purpose of the invention is to provide an interactive image system, an interactive device and an operating method thereof that modulate the sampling frequency of the image module in the interactive device in real time so as to reduce the overall power consumption of the interactive device.
To achieve the above purpose, the invention provides an interactive image system comprising a host, at least one reference point and an interactive device. The host comprises a first wireless module and a storage device for accessing software. The reference point generates light of a predetermined spectrum, for example, but not limited to, infrared or ultraviolet light. The interactive device comprises a second wireless module, an image module, a modulation module and a processing unit. The second wireless module transmits data to and from the first wireless module. The image module captures images of the reference point at a sampling frequency to form a first image and a second image, and calculates and outputs the motion vector of the reference-point image between the first image and the second image. The modulation module modulates the sampling frequency of the image module. The processing unit is coupled to the image module, controls the second wireless module to transmit the motion vector to the first wireless module, and controls the modulation module to modulate the sampling frequency of the image module in real time according to a predetermined condition.
According to another feature of the present invention, an interactive device for use in an interactive image system is also provided. The interactive image system comprises the interactive device, a host and at least one reference point that generates light of a predetermined spectrum, and the interactive device is connected by wireless communication to a wireless module of the host. The interactive device comprises a wireless module, an image module, a modulation module and a processing unit. The wireless module transmits data to and from the wireless module of the host. The image module captures images of the reference point at a sampling frequency to form a first image and a second image, and calculates and outputs the motion vector of the reference-point image between the first image and the second image. The modulation module modulates the sampling frequency of the image module. The processing unit is coupled to the image module, controls the wireless module to transmit the motion vector to the wireless module of the host, and controls the modulation module to modulate the sampling frequency of the image module in real time according to a predetermined condition.
The present invention further provides an operating method of an interactive device. The interactive device is used in an interactive image system that comprises the interactive device, a host and at least one reference point generating light of a predetermined spectrum, and the interactive device is connected by wireless communication to a wireless module of the host. The operating method comprises the following steps: providing an image module that captures images of the reference point at a sampling frequency to form a first image and a second image, and that calculates and outputs the motion vector of the reference-point image between the first image and the second image; providing a wireless module that transmits data to and from the wireless module of the host; and providing a processing unit that controls the wireless module to transmit the motion vector to the wireless module of the host and that modulates the sampling frequency of the image module in real time according to a predetermined condition.
The interactive image system and the interactive device of the present invention may also comprise a motion sensing module that senses the state of the interactive device and generates an electric signal, for example, but not limited to, a potential-difference signal or a current signal. The processing unit then calculates the acceleration of the interactive device from the electric signal and controls the wireless module to transmit the acceleration to the host.
In the interactive image system, interactive device and operating method of the present invention described above, the predetermined condition comprises: a frequency-select signal transmitted from the host to the interactive device, which may be determined by the software accessed from the storage device of the host; the motion vector of the reference-point image between the first image and the second image output by the image module of the interactive device; and/or the acceleration of the interactive device obtained by the processing unit from the electric signal. The processing unit may control the modulation module to modulate the sampling frequency of the image module in real time according to one or a combination of the above predetermined conditions, so as to reduce the overall energy consumption of the interactive device.
According to the interactive image system, interactive device and operating method of the present invention, in order for the image module to obtain the motion vector of the reference-point image between the first image and the second image, its sampling frequency is preferably higher than 60 frames per second; depending on the hardware transmission speed, the sampling frequency can reach 200 frames per second.
According to the interactive image system and interactive device of the present invention, the host may be a game console or a computer system host; the software may be game software or computer software; and the interactive device may be a remote controller or a pointer positioning device.
Description of drawings
Fig. 1 is a schematic diagram of an embodiment of the interactive image system provided by the invention.
Fig. 2 is a structural diagram of an embodiment of the interactive device provided by the invention.
Fig. 3 is an operational diagram of the interactive device in an embodiment of the invention.
Fig. 4 is a flow chart of the operating method in an embodiment of the invention.
Fig. 5 is a flow chart of controlling, with the interactive device, the movement of the cursor on the image display in an embodiment of the invention.
Fig. 6a shows the digital image sensed when the interactive device in an embodiment of the invention is rotated clockwise by an angle while capturing images.
Fig. 6b shows the digital image sensed when the interactive device in an embodiment of the invention is rotated clockwise by more than 180 degrees while capturing images.
Fig. 7 shows the digital images sensed when the interactive device in an embodiment of the invention captures images at different distances.
Fig. 8 shows the digital image sensed when the pointing position of the interactive device in an embodiment of the invention is moved.
Fig. 9 shows, in the interactive image system of an embodiment of the invention, the proportional relationship between the imaging distance of the image sensing module and the distance between the image sensing module and the screen.
Fig. 10 is a schematic diagram, in the interactive image system of an embodiment of the invention, of the position change of the cursor on the screen of the image display and the position change of the reference-point image on the sensing array.
Fig. 11 is a graph, in the interactive image system of an embodiment of the invention, of the relationship between the sampling frequency of the image module and the movement of the cursor.
Main reference numerals
10 host; 12, 14 reference points
16 first wireless module; 18 storage device
20 interactive device; 21 image module
210 optical filtering module; 212 image sensing module
214 image processing module; 22 processing unit
23 modulation module; 24 motion sensing module
25 second wireless module; 26 multiplexer
80 image display; 801 screen
802 cursor; 90 optical disc
L, l spacing of the reference-point images; θ rotation angle
ΔS motion vector; ΔS_X horizontal component of the motion vector
ΔS_Y vertical component of the motion vector; SA sensing array
A image sensing range; DI digital image
B1–B4 steps; 150–720 steps
f imaging distance of the image sensing module; X_scale, Y_scale scale parameters
F distance between the image sensing module and the screen
D, d distance between the average coordinate of the reference-point images and the center of the sensing array
i_12, i_14, i_12', i_14' reference-point images
I_12, I_14, I_12', I_14', I_12'', I_14'', I_12''', I_14''' reference-point images
(X, Y), (X_0, Y_0), (X_i, Y_i) average coordinates of the reference-point images
Embodiment
In order to make the above and other purposes, features and advantages of the present invention more apparent, embodiments of the present invention are described in detail below with reference to the accompanying drawings. In the description of this specification, similar elements are denoted by the same reference numerals, which is stated here in advance.
Referring to Fig. 1, which shows a schematic diagram of an embodiment of the interactive image system of the present invention, the system comprises a host 10, two reference points 12 and 14, an interactive device 20 and an image display 80. The host 10 may be, for example, a game console or a computer system host, and comprises a first wireless module 16 and a storage device 18; the storage device 18 accesses software stored, for example, on an optical disc 90, a USB flash drive or a memory device, and the execution state of the software is shown on the image display 80 for the user to control. A cursor 802 can be shown on the screen 801 of the image display 80; it may be, for example, but not limited to, the point of impact of a light gun or a cursor controlled by the software. The reference points 12, 14 may be, but are not limited to, reference points of various shapes formed by arranging light sources of specific wavelengths, such as a plurality of infrared light-emitting diodes (with a wavelength of, for example, 940 nanometers), laser diodes or ultraviolet light sources; the light sources can be electrically connected to the image display 80 or the host 10, or can be powered by an independent power supply. In addition, the number of reference points is not limited to two; there can be one or more reference points. The interactive device 20 captures images of the reference points 12, 14 to determine the change of the relative position and/or angle between the interactive device 20 and the reference points, and correspondingly controls the movement of the cursor 802 on the screen 801. The interactive device 20 is used to control the software executed by the host 10, for example game software or computer software. When the host 10 executes game software, the interactive device 20 can serve as, for example, but not limited to, a light gun, a cue, a golf club, a tennis racket, a bat, a racket or a table-tennis paddle, so as to control the game; when the host 10 executes computer software, the interactive device 20 can serve as, for example, a pointer (cursor) positioning device, so as to control the computer software.
As shown in Figs. 1, 2 and 3, Fig. 2 shows the structure of an embodiment of the interactive device 20 provided by the invention. The interactive device 20 comprises an image module 21, a processing unit 22, a modulation module 23, a motion sensing module 24 and a second wireless module 25. The image module 21 comprises an optical filtering module 210, an image sensing module 212 and an image processing module 214. The optical filtering module 210 filters out light outside a specific spectrum (for example the spectrum of a specific wavelength such as infrared or ultraviolet), so that the image sensing module 212 receives only light from the reference points 12, 14 and interference from other light sources is excluded. The image sensing module 212 has a sensing array (not shown); an image sensing range "A" (Fig. 1) can be defined by its field of view and the illumination angle of the reference points 12, 14, and the module senses the light generated by the reference points 12, 14 at a sampling frequency (for example 60 to 200 frames per second) to form digital images. It should be understood that the image sensing range "A" in Fig. 1 is drawn only as an example and may in fact be wider. When the image sensing module 212 points into the image sensing range "A", only the images of the reference points 12, 14 appear in the digital image because the image module 21 has the optical filtering module 210. Embodiments of the image sensing module 212 include charge-coupled device (CCD) image sensors and complementary metal-oxide-semiconductor (CMOS) image sensors. The image processing module 214 calculates, from a plurality of digital images received from the image sensing module 212, the motion vector of the images of the reference points 12, 14 between those digital images, and outputs it to the processing unit 22. It should be understood that the digital images output by the image sensing module 212 can also be processed by the processing unit 22 to calculate the motion vector of the images of the reference points 12, 14 between the digital images.
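A minimal sketch of this kind of motion-vector computation is shown below. It assumes the filtered frames contain essentially only the bright reference-point pixels; the helper names (`reference_point_centroid`, `binarize_threshold`) and the centroid-difference approach are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def reference_point_centroid(frame, binarize_threshold=128):
    """Average coordinates of the bright (reference-point) pixels in a filtered frame."""
    ys, xs = np.nonzero(frame > binarize_threshold)
    if len(xs) == 0:
        return None  # no reference point visible in the sensing range
    return np.array([xs.mean(), ys.mean()])

def motion_vector(first_frame, second_frame):
    """Displacement of the reference-point image between the first and the second image."""
    p0 = reference_point_centroid(first_frame)
    p1 = reference_point_centroid(second_frame)
    if p0 is None or p1 is None:
        return np.zeros(2)
    return p1 - p0  # (dSx, dSy) in pixels on the sensing array
```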
Embodiments of the motion sensing module 24 include, but are not limited to, an acceleration sensor (accelerometer) and a gyro sensor, which sense the two-dimensional and/or three-dimensional acceleration change of the interactive device 20 and output a potential-difference signal or a current signal. For example, when the interactive device 20 is configured as a bat in a game, the motion sensing module 24 can sense the swinging state of the user (not shown) and generate an electric signal (a potential-difference signal or a current signal). The processing unit 22 calculates the acceleration of the interactive device 20 from the electric signal and controls the second wireless module 25 to transmit the acceleration to the first wireless module 16 of the host 10, so as to control the operation of the software accordingly.
The modulation module 23 modulates the sampling frequency of the image module 21 according to a predetermined condition. For example, in one embodiment the modulation module 23 modulates the sampling frequency through a multiplexer 26, as shown in Fig. 3. In order for the image module 21 to calculate the motion vector of the images of the reference points 12, 14 between a plurality of digital images, the sampling frequency of the image module 21 is preferably higher than 60 frames per second and, depending on the hardware transmission speed, can reach 200 frames per second. It should be understood that the modulation module 23 is not limited to being arranged inside the processing unit 22; Fig. 2 shows only one embodiment.
The predetermined conditions used by the present invention to modulate the sampling frequency of the image module 21 include: a frequency-select signal transmitted from the host 10 to the interactive device 20, which can be determined automatically by the software accessed from the storage device 18 of the host 10 (for example a faster sampling frequency for a dynamic game and a slower sampling frequency for a static game) or set by the user; the motion vector of the images of the reference points 12, 14 between digital images obtained by the image module 21 of the interactive device 20; and the acceleration of the interactive device 20 obtained by the processing unit 22. The processing unit 22 can control the modulation module 23 to modulate the sampling frequency of the image module 21 in real time according to one or a combination of the above predetermined conditions (Fig. 3). Thus, when the software executed by the host 10 does not require the image module 21 to capture images at high speed, the overall energy consumption of the interactive device 20 can be reduced by lowering the sampling frequency, as in the sketch below.
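The following sketch illustrates one way such predetermined conditions could be combined; the function name, the threshold values and the priority given to the host's frequency-select signal are assumptions for illustration only.

```python
def select_sampling_rate(host_request_fps, motion_vector, acceleration,
                         rates=(50, 100, 200),
                         motion_threshold=2.0, accel_threshold=1.5):
    """Pick a sampling rate (frames per second) from the predetermined conditions."""
    if host_request_fps is not None:          # frequency-select signal from the host/software
        return min(rates, key=lambda r: abs(r - host_request_fps))
    speed = (motion_vector[0] ** 2 + motion_vector[1] ** 2) ** 0.5
    if speed < motion_threshold and abs(acceleration) < accel_threshold:
        return rates[0]                        # slow scene: lower rate saves power
    if speed < 2 * motion_threshold:
        return rates[1]
    return rates[-1]                           # fast motion: full rate
```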
Referring to Figs. 1 and 4, Fig. 4 shows a flow chart of the operating method of the interactive device of an embodiment of the present invention. The operating method comprises the following steps: providing an image module that captures images of the reference points at a sampling frequency to form a first image and a second image, and that calculates and outputs the motion vector of the reference-point image between the first image and the second image (step B1); providing a wireless module that transmits data to and from the host (step B2); providing a motion sensing module that senses the state of the interactive device and generates an electric signal (step B3); and providing a processing unit that calculates the acceleration of the interactive device from the electric signal, controls the wireless module to transmit the motion vector and the acceleration to the host, and modulates the sampling frequency of the image module in real time according to a predetermined condition (step B4). First, the image sensing module 212 of the image module 21 receives the spectral signal of the reference points 12, 14 through the optical filtering module 210 to form a plurality of digital images, and the image processing module 214 calculates and outputs, from these digital images, the motion vector of the images of the reference points 12, 14 between the digital images; the calculation is explained in the following paragraphs (step B1). At the same time, the motion sensing module 24 senses the state in which the user manipulates the interactive device 20 and generates a potential-difference signal or a current signal (step B3). The processing unit 22 then calculates the acceleration of the interactive device 20 from the electric signal, controls the second wireless module to transmit the acceleration and the motion vector to the host 10 for the corresponding control, and controls the modulation module 23 to modulate the sampling frequency of the image module 21 according to one or a combination of the above predetermined conditions (step B4).
To further illustrate the present invention, an embodiment of how the interactive device 20 calculates the motion vector of the cursor 802 on the image display 80 is described below; it is not intended to limit the present invention.
Referring to Figs. 5 to 8, in this embodiment the interactive device 20 controls the movement of the cursor 802 on the screen 801 of the image display 80; the cursor 802 may represent, for example, the point of impact of a light gun or a cursor used for software control. The reference points 12, 14 are represented by two reference points of the same shape but different areas; for example, reference point 12 is represented by a large asterisk and appears in the digital image DI as I12, while reference point 14 is represented by a small asterisk and appears as I14. The control method of the cursor 802 comprises the following steps: providing two reference points that generate a predetermined spectrum, and defining a preset range (step 150); providing an image module that points into the preset range (step 250); using the image module to receive the predetermined spectrum and form a digital image (step 300); determining the imaging positions and shapes of the reference points in the digital image, and generating a first parameter (step 400); performing distance and angle compensation on the first parameter (step 500); moving the pointing position of the image module within the preset range, and generating a second parameter (step 600); and calculating, from the compensated first parameter and the second parameter, the displacement of the reference-point imaging position between the digital images so as to control the movement of the cursor accordingly (step 700). In step 700, distance and angle compensation is also performed on the second parameter (step 710), and a scale parameter can optionally be imported to control the sensitivity with which the cursor 802 moves (step 720); step 720 may also be omitted.
Referring again to Figs. 1, 5 and 6a, the interactive device 20 is preferably preset, before leaving the factory, with a preset imaging-position parameter and a preset imaging-distance parameter stored in the image processing module 214. These parameters may be obtained from the preset reference-point images I12 and I14 captured by the image sensing module 212 when the interactive device 20 is at a predetermined distance (for example 3 meters) from the reference points 12, 14, as shown in Fig. 6a, and serve as the reference for the distance and angle compensation. The preset imaging-position parameter and the preset imaging-distance parameter are defined on the plane coordinate system formed by the sensing array of the image sensing module 212, for example the plane coordinates whose origin is the center "+" of the sensing array. For example, the preset imaging-position parameter may comprise, in this plane coordinate system, the coordinates of the preset images I12 and I14 formed by the reference points 12 and 14, their average coordinate (X0, Y0) and the tilt angle of the line connecting the two preset images I12 and I14; the preset imaging-distance parameter may comprise the distance L between the preset images I12 and I14 and the distance D between their average coordinate (X0, Y0) and the center "+" of the sensing array.
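A small sketch of how such preset parameters could be derived from the two reference-point images taken at the preset distance is given below; the origin is taken at the sensing-array center, and the function and field names are assumptions.

```python
import math

def preset_parameters(p12, p14):
    """p12, p14: (x, y) coordinates of reference-point images I12 and I14 at the preset distance."""
    avg = ((p12[0] + p14[0]) / 2.0, (p12[1] + p14[1]) / 2.0)   # average coordinate (X0, Y0)
    L = math.dist(p12, p14)                                     # spacing of the two preset images
    D = math.hypot(avg[0], avg[1])                              # distance to the array center "+"
    tilt = math.atan2(p14[1] - p12[1], p14[0] - p12[0])         # tilt of the connecting line
    return {"avg": avg, "L": L, "D": D, "tilt": tilt}
```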
First, the reference points 12, 14 are made to generate light of the predetermined spectrum, in this embodiment infrared light, so that the image sensing range "A" can be determined around the reference points 12, 14 according to the field of view of the image sensing module 212 and the illumination angle of the reference points 12, 14 (step 150). The image sensing module 212 of the interactive device 20 is then pointed at any position within the image sensing range "A" (step 250); because the image sensing module 212 used in the present invention senses only the predetermined spectrum, the digital image DI formed contains only the images of the reference points 12, 14 (step 300), for example the first image containing I12' and I14' shown in Fig. 6a. Because the interactive device 20 has been rotated clockwise by an angle θ when it captures this digital image, the reference-point images I12' and I14' deviate by the rotation angle θ from the reference-point images I12 and I14 captured by the image sensing module 212 at the aforementioned predetermined distance. Consequently, the average coordinate (X, Y) of the reference-point images I12' and I14' differs from the average coordinate (X0, Y0) of the preset reference-point images I12 and I14, even though the image sensing module 212 is pointing at the same position.
As shown in Figs. 1, 5, 6a and 6b, the image processing module 214 determines the positions and shapes of the reference-point images I12' and I14' and generates a first parameter comprising a first imaging-position parameter, a first imaging-distance parameter and an imaging-shape parameter (step 400). The image processing module 214 then performs angle compensation (step 500) for the angular deviation between the first imaging-position parameter (for example the average coordinate of the reference-point images I12' and I14' and the tilt angle of their connecting line) and the preset imaging-position parameter (for example the preset reference-point images I12 and I14 and the tilt angle of their connecting line). The compensation is expressed by formula (1):
$$\begin{bmatrix} X' \\ Y' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} X \\ Y \end{bmatrix} \qquad (1)$$
where θ represents the rotation-angle deviation between the first imaging-position parameter and the preset imaging-position parameter; X, Y represent the average coordinate of the first imaging-position parameter before angle compensation; and X', Y' (not shown) represent the average coordinate of the compensated reference-point imaging-position parameter. After compensation, the images of the reference points 12, 14 are therefore obtained under the same reference, so that when the user captures images at the same distance from the image display 80, the image sensing module 212 yields the same result regardless of the rotation angle at which it is operated, as in the sketch below.
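A direct sketch of formula (1), rotating the averaged coordinate back by the detected angle θ; the function name is an illustrative assumption.

```python
import math

def angle_compensate(x, y, theta):
    """Return (X', Y') = R(theta) * (X, Y) as in formula (1)."""
    x_c = math.cos(theta) * x - math.sin(theta) * y
    y_c = math.sin(theta) * x + math.cos(theta) * y
    return x_c, y_c
```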
When the deviation angle θ exceeds 180 degrees and forms the reference-point images I12'' and I14'', as shown in Fig. 6b, it would be impossible to tell whether the reference-point images I12'' and I14'' were formed by rotating or by translating the reference-point images I12' and I14' (Fig. 6a) if there were no difference between the reference-point images I12 and I14 (i.e. if they had the same size and shape). Therefore, in this embodiment two reference points 12, 14 of different areas are used, and the imaging-shape parameter obtained by the image processing module 214 (for example the area of each reference-point image) is used first to identify which image belongs to which reference point before the angle compensation is performed. In this way the angle compensation remains correct even when the rotation angle of the image sensing module 212 exceeds 180 degrees; a sketch follows.
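The sketch below shows one way the two reference points could be told apart by imaged area, which is what resolves rotations beyond 180 degrees. Blob extraction is assumed to happen elsewhere, and the representation of a blob as (centroid, pixel_area) is an assumption.

```python
def label_reference_points(blobs):
    """blobs: list of ((x, y), area); assumes both reference points are visible.
    Returns (large_point, small_point), i.e. the images of reference points 12 and 14."""
    ordered = sorted(blobs, key=lambda b: b[1], reverse=True)
    return ordered[0][0], ordered[1][0]   # larger area -> reference point 12
```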
Referring to Fig. 7, which shows the distance compensation of this embodiment: when the image sensing module 212 of the interactive device 20 captures images at the predetermined distance, the preset reference-point images obtained for the reference points 12 and 14 are I12 and I14. As the distance between the interactive device 20 and the reference points 12, 14 increases, the captured images become smaller and their average coordinate moves closer to the center "+" of the sensing array, as shown by I12''' and I14''' in the figure. This kind of shift does not mean that the user has changed the pointing position of the interactive device 20; if it is not corrected, a change of shooting distance may be misjudged as a horizontal movement. In the present invention, assume that in the preset imaging-distance parameter the distance between the images I12 and I14 is L and the distance between their average coordinate (X0, Y0) and the center "+" of the sensing array is D, and that in the first imaging-distance parameter the distance between the images I12''' and I14''' is l and the distance between their average coordinate and the center "+" of the sensing array is d. The proportional relationship of formula (2) can then be used to compensate the deviation caused by the difference in shooting distance (step 500):
$$\frac{D}{L} = \frac{d}{l} \qquad (2)$$
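A minimal sketch of this distance compensation: rescaling the current coordinates by L/l so that the proportionality of formula (2) holds, i.e. the image is referred back to the preset shooting distance. The function name and the degenerate-case handling are assumptions.

```python
def distance_compensate(x, y, L_preset, l_current):
    """Scale coordinates (relative to the sensing-array center) back to the preset distance."""
    if l_current == 0:
        return x, y          # degenerate case: reference-point images overlap
    s = L_preset / l_current
    return x * s, y * s
```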
Referring to Fig. 8, the compensated imaging positions of the reference points form i12 and i14, which have been compensated to images taken under the same preset distance and angle reference, and their average coordinate is (Xi, Yi). The pointing position of the interactive device 20 is then moved within the image sensing range "A" to obtain a second image (step 600), which contains the reference-point images i12' and i14'. The image sensing module 212 sends the second image it has sensed to the image processing module 214, which generates a second parameter from the second image; the second parameter comprises a second imaging-position parameter and a second imaging-distance parameter of the reference points 12, 14 in the second image after the pointing position of the image sensing module 212 has been moved. The second imaging-position parameter is, for example, the average coordinate of the images of the reference points 12, 14 formed on the plane coordinate system of the sensing array of the image sensing module 212, with the center of the sensing array as the origin; the second imaging-distance parameter is the distance between the reference-point images formed on that plane coordinate system. The image processing module 214 then calculates, from the compensated first imaging-position parameter and the second imaging-position parameter, the motion vector ΔS by which the reference-point images i12 and i14 move to i12' and i14'; in this calculation the second parameter must also be compensated for angle and distance deviation in the manner described above (step 710) so that correct cursor control is obtained. Since the compensation of the second parameter is identical to that of the first parameter, it is not repeated here. The processing unit 22 then controls the second wireless module 25 to transmit the motion vector ΔS wirelessly to the image display unit 80. The image display unit 80 preferably runs application software that controls the user interface and the cursor 802; after receiving the motion-vector signal from the second wireless module 25, the software moves the cursor 802 on the screen 801 accordingly (step 700). In addition, when the motion vector ΔS of the reference-point images i12 and i14 is calculated, a set of scale parameters X_scale, Y_scale can optionally be imported to adjust the movement sensitivity of the cursor 802; the motion vector ΔS can then be expressed by formula (3) (step 720):
$$\Delta S = \left(\frac{\Delta S_X}{X_{scale}},\ \frac{\Delta S_Y}{Y_{scale}}\right) \qquad (3)$$
where ΔS_X represents the motion-vector component in the horizontal direction and ΔS_Y the motion-vector component in the vertical direction. It follows that when X_scale and Y_scale in formula (3) increase, the movement sensitivity of the cursor 802 decreases, that is, the pointing displacement of the interactive device 20 must increase to move the cursor 802 by the same distance; conversely, when X_scale and Y_scale in formula (3) decrease, the movement sensitivity of the cursor 802 increases, that is, a relatively small pointing displacement of the interactive device 20 moves the cursor 802 by the same distance. This improves the practicality of the interactive device 20 of the present invention, as in the sketch below.
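The sensitivity scaling of formula (3) reduces to a per-axis division; the sketch below is a direct rendering with assumed default values of 1.0 for the scale parameters.

```python
def scaled_motion_vector(dSx, dSy, x_scale=1.0, y_scale=1.0):
    """Formula (3): larger scale parameters mean lower cursor sensitivity."""
    return dSx / x_scale, dSy / y_scale
```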
Referring to Figs. 9 to 11, the following describes an embodiment in which the processing unit 22 controls the modulation module 23 to modulate the sampling frequency of the image module 21 according to the motion vector of the reference points between a plurality of images obtained by the image processing module 214 of the interactive device 20. Fig. 9 shows the proportional relationship between the imaging distance f of the image sensing module 212 and the distance F between the image sensing module 212 and the screen 801. In one embodiment the imaging distance is f = 32 millimeters and the distance between the image sensing module 212 and the screen 801 is F = 1.6 meters, that is, F/f = 50. With Fig. 10 it can be seen that the ratio between the distance the cursor 802 moves on the screen 801 of the image display 80 and the displacement of the reference-point image I12 on the sensing array SA of the image sensing module 212 is 50; that is, when the image processing module 214 obtains a motion vector of 1 micrometer, the cursor 802 has moved 50 micrometers on the screen 801 of the image display 80.
An embodiment of setting the boundary values for the sampling frequency of the image module 21 is now described. When each pixel of the sensing array SA of the image sensing module 212 measures 15 micrometers × 15 micrometers, the sampling frequency is 200 frames per second and the motion vector of reference point 12 between every two frames is 2 pixels, the per-second position change of the cursor 802 on the screen 801 of the image display 80 is (15 × 10⁻³ / 25.4) × 2 × 50 × 200 = 11.81 inches per second; likewise, the per-second position changes at sampling frequencies of 100 frames per second and 50 frames per second are 5.91 inches per second and 2.95 inches per second respectively. Fig. 11 shows the relationship between the sampling frequency of the image module 21 and the position change of the cursor 802 on the screen 801 of the image display 80. In this embodiment, 2.95 inches per second and 5.91 inches per second are set as the boundary values for changing the sampling frequency of the image module 21: when the position change of the cursor 802 on the screen 801 is below 2.95 inches per second, the modulation module 23 modulates the image module 21 to capture images at a sampling frequency of 50 frames per second; when the position change is above 2.95 and below 5.91 inches per second, the image module 21 captures images at a sampling frequency of 100 frames per second; and when the position change is above 5.91 inches per second, the image module 21 captures images at a sampling frequency of 200 frames per second. A numeric sketch follows.
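The sketch below reproduces the arithmetic of this example (a 15 μm pixel, a 50× imaging ratio F/f, a 2-pixel shift per frame) and the boundary-value comparison; the constant and function names are assumptions, and the specific numbers are only the example values quoted above, not normative.

```python
PIXEL_SIZE_MM = 15e-3          # 15 micrometres per pixel
IMAGING_RATIO = 50             # F / f = 1.6 m / 32 mm
MM_PER_INCH = 25.4

def cursor_speed_inch_per_s(pixels_per_frame, frames_per_second):
    """Cursor speed on the screen, in inches per second."""
    return (PIXEL_SIZE_MM / MM_PER_INCH) * pixels_per_frame * IMAGING_RATIO * frames_per_second

def pick_sampling_rate(cursor_speed):
    """Boundary values 2.95 and 5.91 inch/s from the example above."""
    if cursor_speed < 2.95:
        return 50
    if cursor_speed < 5.91:
        return 100
    return 200

print(cursor_speed_inch_per_s(2, 200))   # ~11.81 inch/s, as in the text
```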
It should be understood that the ratio values, boundary values, number of boundary values and sampling-rate values shown in Figs. 9 to 11 are only one embodiment and are not intended to limit the present invention; the ratio values, boundary values, number of boundaries and sampling frequencies can all be set according to the actual product. In addition, the way the processing unit 22 controls the modulation module 23 to modulate the sampling frequency of the image module 21 according to the acceleration obtained from the motion sensing module 24 of the interactive device 20 is to compare the electric signal output by the motion sensing module 24 with one or more boundary values; the processing unit 22 then controls the modulation module 23 to modulate the sampling frequency of the image module 21 according to the comparison result. Thus, when the position change of the cursor 802 and/or the acceleration of the interactive device 20 is low, the modulation module 23 modulates the image module 21 to capture images at a lower sampling frequency, so that the overall power consumption of the interactive device 20 is effectively reduced.
In addition, if there are other light sources near the interactive image system whose spectrum partly or wholly overlaps the spectrum emitted by the reference points 12, 14, for example halogen lamps or sunlight, the light emitted by the reference points 12, 14 can be modulated with a periodic or aperiodic signal. In one embodiment, a modulating unit (not shown) can be located in the device that supplies electric energy to the reference points 12, 14, for example in the host 10 or the image display 80. The sampling frequency of the image module 21 is preferably an integer multiple of the modulating frequency of the modulating unit, so that the image sampling is synchronized with the lighting of the reference points 12, 14. For example, in one embodiment the sampling frequency of the image module 21 is 200 hertz (one sample every 5 milliseconds) and the modulating frequency of the reference points 12, 14 is 20 hertz (lit once every 50 milliseconds), so the reference points 12, 14 are lit on every tenth sample of the image module 21, allowing the image module 21 to capture the predetermined spectral signal generated by the reference points 12, 14. An identification device is additionally provided in the interactive device 20 to recognize the modulated light signal, so that the interference of other light sources with image recognition can be excluded; a sketch of this timing assumption follows.
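A small sketch of the multiple relationship between the two frequencies, under the assumption (as in the example) that the sampling frequency is an integer multiple of the modulating frequency and that the reference points are lit on every multiple-th sample; function names are illustrative.

```python
def is_synchronous(sampling_hz, modulation_hz):
    """True when the sampling frequency is an integer multiple of the modulating frequency."""
    return sampling_hz % modulation_hz == 0

def lit_sample_indices(sampling_hz, modulation_hz, n_samples):
    """Indices of the samples that coincide with the reference points being lit."""
    ratio = sampling_hz // modulation_hz            # e.g. 200 // 20 = 10
    return [i for i in range(1, n_samples + 1) if i % ratio == 0]
```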
In summary, because the sampling frequency of the image sensor in existing remote devices is fixed, the overall power consumption is high and the battery module needs to be replaced frequently. The interactive image system, interactive device and operating method according to the present invention modulate the sampling frequency of the image module in real time, so the overall energy consumption of the interactive device can be effectively reduced and its practicality increased.
Although the present invention has been described with the foregoing preferred embodiments, they are not intended to limit the present invention. Any person of ordinary skill in the art can make various changes and modifications to the described content without departing from the spirit and scope of the present invention. The scope of protection of the present invention is therefore defined by the appended claims.

Claims (38)

1. An interactive image system, characterized in that the system comprises:
a host comprising a first wireless module and a storage device, the storage device being used to access software;
at least one reference point that generates light of a predetermined spectrum; and
an interactive device comprising:
a second wireless module for transmitting data to and from the first wireless module;
an image module that captures images of the reference point at a sampling frequency to form a first image and a second image, and calculates and outputs the motion vector of the reference-point image between the first image and the second image;
a modulation module for modulating the sampling frequency of the image module; and
a processing unit that controls the second wireless module to transmit the motion vector to the first wireless module, and controls the modulation module according to a predetermined condition to modulate the sampling frequency of the image module in real time.
2. The interactive image system according to claim 1, characterized in that the predetermined condition is a frequency-select signal transmitted by the host to the interactive device.
3. The interactive image system according to claim 2, characterized in that the frequency-select signal is determined by the software accessed from the storage device of the host.
4. The interactive image system according to claim 1, characterized in that the predetermined condition is the motion vector output by the image module of the interactive device.
5. The interactive image system according to claim 1, characterized in that the interactive device further comprises a motion sensing module that senses the state of the interactive device and generates an electric signal, wherein the processing unit calculates the acceleration of the interactive device from the electric signal and controls the second wireless module to transmit the acceleration to the first wireless module.
6. The interactive image system according to claim 5, characterized in that the predetermined condition is the acceleration obtained by the processing unit of the interactive device.
7. The interactive image system according to claim 5, characterized in that the motion sensing module is selected from one of an acceleration sensor and a gyro sensor.
8. The interactive image system according to claim 5, characterized in that the electric signal is selected from one of a potential-difference signal and a current signal.
9. The interactive image system according to claim 1, characterized in that the reference point is an infrared light-emitting diode and the predetermined spectrum is an infrared spectrum.
10. The interactive image system according to claim 9, characterized in that the image module further comprises:
an infrared filter for filtering out light outside the predetermined spectrum;
an image sensing module that captures images of the reference point at the sampling frequency to form the first image and the second image; and
an image processing module that calculates and outputs the motion vector of the reference-point image between the first image and the second image.
11. The interactive image system according to claim 10, characterized in that the image sensing module is selected from one of a CCD image sensor and a CMOS image sensor.
12. The interactive image system according to claim 1, characterized in that the sampling frequency of the image module ranges from 60 frames per second to 200 frames per second.
13. The interactive image system according to claim 1, characterized in that the reference point is electrically connected to the host.
14. The interactive image system according to claim 1, characterized in that the system further comprises an image display for displaying the software executed by the host.
15. The interactive image system according to claim 14, characterized in that the reference point is electrically connected to the image display.
16. The interactive image system according to claim 1, characterized in that the modulation module further comprises a multiplexer for modulating the sampling frequency of the image module.
17. The interactive image system according to claim 1, characterized in that the host is selected from one of a game console and a computer system host.
18. The interactive image system according to claim 1, characterized in that the interactive device is selected from one of a remote controller and a pointer positioning device.
19. The interactive image system according to claim 1, characterized in that the reference point is selected from one of a light-emitting diode and a laser diode.
20. An interactive device, characterized in that the device is used in an interactive image system comprising the interactive device, a host and at least one reference point for generating light of a predetermined spectrum, the interactive device being connected by wireless communication to a wireless module of the host, the interactive device comprising:
a wireless module for transmitting data to and from the wireless module of the host;
an image module that captures images of the reference point at a sampling frequency to form a first image and a second image, and calculates and outputs the motion vector of the reference-point image between the first image and the second image;
a modulation module for modulating the sampling frequency of the image module; and
a processing unit that controls the wireless module to transmit the motion vector to the wireless module of the host, and controls the modulation module according to a predetermined condition to modulate the sampling frequency of the image module in real time.
21. The interactive device according to claim 20, characterized in that the predetermined condition is a frequency-select signal transmitted by the host to the interactive device.
22. The interactive device according to claim 20, characterized in that the predetermined condition is the motion vector output by the image module of the interactive device.
23. The interactive device according to claim 20, characterized in that the interactive device further comprises a motion sensing module for sensing the state of the interactive device and generating an electric signal, wherein the processing unit calculates the acceleration of the interactive device from the electric signal and controls the wireless module to transmit the acceleration to the wireless module of the host.
24. The interactive device according to claim 23, characterized in that the predetermined condition is the acceleration obtained by the processing unit of the interactive device.
25. The interactive device according to claim 23, characterized in that the motion sensing module is selected from one of an acceleration sensor and a gyro sensor.
26. The interactive device according to claim 23, characterized in that the electric signal is selected from one of a potential-difference signal and a current signal.
27. The interactive device according to claim 20, characterized in that the predetermined spectrum is an infrared spectrum and the image module further comprises:
an infrared filter for filtering out light outside the predetermined spectrum;
an image sensing module that captures images of the reference point at the sampling frequency to form the first image and the second image; and
an image processing module that calculates and outputs the motion vector of the reference-point image between the first image and the second image.
28. The interactive device according to claim 27, characterized in that the image sensing module is selected from one of a CCD image sensor and a CMOS image sensor.
29. The interactive device according to claim 20, characterized in that the sampling frequency of the image module ranges from 60 frames per second to 200 frames per second.
30. The interactive device according to claim 20, characterized in that the modulation module further comprises a multiplexer for modulating the sampling frequency of the image module.
31. The interactive device according to claim 20, characterized in that the device is selected from one of a remote controller and a pointer positioning device.
32. An operating method of an interactive device, characterized in that the interactive device is used in an interactive image system comprising the interactive device, a host and at least one reference point for generating light of a predetermined spectrum, the interactive device being connected by wireless communication to a wireless module of the host, the operating method comprising the following steps:
providing an image module that captures images of the reference point at a sampling frequency to form a first image and a second image, and calculates and outputs the motion vector of the reference-point image between the first image and the second image;
providing a wireless module that transmits data to and from the wireless module of the host; and
providing a processing unit that controls the wireless module to transmit the motion vector to the wireless module of the host, and modulates the sampling frequency of the image module in real time according to a predetermined condition.
33. The operating method according to claim 32, characterized in that the method further comprises the step of receiving, through the wireless module, a frequency-select signal sent by the host, wherein the predetermined condition is the frequency-select signal.
34. The operating method according to claim 32, characterized in that the predetermined condition is the motion vector output by the image module of the interactive device.
35. The operating method according to claim 32, characterized in that the method further comprises the following steps:
providing a motion sensing module for sensing the state of the interactive device and generating an electric signal;
calculating the acceleration of the interactive device from the electric signal; and
transmitting the acceleration to the wireless module of the host through the wireless module.
36. The operating method according to claim 35, characterized in that the predetermined condition is the acceleration of the interactive device obtained from the electric signal.
37. The operating method according to claim 32, characterized in that the sampling frequency of the image module ranges from 60 frames per second to 200 frames per second.
38. An interactive image system, characterized in that the system comprises:
a host comprising a first wireless module and a storage device, the storage device being used to access software;
at least one reference point that generates light of a predetermined spectrum; and
an interactive device comprising:
a second wireless module for transmitting data to and from the first wireless module;
an image module that captures and outputs images of the reference point at a sampling frequency to form a first image and a second image;
a modulation module for modulating the sampling frequency of the image module; and
a processing unit that calculates the motion vector of the reference-point image between the first image and the second image, controls the second wireless module to transmit the motion vector to the first wireless module, and controls the modulation module according to a predetermined condition to modulate the sampling frequency of the image module in real time.
CN2007101538261A 2007-09-12 2007-09-12 Interaction image system, interaction apparatus and operation method thereof Expired - Fee Related CN101388138B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2007101538261A CN101388138B (en) 2007-09-12 2007-09-12 Interaction image system, interaction apparatus and operation method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2007101538261A CN101388138B (en) 2007-09-12 2007-09-12 Interaction image system, interaction apparatus and operation method thereof

Publications (2)

Publication Number Publication Date
CN101388138A CN101388138A (en) 2009-03-18
CN101388138B true CN101388138B (en) 2011-06-29

Family

ID=40477536

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2007101538261A Expired - Fee Related CN101388138B (en) 2007-09-12 2007-09-12 Interaction image system, interaction apparatus and operation method thereof

Country Status (1)

Country Link
CN (1) CN101388138B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102053728B (en) * 2009-11-06 2012-10-31 原相科技股份有限公司 Control device for image display and control method thereof
TWI442269B (en) * 2011-08-09 2014-06-21 J Mex Inc Control device and method using control device for controlling screen
CN103135755B (en) * 2011-12-02 2016-04-06 深圳泰山在线科技有限公司 Interactive system and method
TWI540470B (en) 2012-07-13 2016-07-01 原相科技股份有限公司 Interactive image system and remote controller adapted thereto
CN103576894B (en) * 2012-07-25 2017-03-01 原相科技股份有限公司 Interactive image system and the remote control being suitable for this interactive image system
US10067576B2 (en) 2013-02-19 2018-09-04 Pixart Imaging Inc. Handheld pointer device and tilt angle adjustment method thereof
US9804689B2 (en) 2013-02-19 2017-10-31 Pixart Imaging Inc. Handheld pointer device and pointer positioning method thereof
CN104216532B (en) * 2013-05-31 2018-04-10 原相科技股份有限公司 Hand-held indicator device and its angle of inclination bearing calibration
CN104238555B (en) * 2013-06-18 2017-09-22 原相科技股份有限公司 The remote control system of directional type robot

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1533180A (en) * 2003-03-19 2004-09-29 松下电器产业株式会社 Remote control device and method based on video frequency picture detection
CN1577220A (en) * 2003-07-28 2005-02-09 株式会社东芝 Electronic apparatus, screen control method and screen control program
CN1853158A (en) * 2003-07-24 2006-10-25 阿登蒂斯公司 Method and system for gestural control of an apparatus

Also Published As

Publication number Publication date
CN101388138A (en) 2009-03-18

Similar Documents

Publication Publication Date Title
CN101388138B (en) Interaction image system, interaction apparatus and operation method thereof
US8553094B2 (en) Interactive image system, interactive apparatus and operating method thereof
CN102547355B (en) Image sensor and methods of operating same
CN101472095B (en) Cursor control method and device using the method
CN102508565B (en) Remote control cursor positioning method and device, remote control and cursor positioning system
CN106911888A (en) A kind of device
CN107438804A (en) A kind of Wearable and UAS for being used to control unmanned plane
CN108885487B (en) Gesture control method of wearable system and wearable system
US11665334B2 (en) Rolling shutter camera pipeline exposure timestamp error determination
US20190114803A1 (en) Locating method, locator, and locating system for head-mounted display
CN108592951A (en) A kind of coalcutter inertial navigation Initial Alignment Systems and method based on optical flow method
US11030793B2 (en) Stylized image painting
CN114514744B (en) Multidimensional rendering
CN101900552B (en) Longitude-latitude camera videogrammetric method and system
CN109254587A (en) Can under the conditions of wireless charging steadily hovering small drone and its control method
US20240125596A1 (en) Method for Determining Foldable Screen Included Angle and Associated Devices of Method
CN106155359A (en) Tool height follows the trail of the optical navigator of speed
CN108151738A (en) Codified active light marked ball with attitude algorithm
US20230412779A1 (en) Artistic effects for images and videos
JP2021515455A (en) Camera module and its super-resolution video processing method
CN107560637B (en) Method for verifying calibration result of head-mounted display device and head-mounted display device
CN101452349A (en) Cursor controller on image display apparatus, method and image system
CN106817530B (en) Anti-shake camera
CN110428381A (en) Image processing method, image processing apparatus, mobile terminal and storage medium
CN105783734B (en) Offset, torsion measurement method and the device of gas chamber piston and T baffle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110629