CN107754310B - Handheld device and positioning method thereof - Google Patents


Info

Publication number: CN107754310B
Application number: CN201711124233.2A
Authority: CN (China)
Prior art keywords: tilt angle, pointing, coordinate, reference point, cursor
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN107754310A
Inventors: 程瀚平, 黄昭荐
Assignee (current and original): Pixart Imaging Inc
Application filed by Pixart Imaging Inc; application granted; published as CN107754310A and CN107754310B

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/426 — Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/03542 — Light pens for emitting or receiving light
    • A63F 2300/1018 — Calibration; Key and button assignment
    • A63F 2300/8011 — Games specially adapted for executing a specific type of game: Ball
    • A63F 2300/8076 — Games specially adapted for executing a specific type of game: Shooting

Abstract

The invention provides a handheld device and a positioning method thereof. The positioning method comprises the following steps: obtaining the position of a reference point in a first image frame; calculating a first pointing coordinate according to that position and a first tilt angle representing a tilt angle of the handheld device; calculating a second pointing coordinate according to that position and a second tilt angle updated based on the movement of the reference point across image frames; calculating a first offset vector between the first pointing coordinate and the second pointing coordinate, and generating a unit compensation vector; obtaining the position of the reference point in a second image frame; calculating a third pointing coordinate according to that position and the second tilt angle; and outputting a pointing coordinate according to the third pointing coordinate and the unit compensation vector.

Description

Handheld device and positioning method thereof
This application is a divisional application of the invention patent application entitled "Handheld pointing device and cursor positioning method thereof", filed by PixArt Imaging Inc. on December 18, 2013, with application number 201310699986.1.
Technical Field
The present invention relates to a method for positioning a device, and more particularly, to a method for positioning a handheld device.
Background
Handheld pointing devices are widely used in interactive remote-control games, such as light-gun games, baseball games, and tennis games, because a handheld pointing device can calculate its pointing coordinates by analyzing the position of a reference point in the sensed image area and transmit those coordinates to a game console to perform the related game operations.
Both the distance between the image sensor of the handheld pointing device and the display screen and the rotation angle of the device at the moment of image capture affect the calculation of the pointing coordinates. Therefore, to improve the operating feel, the handheld pointing device is usually also equipped with a tilt-angle detector, so that the rotation angle of the device can be detected at appropriate times and the tilt angle used to calculate the pointing coordinates updated accordingly. The relative movement between the handheld pointing device and the reference point can then be calculated and judged accurately, avoiding misjudgment.
However, when the handheld pointing device updates the tilt angle it currently uses, it immediately moves the cursor according to the newly detected tilt angle and the calculated imaging position of the reference point in the image sensing region. The cursor therefore jumps suddenly on the screen of the display device, which degrades the user's operating feel and is inconvenient.
Disclosure of Invention
In view of the above, the present invention provides a positioning method for a handheld device and the handheld device itself, in which the pointing coordinates generated by the device are corrected according to the displacement between the coordinates computed before and after a tilt-angle update, so that the cursor does not jump.
An embodiment of the invention provides a positioning method for a handheld device. First, a handheld device, such as a handheld pointing device, captures a first image frame of a reference point when it updates the currently used first tilt angle to a second tilt angle. Next, a first pointing coordinate is calculated according to the imaging position of the reference point in the first image frame and the first tilt angle. Then, a second pointing coordinate is calculated according to the imaging position of the reference point in the first image frame and the second tilt angle. A second image frame of the reference point is then captured, and a third pointing coordinate is calculated according to the imaging position of the reference point in the second image frame and the second tilt angle. Finally, the cursor position is calculated according to the third, first, and second pointing coordinates, so as to generate cursor parameters for positioning the cursor on the display device.
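As a minimal sketch of the steps above: assuming (purely as an illustration; the patent does not fix the exact mapping) that a pointing coordinate is obtained by rotating the reference point's imaging position about the sensor center by the device's tilt angle, the first and second pointing coordinates come from the same frame evaluated under the old and new tilt angles, and their difference is the offset vector:

```python
import math

def pointing_coordinate(image_pos, tilt_deg, center=(0.0, 0.0)):
    # Rotate the imaging position about the sensor center by the tilt
    # angle; a hypothetical model of the patent's coordinate mapping.
    x, y = image_pos[0] - center[0], image_pos[1] - center[1]
    t = math.radians(tilt_deg)
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t))

# Same first image frame, evaluated under both tilt angles.
p1 = pointing_coordinate((10.0, 0.0), 0.0)   # first tilt angle
p2 = pointing_coordinate((10.0, 0.0), 90.0)  # second (updated) tilt angle
offset = (p1[0] - p2[0], p1[1] - p2[1])      # first offset vector
```

Dividing `offset` by a preset number of correction steps would yield the unit compensation vector mentioned in the abstract.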
Another embodiment of the present invention provides a cursor positioning method for a handheld pointing device, including the following steps. First, when the handheld pointing device updates the currently used first tilt angle to a second tilt angle, a first image frame of a reference point is captured. Next, the angular difference between the first and second tilt angles is calculated. Then, when the calculated angular difference is larger than a preset angle, a first pointing coordinate is calculated according to the imaging position of the reference point in the first image frame and the first tilt angle, a second pointing coordinate is calculated according to the same imaging position and the second tilt angle, and a first offset vector between the first and second pointing coordinates is generated accordingly. Afterwards, as the handheld pointing device moves, the cursor position is calculated according to the first offset vector and the pointing coordinates produced by that movement, and cursor parameters for controlling the movement of the cursor are generated from the result.
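The angle-difference gate of this embodiment can be sketched as follows; the 5-degree preset angle is an illustrative assumption, not a value from the patent:

```python
def should_compensate(first_tilt_deg, second_tilt_deg, preset_angle_deg=5.0):
    # Run the correction procedure only when the tilt update exceeds a
    # preset angle; the default threshold here is hypothetical.
    diff = abs(second_tilt_deg - first_tilt_deg) % 360.0
    diff = min(diff, 360.0 - diff)  # shortest angular distance
    return diff > preset_angle_deg
```

Using the shortest angular distance keeps the gate stable across the 0/360-degree wrap-around.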
In another embodiment of the present invention, a cursor positioning method for a handheld pointing device is provided. First, the handheld pointing device updates the currently used first tilt angle to a second tilt angle at a first time. At the first time, the handheld pointing device also calculates a first and a second pointing coordinate for the imaging position of the reference point in a first image frame, using the first and second tilt angles respectively. Thereafter, at a second time after the first time, a third pointing coordinate corresponding to the imaging position of the reference point in a second image frame is calculated using the second tilt angle. The cursor position is then calculated according to the third, first, and second pointing coordinates, so as to generate cursor parameters for positioning the cursor on the display device.
The embodiment of the invention also provides a handheld pointing device, which comprises an image acquisition unit, an acceleration unit and a processing unit. The image capturing unit is used for sequentially capturing a plurality of image frames of the reference point. The acceleration unit is used for sensing a plurality of acceleration values of the handheld pointing device in a multi-axis direction and correspondingly generating an acceleration vector. The processing unit is coupled to the image capturing unit and the acceleration unit. The processing unit is used for calculating cursor positioning according to the imaging positions of the reference points on the image frames and the currently used first inclination angle.
When the handheld pointing device calculates and updates the currently used first inclination angle to be a second inclination angle according to the acceleration values, the processing unit drives the image capturing unit to capture the first image frame of the reference point, and calculates a first pointing coordinate and a second pointing coordinate of the corresponding reference point by using the first inclination angle and the second inclination angle respectively. And then, the processing unit drives the image capturing unit to capture a second image frame of the reference point, and correspondingly generates a cursor parameter for controlling the cursor to be positioned on the display device according to the imaging position of the reference point in the second image frame, the first pointing coordinate, the second pointing coordinate and the second inclination angle.
In addition, an embodiment of the present invention further provides a computer-readable medium recording a set of computer-executable programs, wherein when the computer-readable medium is read by a processor, the processor can execute the steps of the cursor positioning method.
In summary, the embodiments of the present invention provide a handheld pointing device and a cursor positioning method suitable for controlling the movement of a cursor on a display device. After the tilt angle is updated, the cursor positioning method actively corrects the pointing coordinates calculated with the updated tilt angle, so that the cursor moves gradually, within a preset correction time or number of corrections, from the position given by the pre-update tilt angle to the position corresponding to the actual pointing direction of the handheld pointing device. Cursor jumps are thus effectively avoided, improving the convenience and stability of operation for the user.
For a better understanding of the nature and technical content of the present invention, reference should be made to the following detailed description of the invention and the accompanying drawings, which are provided for purposes of illustration only and are not intended to limit the scope of the invention.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
fig. 1 is a schematic view illustrating a handheld pointing device applied to an interactive system according to an embodiment of the present invention.
Fig. 2 is a functional block diagram of a handheld pointing device according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating a cursor positioning method of a handheld pointing device according to an embodiment of the present invention.
Fig. 4A to 4B are schematic diagrams illustrating positions of reference points sensed when the handheld pointing device provided by the embodiment of the invention moves respectively.
Fig. 4C is a schematic diagram illustrating a position change of a reference point calculated by the handheld pointing device according to different tilt angles according to the embodiment of the invention.
Fig. 4D is a schematic diagram illustrating a relative relationship between a reference point position and a cursor position on a display device according to an embodiment of the present invention.
Fig. 5 is a schematic diagram illustrating a change in a position of a cursor on a display screen when a handheld pointing device provided by an embodiment of the present invention moves.
Fig. 6 is a flowchart illustrating a cursor positioning correction method of a handheld pointing device according to an embodiment of the present invention.
Fig. 7 is a schematic diagram illustrating a position change of a moving cursor on a display screen of a handheld pointing device according to an embodiment of the present invention.
Fig. 8 is a flowchart illustrating a method for positioning a cursor of a pointing device according to another embodiment of the present invention.
Fig. 9 is a flowchart illustrating a method for positioning a cursor of a pointing device according to another embodiment of the present invention.
Description of the reference numerals
10: hand-held pointing device
11: image acquisition unit
12: acceleration unit
13: processing unit
14: input unit
15: storage unit
16: communication unit
20: display device
21: reference point
23, 23a to 23d, 25a, 33a to 33N, 35a: Cursor
X, Y, Z: Axes
(offset-vector symbol): Offset vector
(x1, y1), (x2, y2), (x3, y3): Pointing coordinates
(pointing-coordinate symbol): Pointing coordinate
d, d1, d2, d3: Distances
F1, F2: Image frames
111, 111a, 111b: Operating range
1111, 1111a, 1111b: Center point of operating range
113, 113a, 113b: Reference point image
TA, TB, TC, TD: point in time
S301 to S317: flow of steps
S601 to S621: flow of steps
S801 to S817: flow of steps
S901 to S919: flow of steps
Detailed Description
Hereinafter, the present invention will be described in detail by illustrating various exemplary embodiments thereof with the aid of the drawings. The inventive concept may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Moreover, in the drawings, like reference numerals may be used to denote similar components.
[ Handheld pointing device embodiment ]
The handheld pointing device of the present invention can be applied to cursor positioning on a display device. Referring to fig. 1, fig. 1 is a schematic view illustrating a handheld pointing device applied to an interactive system according to an embodiment of the present invention. The interactive system comprises a handheld pointing device 10 and a display device 20. The display device 20 is further provided with a reference point 21 for the handheld pointing device 10 to control the movement of a cursor 23 on the screen of the display device 20.
In the present embodiment, the display device 20 has a software and hardware architecture capable of executing and displaying software programs. The display device 20 may be, for example, but not limited to, a projection display device, a gaming machine display screen, a television screen, and a computer display screen. However, in practice, the interactive system may further include a host (not shown) according to practical application requirements, such as a game host or a computer host. The host can be used for reading and executing software programs, such as game software, for example, light gun games, baseball games, tennis games, and the like. The host computer can also display the execution state of the software program on the display device 20 for the user to browse and control.
The reference point 21 is disposed near the display device 20, and is used for the handheld pointing device 10 to determine a pointing position of the handheld pointing device 10, and further determine a moving direction and a moving amount of the handheld pointing device 10 relative to the reference point 21.
The reference point 21 may be implemented by a plurality of light emitting diodes with specific wavelengths, such as infrared light emitting diodes (IRLEDs), laser diodes, or ultraviolet light emitting diodes, arranged in various shapes. In addition, the light emitting diodes may be electrically connected to the display device 20 to obtain the power required for emitting light, or may be self-powered by an independent power source. In addition, only one reference point is used in the embodiment, but a person having ordinary knowledge in the field of the present invention can set the number of the reference points 21 according to design requirements, for example, 1, 2 or more. That is, fig. 1 is only used to illustrate the operation of the handheld pointing device 10, and is not intended to limit the present invention.
When the handheld pointing device 10 points to the position of the reference point 21, the image capturing unit 11 is driven to sequentially capture a plurality of image frames corresponding to the reference point 21. The pointing coordinate of the handheld pointing device 10 pointing to the display device 20 is calculated according to the imaging position of the reference point 21 on one of the image frames and the currently calculated tilt angle of the handheld pointing device 10. Then, the handheld pointing device 10 calculates the cursor location of the cursor 23 on the screen of the display device 20 according to the pointing coordinates. The hand-held pointing device 10 transmits the cursor parameter for controlling the cursor generated according to the position change of the reference point 21 to the display device 20 in a wireless manner. Accordingly, the handheld pointing device 10 can control the position of the cursor 23 on the display device 20.
In the present embodiment, the handheld pointing device 10 can determine whether to update the currently used first tilt angle (i.e., the rotation angle of the handheld pointing device 10) to a second tilt angle according to the movement of the reference point 21 across the imaging positions of the image frames. In one embodiment, the handheld pointing device 10 determines whether it is currently stationary or moving according to whether the reference point 21 moves greatly across the imaging positions of the image frames, and decides accordingly whether to update the currently used tilt angle. In another embodiment, the handheld pointing device 10 determines whether it is stationary or moving according to whether the pointing coordinate, calculated from the imaging position of the reference point 21 using the first tilt angle, moves greatly, and again decides accordingly whether to update the currently used tilt angle.
Specifically, a "large movement" refers to the change of the reference point 21, or of the calculated pointing coordinate, within a moment (i.e., a short time, such as a few seconds, a few milliseconds, or two or more consecutive frames): the displacement (i.e., the change in position), moving speed, or acceleration of the reference point 21 across the imaging positions of consecutive image frames, or the displacement, moving speed, or acceleration of the pointing coordinate calculated from those frames.
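The displacement form of this "large movement" test can be sketched over two consecutive imaging positions; the 4-pixel threshold is a hypothetical value for illustration:

```python
def moved_greatly(prev_pos, curr_pos, threshold=4.0):
    # Displacement of the reference point's imaging position between two
    # consecutive frames; the 4-pixel threshold is an assumption.
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    return (dx * dx + dy * dy) ** 0.5 > threshold
```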
In one embodiment, the handheld pointing device 10 may use an inertial sensor to calculate the tilt angle of the handheld pointing device 10. However, when the user moves the handheld pointing device 10, the force applied to the handheld pointing device 10 by the user affects the result of the inertial sensor determining the gravity direction. Therefore, the tilt angle of the handheld pointing device 10 must be accurately calculated and updated after the influence of the force applied by the user is eliminated. That is, when the handheld pointing device 10 is not moved by the user (i.e. it is detected that the reference point 21 is not moved by a large amount), it is regarded that the handheld pointing device 10 is not affected by the external force, so that the current rotation angle of the handheld pointing device 10 can be accurately sensed and calculated, and the first tilt angle currently used by the handheld pointing device 10 is updated to the second tilt angle.
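The tilt-angle update described here might be sketched as follows: the roll angle is estimated from the gravity components sensed by the accelerometer, but only while the device is judged stationary, so that user-applied force does not corrupt the gravity direction. The axis convention and the use of `atan2` are assumptions; the patent does not specify the formula.

```python
import math

def update_tilt(acc_xyz, current_tilt_deg, stationary=True):
    # Keep the first tilt angle while the device is being moved; update
    # to a second tilt angle from the sensed gravity when stationary.
    if not stationary:
        return current_tilt_deg
    ax, ay, _ = acc_xyz
    return math.degrees(math.atan2(ax, ay))
```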
When the handheld pointing device 10 determines to update the currently used first tilt angle to the second tilt angle, the handheld pointing device 10 captures a first image frame corresponding to the reference point 21. The handheld pointing device 10 calculates a first pointing coordinate according to the first tilt angle currently used and the imaging position of the reference point 21 in the first image frame. The handheld pointing device 10 calculates the cursor positioning according to the first pointing coordinate to correspondingly generate the cursor parameter for controlling the cursor 23 to be located on the display device 20.
Meanwhile, the handheld pointing device 10 calculates a second pointing coordinate according to the newly updated second tilt angle and the imaging position of the reference point 21 in the first image frame. Then, when the handheld pointing device 10 calculates a third pointing coordinate according to the imaging position of the reference point 21 in a second image frame and the second tilt angle, it determines, from the offset between the first and second pointing coordinates (i.e., the offset of the cursor 23), whether to apply correction compensation to the third pointing coordinate in the subsequent cursor positioning.
When the handheld pointing device 10 determines that the offset difference between the first pointing coordinate and the second pointing coordinate is greater than or equal to a first preset offset threshold, the handheld pointing device 10 performs calibration compensation on the third pointing coordinate. The handheld pointing device 10 calculates the cursor positioning according to the third pointing coordinate, the first pointing coordinate and the second pointing coordinate, so as to generate a cursor parameter for controlling the cursor 23 to be located on the display device 20 correspondingly. When the handheld pointing device 10 determines that the offset difference between the first pointing coordinate and the second pointing coordinate is smaller than the first preset offset threshold, the handheld pointing device 10 directly calculates the cursor location according to the third pointing coordinate, so as to correspondingly generate the cursor parameter for controlling the cursor 23 to be located on the display device 20.
Therefore, the influence of the jumping point on the operation feeling of the user when the handheld pointing device 10 updates the tilt angle is avoided.
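The offset-threshold decision of the two preceding paragraphs can be sketched as below; shifting the new coordinate by the full offset keeps the cursor where it was at the moment of the update (the threshold value is an illustrative assumption):

```python
def position_cursor(p3, p1, p2, threshold=3.0):
    # p1/p2: coordinates from the same frame under the old/new tilt
    # angles; p3: coordinate from a later frame under the new tilt.
    off = (p1[0] - p2[0], p1[1] - p2[1])
    if (off[0] ** 2 + off[1] ** 2) ** 0.5 >= threshold:
        # Offset is large enough: compensate to avoid a visible jump.
        return (p3[0] + off[0], p3[1] + off[1])
    return p3  # small offset: use the new coordinate directly
```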
In addition, the handheld pointing device 10 can also determine whether to perform correction compensation on the pointing coordinate calculated by the updated tilt angle according to the magnitude of the angle difference between the currently used first tilt angle and the updated second tilt angle.
For example, when the angle difference between the currently used first tilt angle and the updated second tilt angle is larger than a predetermined angle, the handheld pointing device 10 can determine to perform the calibration compensation on the pointing coordinate (i.e., the third pointing coordinate) calculated by using the updated tilt angle in the subsequent cursor positioning.
Then, when the handheld pointing device 10 determines that the pointing coordinates calculated after updating the first tilt angle to the second tilt angle need correction compensation, it completes the cursor positioning correction within a preset correction time or number of corrections, so that the cursor 23 moves from the route corresponding to the first tilt angle onto the route corresponding to the second tilt angle. The cursor 23 on the display device 20 can thus be controlled accurately according to the position of the handheld pointing device 10 relative to the display device 20, while the jumping points produced by updating the tilt angle are prevented from degrading the user's operating feel.
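One way to realize this gradual correction, as a hedged sketch: divide the offset by a preset number of correction steps and decay the compensation frame by frame, so the cursor glides from the old route onto the new one. How the patent distributes the compensation over time is not specified; the linear decay below is an assumption.

```python
def gradual_compensation(coords, offset, steps):
    # coords: pointing coordinates computed with the updated tilt angle.
    # offset: displacement between old- and new-tilt coordinates.
    unit = (offset[0] / steps, offset[1] / steps)  # unit compensation vector
    remaining, out = steps, []
    for (x, y) in coords:
        out.append((x + unit[0] * remaining, y + unit[1] * remaining))
        remaining = max(remaining - 1, 0)  # decay to zero compensation
    return out
```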
It should be noted that the handheld pointing device 10 of the present embodiment may further determine whether to perform a calibration compensation (e.g., correspondingly setting a first preset offset threshold and a preset angle) and a calibration compensation method (e.g., a compensation amount and a preset calibration time for each calibration) on the pointing coordinate calculated by using the updated tilt angle each time the tilt angle is updated according to the state of the software program executed by the display device 20 and the resolution of the display device 20.
Specifically, the handheld pointing device 10 may store a plurality of sets of calibration parameters in advance according to the resolution of the display device 20 and the type of the software program.
For example, if the type of software program executed by the display device 20 requires precise control (e.g., dynamic images), a smaller first preset offset threshold and/or a smaller preset angle is set accordingly, so that the handheld pointing device 10 executes the pointing-coordinate correction procedure after updating the tilt angle, thereby improving the pointing accuracy of the handheld pointing device 10.
For another example, if the accuracy required by the type of the software program executed by the display device 20 is low (for example, a still image), a higher first preset offset threshold and/or a higher value of the preset angle are/is correspondingly set, so that the handheld pointing device 10 does not need to correct the pointing coordinate after updating the tilt angle, or does not need to execute the correction procedure of the pointing coordinate every time the tilt angle is updated, thereby reducing the number of times of correction, and further reducing the complexity of calculation of the pointing coordinate every time the handheld pointing device 10 calculates the pointing coordinate.
Then, when the handheld pointing device 10 is started, the handheld pointing device 10 can automatically connect with the display device 20, and obtain the type of the software program currently executed by the display device 20. The handheld pointing device 10 determines whether to perform calibration compensation on the pointing coordinate calculated by the updated tilt angle and selects appropriate calibration parameters according to the type of the software program currently executed by the display device 20. Thereby improving the usability and convenience of the handheld pointing device 10.
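The per-program calibration parameters described above could be stored as a small lookup table; the group names and values below are purely illustrative assumptions, not values from the patent:

```python
# Hypothetical calibration sets: tighter thresholds for precision-
# demanding (dynamic) software so correction runs often; looser ones
# for static software so correction runs rarely.
CALIBRATION = {
    "dynamic": {"offset_threshold": 1.0, "preset_angle": 2.0},
    "static":  {"offset_threshold": 8.0, "preset_angle": 15.0},
}

def select_calibration(program_type):
    # Fall back to the precise set when the program type is unknown.
    return CALIBRATION.get(program_type, CALIBRATION["dynamic"])
```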
In more detail, referring to fig. 2 and fig. 1 at the same time, fig. 2 is a functional block diagram of a handheld pointing device according to an embodiment of the present invention. The handheld pointing device 10 includes an image capturing unit 11, an acceleration unit 12, a processing unit 13, an input unit 14, a storage unit 15, and a communication unit 16. The image capturing unit 11, the acceleration unit 12, the input unit 14, the storage unit 15 and the communication unit 16 are respectively coupled to the processing unit 13.
It should be noted that, in another embodiment, the acceleration unit 12 may be integrated with the image capturing unit 11, and the acceleration unit 12 is coupled in series to the processing unit 13 through the image capturing unit 11. That is, at least one of the image capturing unit 11, the acceleration unit 12, the input unit 14, the storage unit 15 and the communication unit 16 may be coupled to the processing unit 13 in series with another component in other embodiments.
The image capturing unit 11 is configured to capture images of the position of the reference point 21 when the handheld pointing device 10 points toward it, and sequentially generates a plurality of image frames. Specifically, the image capturing unit 11 senses the light generated by the reference point 21 according to a predetermined image sampling frequency (e.g., 200 image frames per second) and sequentially generates a plurality of image frames containing the image of the reference point 21.
The image capturing unit 11 can filter out light outside a specific wavelength band through a filter unit (not shown), so that the image capturing unit 11 only senses light of the specific wavelength emitted from the reference point 21.
In the present embodiment, the image capturing unit 11 may be implemented by a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor; those skilled in the art can choose the implementation according to the actual use situation, which is not limited herein.
The acceleration unit 12 is used for sensing a plurality of acceleration values of the handheld pointing device 10 in multiple axes (e.g., the X-axis, Y-axis, and Z-axis) and generating an acceleration vector. The acceleration unit 12 of the present embodiment can be, for example, a gravity sensor (G-sensor) or an accelerometer, and is built into the handheld pointing device 10, but in other embodiments the acceleration unit 12 can also be implemented as a plug-in module. Those of ordinary skill in the art can design it according to the practical use situation, and the present embodiment is not limited thereto.
The processing unit 13 is configured to receive the image frames generated by the image capturing unit 11 and calculate the imaging position of the reference point 21 in one of the image frames. The processing unit 13 uses the imaging position of the reference point 21 and the first tilt angle currently used by the handheld pointing device 10 to generate a pointing coordinate of the reference point. The processing unit 13 then calculates the cursor positioning on the screen of the display device 20 according to the pointing coordinate of the reference point 21, so as to generate a cursor parameter for controlling the cursor. Subsequently, the processing unit 13 transmits the cursor parameter to the display device 20 through the communication unit 16 by wireless transmission, so as to cooperate with a software program (e.g., game software) executed by the display device 20 to control the motion of the cursor 23 on the display device 20 accordingly.
More specifically, the processing unit 13 determines whether the reference point 21 is displaced, i.e., whether the reference point 21 has moved substantially, according to the image frames. When the processing unit 13 determines that the reference point 21 has not moved significantly, it reads the acceleration unit 12 to obtain the acceleration values generated by the handheld pointing device 10 in multiple axes. The processing unit 13 calculates and updates the first tilt angle currently used by the handheld pointing device 10 to the second tilt angle according to the acceleration values. Then, the processing unit 13 calculates the pointing coordinate of the handheld pointing device 10 corresponding to the display device 20 by using the second tilt angle and the imaging position of the reference point 21 in one of the image frames, so as to calculate the cursor positioning.
It should be noted that, in an embodiment, the processing unit 13 may sense a plurality of acceleration values of the handheld pointing device 10 in the X-axis direction, the Y-axis direction and the Z-axis direction according to the acceleration unit 12, and calculate a current rotation angle of the handheld pointing device 10 by calculating an included angle between any two axis directions, so as to correspondingly update the currently used first tilt angle as the second tilt angle.
When the processing unit 13 determines that the reference point 21 has moved significantly, the processing unit 13 decides not to update the first tilt angle currently used by the handheld pointing device 10, because the acceleration unit 12 cannot accurately measure the tilt of the handheld pointing device 10 while it is moving. The processing unit 13 instead calculates the pointing coordinate of the handheld pointing device 10 corresponding to the display device 20 by using the first tilt angle and the imaging position of the reference point 21 in one of the image frames. The processing unit 13 correspondingly generates the cursor parameters for controlling the cursor 23 according to the calculated pointing coordinate, and transmits the cursor parameters to the display device 20 through the communication unit 16 by wireless transmission.
The way for the processing unit 13 to calculate the tilt angles of the handheld pointing device 10, such as the first tilt angle and the second tilt angle, is further described below.
For example, the image frames generated by the image capturing unit 11 corresponding to the position of the reference point 21 may be rectangular, with the long sides of the image frames parallel to the X axis and the short sides parallel to the Y axis. When the processing unit 13 determines that the reference point 21 has not moved greatly, the processing unit 13 can drive the acceleration unit 12 to sense the acceleration values Vx, Vy, and Vz of the handheld pointing device 10 in the X-axis, Y-axis, and Z-axis directions, respectively, in the three-dimensional space shown in fig. 1. The acceleration unit 12 can generate an acceleration vector V = (Vx, Vy, Vz) according to the sensing result, so as to generate an acceleration sensing signal, wherein the acceleration sensing signal may represent the ratio of any two acceleration values, such as the acceleration value Vx and the acceleration value Vy. The processing unit 13 calculates the current tilt angle of the handheld pointing device 10 when receiving the acceleration sensing signal.
In more detail, the processing unit 13 can calculate the included angle between the acceleration vector V of the handheld pointing device 10 and the axes by using the following equations (1) to (3), so as to obtain the current tilt angle θ of the handheld pointing device 10:

sin θ = Vx / |gxy|  (1)

cos θ = Vy / |gxy|  (2)

|gxy| = √(Vx² + Vy²)  (3)
wherein Vx represents an acceleration value sensed by the acceleration unit 12 in the X axis direction; vy represents an acceleration value sensed by the acceleration unit 12 in the Y axis direction; | gxy | represents a gravitational acceleration value calculated from the acceleration value Vx and the acceleration value Vy.
The processing unit 13 corrects the image frame by substituting the calculation results of equations (1) and (2) into equation (4), such that the coordinate system of the corrected image frame is the same as the coordinate system of the display device 20:

x' = x·cos θ + y·sin θ, y' = −x·sin θ + y·cos θ  (4)
Wherein X represents the X-axis coordinate of the imaging position of the reference point 21 in one of the image frames; y represents the Y-axis coordinate of the imaging position of the reference point 21 in one of the image frames; x' represents the X-axis coordinate of the imaging position of the corrected reference point 21 in one of the image frames; y' represents the Y-axis coordinate of the imaging position of the corrected reference point 21 in one of the image frames. The processing unit 13 may then calculate the pointing coordinates of the handheld pointing device 10 with respect to the reference point 21 or the display device 20 from x 'and y'.
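Assuming the reconstruction of equations (1) to (4) above (sin θ = Vx/|gxy|, cos θ = Vy/|gxy|, then a rotation of the imaging position by θ), the tilt computation and correction might look like the following sketch; the function names are illustrative, not the patent's:

```python
import math

def tilt_angle(vx, vy):
    """Tilt angle theta with sin(theta) = Vx/|gxy| and cos(theta) = Vy/|gxy|
    (equations (1)-(3)); atan2 avoids the explicit division by |gxy|."""
    return math.atan2(vx, vy)

def correct_imaging_position(x, y, theta):
    """Rotate the imaging position (x, y) of the reference point by the
    tilt angle theta so that the corrected coordinate system matches the
    display device (equation (4) as reconstructed above)."""
    x_c = x * math.cos(theta) + y * math.sin(theta)
    y_c = -x * math.sin(theta) + y * math.cos(theta)
    return x_c, y_c
```

With the device level (Vx = 0, Vy = g), the tilt angle is zero and the imaging position passes through unchanged.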
Then, the processing unit 13 can calculate the cursor location according to the calculated pointing coordinates to correspondingly generate the cursor parameters for controlling the cursor 23 on the display device 20. Then, the processing unit 13 transmits the cursor parameter or the relative movement vector information corresponding to the cursor 23 to the display device 20 by using the communication unit 16, so as to control the motion of the cursor 23 on the display device 20.
It should be noted that, as those skilled in the art will understand, the acceleration unit 12 of the handheld pointing device 10 of the present invention can also sense acceleration values in only two dimensions, for example, only the acceleration values Vx and Vy. In other words, the acceleration sensing method of the handheld pointing device 10 described above is only one embodiment, and the invention is not limited thereto. In addition, calculating the pointing coordinate of the handheld pointing device 10 on the screen of the display device 20 according to the imaging position of one or more reference points in the captured image belongs to the prior art and is not a main improvement of the present invention, and therefore is not described here again.
The input unit 14 allows a user of the handheld pointing device 10 to set the image sampling frequency and the calibration parameters, such as a preset calibration time, a number of calibrations, and a compensation amount for each cursor-position calibration. For example, the user may set the image sampling frequency of the reference point according to the preset calibration time, and set the number of cursor calibrations according to the image sampling frequency. For another example, the user can determine the number of cursor calibrations according to a preset image sampling frequency. The image sampling frequency may be set according to the frame update frequency of the display device 20.
In practice, the input unit 14 may be a key interface (keypad), an optical finger navigation device, or a button for activating the display device 20 to display a setting interface, allowing the user to set the preset calibration time, the image sampling frequency, and/or the number of cursor calibrations. If the handheld pointing device 10 has its own display screen (not shown), the preset calibration time, the image sampling frequency, the number of cursor calibrations, and/or the compensation amount for each calibration can be displayed on that screen, which can also be a touch screen.
The storage unit 15 is configured to store parameters required by the operation of the handheld pointing device 10, such as the first pointing coordinate, the second pointing coordinate, the third pointing coordinate, the first tilt angle, the second tilt angle, the first preset offset threshold, the preset angle, and the cursor parameter. The storage unit 15 can also store the preset calibration time, the image sampling frequency and the calibration times of the cursor according to the operation requirement of the handheld pointing device 10.
In this embodiment, the processing unit 13 may be implemented by program code executed on a processing chip such as a microcontroller or an embedded controller, but the embodiment is not limited thereto. The storage unit 15 can be implemented by a volatile or non-volatile memory chip, such as a flash memory chip, a read-only memory chip, or a random-access memory chip, but the embodiment is not limited thereto. The communication unit 16 may transmit the motion vector information to the display device 20 by Bluetooth transmission, but the embodiment is not limited thereto.
It should be noted that the internal components of the handheld pointing device 10 of the present embodiment can be added, removed, adjusted or replaced according to the functional and design requirements, and the invention is not limited thereto. In other words, the types, physical architectures, implementations and/or connection manners of the image capturing unit 11, the acceleration unit 12, the processing unit 13, the input unit 14, the storage unit 15 and the communication unit 16 are set according to the types, physical architectures, implementations and/or operational requirements of the handheld pointing device 10, and the embodiment is not limited thereto.
The present embodiment further provides a cursor positioning method for the handheld pointing device 10 after updating the tilt angle, so as to describe the operation of the handheld pointing device 10 more specifically. Please refer to fig. 3 and fig. 1, fig. 2, and fig. 4A to fig. 4D. Fig. 3 is a schematic flowchart illustrating a cursor positioning method of a handheld pointing device according to an embodiment of the present invention. Fig. 4A to 4B are schematic diagrams illustrating positions of reference points sensed by a handheld pointing device according to an embodiment of the invention. Fig. 4C is a schematic diagram illustrating a position change of a reference point calculated by the handheld pointing device according to different tilt angles according to the embodiment of the invention. Fig. 4D is a schematic diagram illustrating a relative relationship between a reference point position and a cursor position on a display device according to an embodiment of the present invention.
In step S301, when the processing unit 13 of the handheld pointing device 10 updates the currently used first tilt angle θ 1 to be the second tilt angle θ 2, the first image frame F1 corresponding to the reference point 21 is captured.
Specifically, the processing unit 13 may determine whether to update the first tilt angle θ 1 used by the handheld pointing device 10 to the second tilt angle θ 2 by determining, from the plurality of image frames generated by the image capturing unit 11 capturing the position of the reference point 21, whether the reference point 21 moves greatly across consecutive image frames.
In one embodiment, the processing unit 13 may update the currently used first tilt angle θ 1 to the second tilt angle θ 2 when it calculates that the displacement variation of the imaging position of the reference point 21 between any two consecutive image frames captured by the handheld pointing device 10 is smaller than a preset displacement threshold (e.g., 1 pixel). In another embodiment, the processing unit 13 may update the first tilt angle θ 1 to the second tilt angle θ 2 when it calculates that the speed variation of the imaging position of the reference point 21 between any two consecutive image frames is smaller than a preset speed threshold (e.g., 1 pixel per unit time). In yet another embodiment, the processing unit 13 may update the first tilt angle θ 1 to the second tilt angle θ 2 when it senses that the magnitude of the acceleration vector formed by the acceleration values of the handheld pointing device 10 in the multiple axes equals the gravitational acceleration value (g) of the handheld pointing device 10.
In other words, when the processing unit 13 determines that the reference point 21 has not moved greatly (i.e., the handheld pointing device 10 is currently in a stationary state), it actively reads the acceleration unit 12 to sense a plurality of acceleration values of the handheld pointing device 10 in a plurality of axial directions (e.g., the X-axis, Y-axis, and Z-axis directions), so as to update the currently used first tilt angle θ 1 to the second tilt angle θ 2.
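The stillness tests above (displacement, speed, acceleration magnitude) can be sketched as follows. The text presents them as alternative embodiments, while this sketch conservatively requires all three; the tolerance and threshold values mirror the examples in the text or are assumptions:

```python
import math

DISP_THRESHOLD = 1.0    # pixels between consecutive imaging positions
SPEED_THRESHOLD = 1.0   # pixels per unit time
G = 9.8                 # gravitational acceleration value

def is_stationary(p_prev, p_curr, dt, accel, tol=0.05):
    """Decide whether the tilt angle may be updated: the reference-point
    displacement and speed between consecutive frames are small, and the
    acceleration vector's magnitude is close to gravity alone."""
    disp = math.hypot(p_curr[0] - p_prev[0], p_curr[1] - p_prev[1])
    speed = disp / dt
    a_mag = math.sqrt(sum(a * a for a in accel))
    return (disp < DISP_THRESHOLD
            and speed < SPEED_THRESHOLD
            and abs(a_mag - G) < tol * G)
```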
In step S303, the processing unit 13 calculates a first pointing coordinate P1 according to the imaging position of the reference point 21 in the first image frame F1 and the first tilt angle θ 1. As shown in fig. 4A, the first pointing coordinate P1 indicates the pointing position vector of the handheld pointing device 10 corresponding to the display device 20 in the captured first image frame F1, and P1 is (x1, y1).
The processing unit 13 calculates the cursor positioning according to the first pointing coordinate, and correspondingly generates the cursor parameter for positioning the cursor 23 on the display device 20. Subsequently, the processing unit 13 wirelessly transmits the cursor parameter to the display device 20 through the communication unit 16 to correspondingly control the position of the cursor 23 on the display device 20.
Incidentally, the first pointing coordinate P1 is calculated as follows. First, the processing unit 13 defines the operation range 111, which maps the first image frame F1 onto the display device 20, according to the center point "+" of the first image frame F1 and the imaging position of the reference point image 113 in the first image frame F1. The operation range 111 corresponds to the display device 20 at a predetermined display scale, and the processing unit 13 defines it in the first image frame F1 according to the predetermined display scale, using the reference point image 113 as the origin. The processing unit 13 can further define the center point 1111 of the operation range 111 and, using the center point 1111 as the origin and the first tilt angle θ 1 of the handheld pointing device 10, calculate the pointing position vector of the center point "+" of the first image frame F1 within the operation range 111 by using the above equations (1) to (4), so as to obtain the first pointing coordinate P1.
It is noted that defining the center point 1111 is not necessary for obtaining the first pointing coordinate P1; the corresponding rotation angle can be calculated directly from the relative relationship between the center point "+" of the first image frame F1 and the imaging position of the reference point image 113 in the first image frame F1, or from the imaging characteristics of the reference point image 113, so as to obtain the first pointing coordinate P1. In this embodiment, the center point "+" is the center of the sensing array in the image capturing unit 11. In other words, the first pointing coordinate P1 indicates the pointing coordinate position of the center of the sensing array (i.e., the center point "+") in the first image frame F1 corresponding to the coordinate system of the display device 20.
In step S305, the processing unit 13 calculates a second pointing coordinate P2 according to the imaging position of the reference point 21 in the first image frame F1 and the second tilt angle θ 2. As shown in fig. 4B, the second pointing coordinate P2 indicates the pointing position vector of the center point "+" of the sensing array in the image capturing unit 11 mapped onto the operation range 111a of the first image frame F1 on the display device 20, and P2 is (x2, y2). The second pointing coordinate P2 is obtained by the processing unit 13 using the center point 1111a of the operation range 111a as the origin and the second tilt angle θ 2 to calculate the pointing position vector of the center point "+" of the first image frame F1 within the operation range 111a, wherein the operation range 111a is defined according to the reference point image 113a.
The processing unit 13 can accordingly calculate, from the first pointing coordinate P1 and the second pointing coordinate P2, a first offset vector ΔS (i.e., the offset P2 − P1 of the pointing coordinate within the same image frame after updating the tilt angle), as shown in fig. 4C. The processing unit 13 also stores the first offset vector ΔS in the storage unit 15.
In step S307, the processing unit 13 captures the second image frame F2 containing the reference point 21, so as to calculate a third pointing coordinate P3 according to the imaging position of the reference point 21 in the second image frame F2 and the second tilt angle θ 2. The capturing time of the second image frame F2 is later than that of the first image frame F1. As shown in fig. 4D, the third pointing coordinate P3 indicates the pointing position vector of the center point "+" of the sensing array in the image capturing unit 11 mapped onto the operation range 111b of the second image frame F2 on the display device 20, and P3 is (x3, y3), wherein the operation range 111b is defined according to the reference point image 113b.
Subsequently, in step S309, the processing unit 13 determines whether the angle difference θ d between the first tilt angle θ 1 and the second tilt angle θ 2 is smaller than a preset angle (e.g., 20 degrees). When the processing unit 13 determines that the angle difference θ d is smaller than the preset angle, step S311 is performed. On the contrary, when the processing unit 13 determines that the angle difference θ d is greater than the preset angle, step S313 is executed.
In step S311, the processing unit 13 determines whether the first offset vector ΔS between the first pointing coordinate P1 and the second pointing coordinate P2 is smaller than a first preset offset threshold (e.g., 10 pixels). When the processing unit 13 determines that the first offset vector ΔS is smaller than the first preset offset threshold, step S315 is executed. On the contrary, when the processing unit 13 determines that the first offset vector ΔS is greater than the first preset offset threshold, step S313 is executed. The first preset offset threshold may be set according to a preset angle difference, for example, the pixel offset corresponding to an angle difference of 20 degrees.
In step S313, the processing unit 13 calculates the cursor positioning according to the third pointing coordinate P3, the first pointing coordinate P1, and the second pointing coordinate P2. Specifically, the processing unit 13 generates a compensated third pointing coordinate P3' according to the third pointing coordinate P3 and the first offset vector ΔS. The processing unit 13 then calculates the cursor positioning according to the compensated third pointing coordinate P3', so as to compensate for the offset between the first pointing coordinate P1 and the second pointing coordinate P2.

The compensated third pointing coordinate P3' is calculated by the following equation (5):

P3' = P3 − ΔS  (5)

wherein P3' represents the compensated third pointing coordinate, P3 represents the third pointing coordinate, and ΔS represents the first offset vector.
In step S315, the processing unit 13 calculates the cursor positioning directly from the third pointing coordinate P3. In other words, when the processing unit 13 determines that the angle difference θ d is smaller than the preset angle and the first offset vector ΔS is smaller than the first preset offset threshold, the processing unit 13 does not compensate the third pointing coordinate P3, but directly calculates the cursor positioning from the third pointing coordinate P3.
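The decision in steps S309 to S315 can be sketched as follows, assuming the compensation of equation (5) subtracts the offset P2 − P1 introduced by the tilt-angle update (the subtraction direction is an assumption chosen so that the compensation cancels the cursor jump; the branch ordering follows the condition stated for step S315):

```python
PRESET_ANGLE = 20.0        # degrees, example value from the text
OFFSET_THRESHOLD = 10.0    # pixels, example value from the text

def compensate_pointing(p3, p1, p2, theta1, theta2):
    """Return the pointing coordinate used for cursor positioning.
    No compensation (step S315) only when both the angle difference and
    the first offset vector are small; otherwise apply the compensated
    third pointing coordinate P3' = P3 - (P2 - P1) (step S313)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]        # first offset vector
    offset = (dx * dx + dy * dy) ** 0.5
    angle_diff = abs(theta2 - theta1)
    if angle_diff < PRESET_ANGLE and offset < OFFSET_THRESHOLD:
        return p3                                 # step S315: use P3 directly
    return (p3[0] - dx, p3[1] - dy)               # step S313: compensated P3'
```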
Subsequently, in step S317, the processing unit 13 generates cursor parameters for controlling the movement of the cursor 23 according to the calculation result of the cursor positioning in step S313 or step S315. The processing unit 13 wirelessly transmits the cursor parameters to the display device 20 by using the communication unit 16 to correspondingly control the movement of the cursor 23.
It is worth mentioning that, as shown in fig. 4D, the third pointing coordinate P3 is located within the operation range 111b of the second image frame F2, so that the display device 20, upon receiving the cursor parameter, sets the display position of the cursor 23 on the screen according to the display ratio. Therefore, when the handheld pointing device 10 drives the communication unit 16 to transmit the cursor parameter and the predetermined display ratio for controlling the cursor 23 to the display device 20, the display device 20 calculates the display position of the cursor 23 on its screen according to the current display ratio (i.e., the resolution of the display device 20). Those skilled in the art will understand how the display device 20 calculates the position of the cursor 23 on its screen from the current display ratio and the cursor parameter, so the description is omitted here.
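The display-ratio scaling can be illustrated with a minimal sketch; the function and parameter names are hypothetical, and the operation-range size and resolution values below are examples only:

```python
def cursor_screen_position(pointing, op_range_size, resolution):
    """Scale a pointing coordinate, given relative to the operation range,
    onto the display device's pixel grid (the display-ratio scaling)."""
    px = pointing[0] * resolution[0] / op_range_size[0]
    py = pointing[1] * resolution[1] / op_range_size[1]
    return (round(px), round(py))
```

For example, a pointing coordinate at the middle of a 100-unit-wide operation range maps to the horizontal center of a 1920-pixel-wide screen.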
Specifically, the reference point images 113, 113a, and 113b are each represented by a dot as shown in figs. 4A to 4C, but they can also be represented by a cross or an asterisk, and the embodiment is not limited thereto. In addition, if the interactive system in fig. 1 uses two or more reference points 21, the average coordinate of the positions of the reference point images in the image frame can serve as the positions of the reference point images 113, 113a, and 113b. The processing unit 13 of the handheld pointing device 10 can also calculate the ratio between the imaging distance of the reference point image and a preset imaging distance parameter, so as to compensate for the positioning deviation caused by differences in shooting distance. Those skilled in the art will know how to set the preset imaging parameters and the preset imaging distance parameter and how to use them to compensate the position calculation results of the reference point images 113, 113a, and 113b, so the details are not repeated herein.
To illustrate the operation of the cursor positioning method of the handheld pointing device 10 more clearly, please refer to fig. 5 together with fig. 1; fig. 5 is a schematic diagram illustrating the position change of the cursor on the display screen according to an embodiment of the present invention.
The position of the cursor 23a corresponds to the pointing coordinate of the handheld pointing device 10 calculated with the first tilt angle θ 1 at the time point TA. The position of the cursor 23b corresponds to the pointing coordinate calculated with the first tilt angle θ 1 at the time point TB. The position of the cursor 23c corresponds to the pointing coordinate calculated with the first tilt angle θ 1 at the time point TC. At the time point TC, the handheld pointing device 10 also updates the first tilt angle θ 1 to the second tilt angle θ 2, and calculates the first offset vector ΔS between the first pointing coordinate P1 and the second pointing coordinate P2 according to the first tilt angle θ 1 and the second tilt angle θ 2. The position of the cursor 23d corresponds to the compensated third pointing coordinate P3' calculated by the handheld pointing device 10 at the time point TD using the second tilt angle θ 2 and the first offset vector ΔS recorded after updating the tilt angle.
The position of the cursor 25a corresponds to the pointing coordinate of the handheld pointing device 10 calculated with the second tilt angle θ 2. That is, after the handheld pointing device 10 updates the tilt angle, if the calculated pointing coordinate is not compensated, the cursor appears at the position of the cursor 25a on the display device 20. As shown in fig. 5, without compensation the cursor would jump from the position of the cursor 23c to the position of the cursor 25a, degrading the user's operating experience.
Therefore, when the cursor positioning is calculated using the method of this embodiment, the offset introduced into the pointing coordinate by the tilt-angle update is compensated, so that the cursor moves the distance d from the position of the cursor 25a to the position of the cursor 23d, effectively avoiding the jumping condition.
In summary, after the first tilt angle θ 1 in use is updated to the second tilt angle θ 2, the handheld pointing device 10 of the present embodiment can determine whether the pointing coordinate calculated with the second tilt angle θ 2 needs compensation (e.g., whether the cursor jumping would be noticeable on the display device 20). When the handheld pointing device 10 determines to compensate the pointing coordinate calculated with the second tilt angle θ 2, the subsequently calculated pointing coordinates are compensated by the offset between the pointing coordinates calculated with the first tilt angle θ 1 and the second tilt angle θ 2.
To improve the user's operating experience and accurately control the cursor movement, the present embodiment further provides a cursor positioning correction compensation method. This method makes the cursor move smoothly, within a preset calibration time or a preset number of calibrations, from the movement path before the tilt angle was updated to the actual current movement path of the handheld pointing device 10, so as to avoid the jumping condition while maintaining pointing accuracy.
The following will further describe the details of the execution flow of the cursor positioning correction compensation method. Referring to fig. 6 and fig. 7 in combination with fig. 2, fig. 6 is a schematic flow chart illustrating a cursor positioning correction method of a handheld pointing device according to an embodiment of the present invention. Fig. 7 is a schematic diagram illustrating a position change of a moving cursor on a display screen of a handheld pointing device according to an embodiment of the present invention.
In step S601, when the processing unit 13 updates the first tilt angle θ 1 to the second tilt angle θ 2, the processing unit 13 immediately starts a cursor calibration procedure to enable the handheld pointing device 10 to enter a cursor calibration mode.
In step S603, the processing unit 13 sets the preset correction count N, the compensation vector C, and the calibration coordinate P_cal. The calibration coordinate P_cal is the pointing coordinate to be compensated, for example, a third pointing coordinate P3 calculated according to the imaging position of the reference point (not shown) in the captured second image frame F2 and the second tilt angle θ2. The processing unit 13 then temporarily stores the correction count N, the compensation vector C, and the calibration coordinate P_cal in the storage unit 15.
The processing unit 13 determines whether the first offset vector ΔS1 is greater than a second preset offset threshold. If the processing unit 13 determines that the first offset vector ΔS1 is greater than the second preset offset threshold, it sets N to ΔS1 divided by C, where C is a preset compensation value; if the processing unit 13 determines that the first offset vector ΔS1 is smaller than the second preset offset threshold, it sets C to ΔS1 divided by N, where N is the preset correction count.
It should be noted that the second predetermined offset threshold and the first predetermined offset threshold may be set to be the same or different according to the actual operation requirement of the handheld pointing device 10 and the type of software program executed by the display device 20.
In short, when the first offset vector ΔS1 is greater than the second preset offset threshold, the angle change is large and the required compensation is large, so the processing unit 13 automatically chooses to correct the pointing coordinate gradually with a fixed compensation amount per step, avoiding a jumping-point condition that would degrade the user's operating feel. When the first offset vector ΔS1 is smaller than the second preset offset threshold, the angle change is small, and the processing unit 13 automatically chooses to correct the pointing coordinate within the preset correction count.
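The two branches above (fixed per-step compensation amount when the offset is large, fixed step count when it is small) can be sketched as follows. This is an illustrative sketch: the function and parameter names are not taken from the patent, and the preset values are the examples used later in the text.

```python
import math

def plan_calibration(offset, threshold, preset_c=3.0, preset_n=5):
    """Split a pointing-coordinate offset into per-frame compensation steps.

    offset:    first offset vector (dx, dy) between the pointing coordinates
               computed with the old and new tilt angles.
    threshold: the second preset offset threshold.
    preset_c:  preset per-step compensation magnitude (large-offset branch).
    preset_n:  preset correction count (small-offset branch).
    Returns (n, c): the number of correction steps and the per-step vector C.
    """
    magnitude = math.hypot(offset[0], offset[1])
    if magnitude > threshold:
        # Large angle change: keep the per-step amount fixed, vary the count.
        n = max(1, round(magnitude / preset_c))
    else:
        # Small angle change: keep the count fixed, vary the per-step amount.
        n = preset_n
    return n, (offset[0] / n, offset[1] / n)
```

For an offset of (30, 40) pixels with a threshold of 10 and a 5-pixel step, this yields 10 steps of (3, 4) pixels each; a small offset of (3, 4) pixels is instead spread over the preset 5 steps.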
Then, when the processing unit 13 determines to divide the first offset vector ΔS1 by N to obtain C, the processing unit 13 may calculate C by the following formula:

C = ΔS1 / N = (P1 - P2) / N    (6)

where C represents the compensation vector; ΔS1 represents the first offset vector; P1 represents the first pointing coordinate; P2 represents the second pointing coordinate; and N represents the preset correction count, a preset fixed value. As equation (6) shows, a larger N means each compensation vector C is smaller, and a smaller N means each compensation vector C is larger.
For example, the processing unit 13 may set N according to an image sampling frequency or a preset time input by the user through an operation interface provided by the input unit 14. In one embodiment, if the user sets the cursor calibration procedure to complete within five image frames according to the image sampling frequency, the processing unit 13 correspondingly sets N to 5 and then calculates C from N and the first offset vector ΔS1. In another embodiment, if the user sets the preset calibration time to 5 seconds (i.e., the handheld pointing device 10 completes the cursor calibration procedure within 5 seconds) and the image sampling frequency is 5 frames per second, the processing unit 13 correspondingly sets N to 25 and calculates C from N and the first offset vector ΔS1.
When the processing unit 13 determines to divide the first offset vector ΔS1 by C to obtain N, the processing unit 13 may calculate N by the following formula:

N = ΔS1 / C = (P1 - P2) / C    (7)

where C represents the compensation vector and is a preset fixed value; ΔS1 represents the first offset vector; P1 represents the first pointing coordinate; P2 represents the second pointing coordinate; and N represents the correction count. As equation (7) shows, a larger C means a smaller correction count N, and a smaller C means a larger correction count N.
For example, the processing unit 13 may set C according to the resolution of the display device 20 input by the user through the operation interface provided by the input unit 14. In one embodiment, the user may set each correction to cover one degree, where, given the resolution of the display device 20, each degree corresponds to three pixel units; the processing unit 13 then sets C to 3 and calculates N from C and the first offset vector ΔS1.
In addition, as mentioned above, the user of the handheld pointing device 10 can also set N and C through the input unit 14 according to the required accuracy of the type of software program executed by the display device 20 or the resolution of the display device 20.
In step S605, the processing unit 13 determines whether the handheld pointing device 10 needs to update the second tilt angle θ2 to a third tilt angle θ3. When the processing unit 13 determines that the handheld pointing device 10 needs to update the second tilt angle θ2 to the third tilt angle θ3, step S607 is executed. Otherwise, when the processing unit 13 determines that the handheld pointing device 10 does not update the second tilt angle θ2 to the third tilt angle θ3 (i.e., the handheld pointing device 10 maintains its current rotation angle or simply continues moving), step S611 is executed.
In step S607, the processing unit 13 calculates a second offset vector ΔS2 generated by the current rotation of the handheld pointing device 10. In detail, the processing unit 13 may first retrieve a third image frame F3. The processing unit 13 calculates a fourth pointing coordinate P4 and a fifth pointing coordinate P5 according to the imaging position of the reference point 21 in the third image frame F3 and, respectively, the second tilt angle θ2 and the third tilt angle θ3. The processing unit 13 then calculates the second offset vector ΔS2 from the fourth pointing coordinate P4 and the fifth pointing coordinate P5. The capturing time of the third image frame F3 is later than that of the second image frame F2.
In other words, during the calibration procedure the processing unit 13 actively determines whether the user has rotated the handheld pointing device 10 again so as to produce a new tilt angle, and corrects the cursor position for the additional offset between the second tilt angle θ2 and the third tilt angle θ3, thereby improving the pointing accuracy of the handheld pointing device 10 and avoiding the jumping-point condition.
In step S609, when the processing unit 13 determines that the tilt angle has been updated again, the processing unit 13 calculates the compensated pointing coordinate, e.g., a compensated third pointing coordinate P3', from the calibration coordinate P_cal, the second offset vector ΔS2, and C. The compensated pointing coordinate P3' is calculated as follows:

P3' = P_cal + C + ΔS2 = P3 + (P1 - P2) / N + (P4 - P5)    (8)

where P3' represents the compensated pointing coordinate; P_cal represents the calibration coordinate; C represents the compensation vector; P1 represents the first pointing coordinate; P2 represents the second pointing coordinate; P3 represents the third pointing coordinate; P4 represents the fourth pointing coordinate; P5 represents the fifth pointing coordinate; and N represents the correction count.
In step S611, when the processing unit 13 determines that the tilt angle is not updated again, the processing unit 13 calculates the compensated pointing coordinate, e.g., a compensated third pointing coordinate P3', from the calibration coordinate P_cal and C as follows:

P3' = P_cal + C = P3 + (P1 - P2) / N    (9)

where P3' represents the compensated pointing coordinate; P_cal represents the calibration coordinate; C represents the compensation vector; P1 represents the first pointing coordinate; P2 represents the second pointing coordinate; P3 represents the third pointing coordinate; and N represents the correction count.
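A minimal sketch of the two compensation branches (steps S609 and S611): the compensated coordinate is the current frame's calibration coordinate plus the per-step compensation C, plus the extra offset ΔS2 when the tilt angle changed again mid-calibration. Function and variable names are illustrative, not from the patent.

```python
def compensated_coordinate(p_cal, c, delta_s2=None):
    """Apply one calibration step to the pointing coordinate.

    p_cal:    calibration coordinate (the uncompensated pointing
              coordinate of the current frame).
    c:        per-step compensation vector, (p1 - p2) / n.
    delta_s2: second offset vector (p4 - p5) if the tilt angle was
              updated again during calibration (step S609), else None
              (step S611).
    """
    x, y = p_cal[0] + c[0], p_cal[1] + c[1]
    if delta_s2 is not None:
        # Step S609: fold in the offset caused by the further rotation.
        x, y = x + delta_s2[0], y + delta_s2[1]
    return (x, y)
```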
In step S613, the processing unit 13 generates and outputs, according to the compensated pointing coordinate P3', a cursor parameter for controlling the cursor on the display device 20. The processing unit 13 outputs the cursor parameter so that the cursor on the display device 20 translates by a distance d1 from its original position (the position of the cursor 33a in fig. 7) to the position of the cursor 33b in fig. 7. The position of the cursor 33a corresponds to the pointing coordinate calculated by the handheld pointing device 10 using the first tilt angle θ1. The position of the cursor 35a corresponds to the pointing coordinate calculated by the handheld pointing device 10 using the second tilt angle θ2.
Meanwhile, in step S615, the processing unit 13 resets the calibration coordinate P_cal to a new pointing coordinate, for example, a sixth pointing coordinate, calculated according to the imaging position of the reference point 21 in a captured fourth image frame F4 and the second tilt angle θ2, or the third tilt angle θ3 if the handheld pointing device 10 has updated the second tilt angle θ2 to the third tilt angle θ3. In step S617, the processing unit 13 executes N = N - 1 (i.e., decrements the correction count). The processing unit 13 stores the calibration coordinate P_cal and the decremented N in the storage unit 15. The processing unit 13 then determines in step S619 whether N equals zero, i.e., whether the cursor calibration procedure is completed.
If the processing unit 13 determines that N equals zero, i.e., the cursor calibration procedure is completed, step S621 is executed. Otherwise, if the processing unit 13 determines that N is not zero, i.e., the cursor calibration procedure is not completed, the processing unit 13 returns to step S605. That is, the handheld pointing device 10 captures the imaging position of the reference point 21 in a fifth image frame F5, calculates a seventh pointing coordinate of the handheld pointing device 10 relative to the reference point 21 from that imaging position and the second tilt angle θ2 or the third tilt angle θ3, sets the seventh pointing coordinate as the calibration coordinate, and calculates the compensated pointing coordinate from the calibration coordinate and C, so that the cursor on the display device 20 translates a further distance d2 from the position of the cursor 33b in fig. 7 to the position of the cursor 33c.
Then, the processing unit 13 re-executes steps S605 to S619, sequentially retrieving the remaining N - 2 image frames (not shown) to calculate and continuously compensate the subsequent pointing coordinates, computing the cursor position from each compensated pointing coordinate, and so on until N equals zero.
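Under one plausible reading of the calibration loop (steps S605 to S619), the residual compensation starts at the full first offset vector and shrinks by C each frame, so that after N frames the cursor has converged on the uncompensated coordinate for the current tilt angle. The sketch below is written under that assumption, with illustrative names; it ignores the mid-calibration tilt update of step S609 for brevity.

```python
def calibration_path(raw_coords, delta_s1, n):
    """Cursor positions emitted during the calibration loop, assuming the
    residual compensation shrinks by C = delta_s1 / n per frame.

    raw_coords: per-frame pointing coordinates computed with the updated
                tilt angle (the per-frame calibration coordinates).
    delta_s1:   first offset vector between the old-angle and new-angle
                pointing coordinates.
    """
    cx, cy = delta_s1[0] / n, delta_s1[1] / n  # per-step compensation C
    path = []
    for k, (px, py) in enumerate(raw_coords[:n]):
        remaining = n - 1 - k                  # N is decremented each frame (S617)
        path.append((px + cx * remaining, py + cy * remaining))
    return path
```

For a stationary device (raw coordinate fixed at (10, 10)) with delta_s1 = (10, 0) and n = 5, the cursor steps through (18, 10), (16, 10), (14, 10), (12, 10), (10, 10): a smooth two-pixel move per frame instead of a single eight-pixel jump.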
When the processing unit 13 completes the cursor calibration procedure, the cursor on the display device 20, as shown in fig. 7, has moved N times from the position corresponding to the first pointing coordinate P1 to the current pointing position of the handheld pointing device 10. In other words, the cursor on the display device 20 translates by distances d1, d2, d3, ..., dN from the position corresponding to the first pointing coordinate P1 (e.g., the position of the cursor 33a) to the position pointed at after the Nth image frame, e.g., the position of the cursor 33n.
Then, in step S621, the processing unit 13 directly calculates the cursor location according to the imaging position of the reference point 21 in the captured image frame and the currently used tilt angle during the subsequent movement, so as to improve the accuracy of cursor control.
It should be noted that if the handheld pointing device 10 keeps updating its tilt angle during cursor calibration, the processing unit 13 correspondingly adjusts the calibration compensation amount C in an accumulated manner so as to maintain the pointing accuracy of the handheld pointing device 10. The processing unit 13 may also actively generate an offset vector from the angle difference between the pre-update and post-update tilt angles and/or from the post-update tilt angle itself, and determine whether to add this offset vector to the compensation calculation of the pointing coordinate.
In addition, the processing unit 13 can also communicate with the display device 20 through the communication unit 16 during operation to obtain information related to the software program executed by the display device 20, such as the type and status of the software program, the frame update frequency, and the resolution at which the display device 20 executes the software program. The processing unit 13 can then determine, from the obtained information, whether to perform the cursor calibration procedure and how to set the calibration parameters used in it. The calibration parameters include the preset offset thresholds (e.g., the first and second preset offset thresholds), the preset angle, the correction count, the correction time, and the compensation amount per correction.
In practice, the program code corresponding to the cursor positioning method of fig. 3 and the cursor positioning correction calculation method of fig. 6 may be designed on a microcontroller or an embedded controller to execute the cursor positioning method of fig. 3 and the cursor positioning correction calculation method of fig. 6 when the processing unit 13 is in operation, but the embodiment is not limited thereto.
It should be noted that fig. 3 is only used to describe a cursor positioning method of the handheld pointing device 10, and thus fig. 3 is not intended to limit the invention. Similarly, fig. 6 is only used to describe one embodiment of the cursor positioning correction of the handheld pointing device 10, and is not intended to limit the invention. Fig. 4A to 4D are only used to illustrate the calculation of the pointing coordinate of the handheld pointing device 10 and the relationship between the operation range of the corresponding display device 20 and the center point "+" of the sensing array in the image capturing unit 11, and are not intended to limit the present invention. Fig. 5 and fig. 7 are only used to describe the operation of the handheld pointing device 10 and the cursor positioning correction calculation with reference to fig. 3 and fig. 6, respectively, and are not intended to limit the present invention.
[ Another embodiment of a Cursor positioning method for a hand-held pointing device ]
From the above embodiments, the present invention can also be summarized as a cursor positioning method, which can be applied to the handheld pointing device in the interactive system of the above embodiments. Referring to fig. 8 and fig. 1 and 2 at the same time, fig. 8 is a schematic flow chart illustrating a method for positioning a cursor of a handheld pointing device according to another embodiment of the present invention. The cursor positioning method of fig. 8 can be implemented by firmware programming and executed by the processing unit 13 of the handheld pointing device 10.
First, in step S801, the processing unit 13 of the handheld pointing device 10 determines whether to update the currently used first tilt angle to the second tilt angle. When the processing unit 13 determines to update the currently used first tilt angle to the second tilt angle, step S803 is executed. On the contrary, when the processing unit 13 determines that the currently used first tilt angle is not updated to the second tilt angle, the process returns to step S801.
The handheld pointing device 10 can determine whether the reference point 21 has moved significantly across the image frames sequentially generated by the image capturing unit 11. When the determination shows that the reference point 21 has not moved significantly (e.g., the handheld pointing device 10 is in a static state), the handheld pointing device 10 decides to update the currently used first tilt angle to the second tilt angle.
Additionally, in other embodiments, the handheld pointing device 10 may decide whether to update the currently used first tilt angle to the second tilt angle by determining whether the pointing coordinate calculated from the reference point 21 has moved significantly. For example, when the handheld pointing device 10 determines that this pointing coordinate has not moved significantly, it updates the currently used first tilt angle to the second tilt angle.
Next, in step S803, when the processing unit 13 determines to update the currently used first tilt angle to the second tilt angle, the processing unit 13 drives the image capturing unit 11 to capture the first image frame of the reference point 21.
Thereafter, in step S805, the processing unit 13 calculates an angle difference between the first tilt angle and the second tilt angle.
In step S807, the processing unit 13 determines whether the angle difference between the first tilt angle and the second tilt angle is smaller than a preset angle, such as 20 degrees. When the processing unit 13 determines that the angle difference between the first tilt angle and the second tilt angle is smaller than the preset angle, step S809 is executed. Otherwise, when the processing unit 13 determines that the angle difference is not smaller than the preset angle, step S811 is executed.
In step S809, the processing unit 13 drives the image capturing unit 11 to capture a second image frame corresponding to the reference point 21, the capturing time of which is later than that of the first image frame. The processing unit 13 then calculates the cursor position directly from the imaging position of the reference point 21 in the second image frame and the second tilt angle. That is, when the angle difference between the first tilt angle and the second tilt angle is smaller than the preset angle, e.g., 20 degrees, the processing unit 13 determines that the jumping point would not be noticeable and uses the uncompensated pointing coordinate derived from the imaging position of the reference point 21 in the second image frame and the second tilt angle.
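The branch taken in steps S807 through S811 reduces to a single threshold test. The sketch below assumes the 20-degree example value from the text; the function name is illustrative.

```python
def needs_compensation(theta1_deg, theta2_deg, preset_angle_deg=20.0):
    """True when the tilt-angle change is large enough for the resulting
    cursor jump to be noticeable (the step S811 branch); False when the
    uncompensated coordinate can be used directly (step S809)."""
    return abs(theta2_deg - theta1_deg) >= preset_angle_deg
```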
In step S811, the processing unit 13 calculates a first pointing coordinate according to the imaging position of the reference point 21 in the first image frame and the first tilt angle. In step S813, the processing unit 13 calculates a second pointing coordinate according to the imaging position of the reference point 21 in the first image frame and the second tilt angle. The detailed calculation of the first and second pointing coordinates is the same as in the foregoing embodiment and is therefore not repeated.
Then, in step S815, during subsequent movement of the handheld pointing device 10, the processing unit 13 performs the cursor positioning calculation based on the offset between the first pointing coordinate and the second pointing coordinate together with the pointing coordinates calculated as the device moves.
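One simple form step S815 could take, assuming the full first-to-second-coordinate offset is applied to each subsequently calculated coordinate (in practice the correction and compensation method of fig. 6 would spread the offset over several frames); the names are illustrative.

```python
def offset_corrected(raw_coord, p1, p2):
    """Shift a subsequently calculated pointing coordinate by the gap
    between the first (old-angle) and second (new-angle) pointing
    coordinates computed from the same image frame."""
    return (raw_coord[0] + (p1[0] - p2[0]),
            raw_coord[1] + (p1[1] - p2[1]))
```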
In step S817, the processing unit 13 correspondingly generates cursor parameters for controlling the cursor 23 to move on the screen of the display device 20 according to the calculation result of cursor positioning in step S809 or step S815.
Subsequently, the processing unit 13 wirelessly transmits the cursor parameter to the display device 20 through the communication unit 16 to correspondingly control the movement of the cursor 23 on the screen of the display device 20.
Fig. 8 is only used to describe a cursor positioning method of the handheld pointing device 10, and fig. 8 is not intended to limit the present invention. One skilled in the art can select a determination method for determining to update the first tilt angle to the second tilt angle according to actual operation requirements, for example, determine whether the handheld pointing device 10 is in a moving state according to a displacement change, a speed change, or an acceleration change of the reference point in the continuous image frames, or a movement change of the pointing coordinate calculated according to the imaging position of the reference point in the multiple continuous image frames, or an acceleration vector generated by the handheld pointing device 10 in multiple axial acceleration values. The cursor positioning correction calculation method in the above embodiment may be executed in step S815 to correct the pointing coordinate corresponding to the positioning of the cursor 23.
[ still another embodiment of a cursor positioning method of a handheld pointing device ]
From the above embodiments, the present invention can also be summarized as a cursor positioning method, which can be applied to the handheld pointing device in the interactive system of the above embodiments. Referring to fig. 9 and fig. 1 and 2 at the same time, fig. 9 is a schematic flow chart illustrating a method for positioning a cursor of a handheld pointing device according to another embodiment of the present invention. The cursor positioning method of fig. 9 can be implemented by firmware programming and executed by the processing unit 13 of the handheld pointing device 10.
In step S901, the processing unit 13 of the handheld pointing device 10 updates the currently used first tilt angle to the second tilt angle at the first time. Specifically, the processing unit 13 drives the acceleration unit 12 to sense a plurality of acceleration values of the handheld pointing device 10 in multiple axes (e.g., X-axis, Y-axis, and Z-axis) and generate acceleration vectors accordingly. The processing unit 13 calculates an included angle between the acceleration vector of the handheld pointing device 10 and each axial direction by using the above equations (1) to (3) according to the acceleration vector, so as to obtain the current tilt angle of the handheld pointing device 10.
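The text refers to equations (1) to (3) for the angle between the acceleration vector and each axis; those equations are not reproduced in this excerpt, so the sketch below assumes the standard angle-between-vector-and-axis formulation, which is one common way such tilt angles are derived from a 3-axis accelerometer sample.

```python
import math

def axis_angles(ax, ay, az):
    """Angle (in degrees) between the acceleration vector and each of
    the X, Y, and Z axes, from a 3-axis accelerometer reading."""
    g = math.sqrt(ax * ax + ay * ay + az * az)  # magnitude of the vector
    return tuple(math.degrees(math.acos(v / g)) for v in (ax, ay, az))
```

With gravity entirely along the Z axis, a reading of (0, 0, 1) yields angles of 90, 90, and 0 degrees.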
During the first time, the processing unit 13 of the handheld pointing device 10 also drives the image capturing unit 11 to capture a first image frame corresponding to the reference point 21.
In step S903, in the first time period, the processing unit 13 of the handheld pointing device 10 calculates a first pointing coordinate and a second pointing coordinate corresponding to an imaging position of the reference point 21 in the first image frame by using the first tilt angle and the second tilt angle, respectively.
Meanwhile, the processing unit 13 calculates the cursor location according to the first pointing coordinate to correspondingly generate the cursor parameter for controlling the cursor 23 to be located at the position of the display device 20. The processing unit 13 then uses the communication unit 16 to wirelessly transmit the cursor parameter corresponding to the first time of the cursor 23 to the display device 20, so that the cursor 23 is fixed at the first pointing coordinate. The detailed calculation and positioning manner of the cursor 23 and the cursor position control manner are the same as those of the previous embodiments, and therefore are not described again.
In step S905, the processing unit 13 calculates a first offset vector between the first directional coordinate and the second directional coordinate.
Then, in step S907, the processing unit 13 generates a unit compensation vector from the calculated first offset vector. In one embodiment, the processing unit 13 may decide, based on the first offset vector or the angle difference between the first tilt angle and the second tilt angle as described in the previous embodiment, whether to calculate the unit compensation vector from a fixed correction count or from a fixed correction compensation amount. If the pointing coordinate generated by the handheld pointing device 10 is corrected over a fixed number of steps, the processing unit 13 may divide the first offset vector by the correction count or the correction time to generate the unit compensation vector. If the pointing coordinate is corrected with a fixed correction compensation amount, the processing unit 13 can set the unit compensation vector from that compensation amount; the processing unit 13 then divides the first offset vector by the unit compensation vector to obtain the required correction count.
It should be noted that the fixed calibration times or calibration time may be set by the processing unit 13 according to the image sampling frequency or the preset time. In other embodiments, the fixed correction times or correction time and the fixed correction compensation amount may also be set by the processing unit 13 according to the type of software program executed by the display device 20, such as a game software program. The detailed setting method has been described in the foregoing embodiments, and a person skilled in the art of the present invention can deduce the setting method from the above description, so that the detailed description is omitted.
Next, in step S909, at the second time, the processing unit 13 captures a second image frame corresponding to the reference point and calculates a third pointing coordinate according to the imaging position of the reference point in the second image frame and the second tilt angle. The second time is after the first time; that is, the capturing time of the second image frame is later than that of the first image frame.
In step S911, the processing unit 13 starts a cursor calibration procedure at a second time to correspondingly calculate the cursor location according to the third pointing coordinate and the unit compensation vector. In detail, during the second time, the processing unit 13 may correct the third pointing coordinate by the cursor positioning correction calculation method shown in fig. 6.
In step S913, the processing unit 13 calculates the display position of the cursor 23 on the display device 20 at the second time. In detail, the handheld pointing device 10 correspondingly generates the cursor parameter for controlling the cursor 23 to be located on the display device 20 according to the result of the calibration calculation of the third pointing coordinate. The processing unit 13 wirelessly transmits the cursor parameter to the display device 20 through the communication unit 16 to control the display position of the cursor 23 on the screen of the display device 20 at the second time.
In step S915, the processing unit 13 captures a third image frame corresponding to the reference point at a third time, and calculates a fourth pointing coordinate of the imaging position of the third image frame corresponding to the reference point by using the second tilt angle. The third time is after the second time, that is, the capturing time of the third frame is later than the capturing time of the second frame. In addition, the time length between the second time and the third time can be configured according to the correction times or the correction time.
In step S917, the processing unit 13 calculates a display position of the cursor on the display device 20 at the third time according to the fourth pointing coordinate. Then, in step S919, the processing unit 13 correspondingly generates a cursor parameter for controlling the cursor 23 to be located on the display device 20. The processing unit 13 transmits the cursor parameter to the display device 20 through the communication unit 16 in a wireless transmission manner, so as to correspondingly control the display position of the cursor 23 on the screen of the display device 20 at the third time.
It should be noted that, during the second time, the processing unit 13 may also determine, as described in the foregoing embodiments, whether to correct the pointing coordinate calculated with the second tilt angle according to the offset vector or the angle difference between the first tilt angle and the second tilt angle. If the offset vector between the first pointing coordinate and the second pointing coordinate is smaller than the first preset offset threshold (e.g., 5 pixels), or the angle difference between the first tilt angle and the second tilt angle is smaller than the preset angle (e.g., 20 degrees), the processing unit 13 may directly calculate the cursor position from the third pointing coordinate without performing the cursor calibration procedure, and correspondingly generate the cursor parameter for controlling the cursor 23 on the display device 20.
In addition, in the embodiment, the processing unit 13 can record the calculated first tilt angle, second tilt angle, first pointing coordinate, second pointing coordinate, third pointing coordinate, first offset vector and unit compensation vector in the storage unit 15 respectively. One skilled in the art of the present invention can also determine whether to update the first tilt angle to the second tilt angle in the processing unit 13 by using a firmware design method in the first time period according to the actual operation requirement. That is, the processing unit 13 can determine whether to update the tilt angle of the handheld pointing device 10 by determining whether the reference point 21 or the pointing coordinate corresponding to the reference point moves significantly to detect whether the handheld pointing device 10 is currently moving or in a static state.
It should be noted that FIG. 9 is only used to describe a cursor positioning method of the handheld pointing device 10 and is not intended to limit the present invention.
In addition, the present invention can also be implemented as a computer readable recording medium storing computer program code corresponding to the cursor positioning methods of FIGS. 3, 8 and 9 and the cursor positioning correction method of FIG. 6, for executing the above steps when read by a processor. The computer readable medium can be a floppy disk, a hard disk, a compact disc, a flash drive, a magnetic tape, a database accessible via a network, or a storage medium with similar functionality familiar to those skilled in the art.
[Possible Effects of the Embodiment]
In summary, embodiments of the present invention provide a handheld pointing device and a cursor positioning method thereof, suitable for controlling the movement of a cursor on a display device. After the handheld pointing device updates its tilt angle, the cursor positioning method actively corrects the pointing coordinate calculated with the updated tilt angle, so that within a preset correction time or number of corrections, the cursor moves from the pointing coordinate calculated with the pre-update tilt angle to the position at which the handheld pointing device actually points. This avoids cursor jumping and improves the convenience and stability of user operation.
The cursor positioning method can also determine whether the pointing coordinate calculated with the updated tilt angle needs correction and compensation according to the accuracy required by the type of software program currently executed on the display device and the resolution of the display device, thereby increasing the practicality and range of applications of the handheld pointing device.
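The gradual correction summarized above can be sketched as stepping through a unit compensation vector over a predetermined number of corrections, so the output converges on the new coordinate instead of jumping. This is an assumed interpretation for illustration; the function name and step scheme are not taken from the patent.

```python
# Hypothetical sketch: split the offset between the pre-update and
# post-update pointing coordinates into equal unit steps, and emit a
# sequence of outputs that converges on the new coordinate.

def correction_sequence(third_coord, offset_vector, corrections):
    """Yield `corrections` output coordinates: each step reduces the
    remaining displacement by one unit compensation vector, ending
    exactly at the coordinate calculated with the updated tilt angle."""
    ux = offset_vector[0] / corrections   # unit compensation vector (x)
    uy = offset_vector[1] / corrections   # unit compensation vector (y)
    outputs = []
    for step in range(1, corrections + 1):
        remaining = corrections - step
        outputs.append((third_coord[0] + ux * remaining,
                        third_coord[1] + uy * remaining))
    return outputs
```

With `third_coord = (100, 100)`, an offset of `(10, 0)` and 5 corrections, the x outputs step down 108, 106, 104, 102, 100, so the cursor glides to the corrected position rather than jumping by the full 10 pixels at once.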
The above description is only an example of the present invention, and is not intended to limit the scope of the present invention.

Claims (15)

1. A pointing method for a handheld device, the method comprising:
obtaining a first image frame position of a reference point;
calculating a first pointing coordinate according to the first image frame position and a first tilt angle representing a tilt angle of the handheld device;
calculating a second pointing coordinate according to the first image frame position and a second tilt angle generated according to movement of the first image frame position of the reference point;
calculating a first offset vector between the first pointing coordinate and the second pointing coordinate, and generating a unit compensation vector;
obtaining a second image frame position of the reference point;
calculating a third pointing coordinate according to the second image frame position and the second tilt angle; and
outputting a pointing coordinate according to the third pointing coordinate and the unit compensation vector.
2. The pointing method of claim 1, wherein the unit compensation vector is calculated according to the first offset vector and a predetermined number of corrections.
3. The pointing method of claim 2, wherein the first offset vector represents an angle difference between the first tilt angle and the second tilt angle.
4. The pointing method of claim 3, wherein the angle difference is reduced by performing the pointing method multiple times.
5. The pointing method of claim 4, wherein the pointing coordinate is outputted when the angle difference is smaller than a predetermined angle.
6. The pointing method of claim 1, wherein the first image frame position and the second image frame position of the reference point are formed in image frames captured by an image sensor of the handheld device.
7. The pointing method of claim 6, wherein the second pointing coordinate is calculated when the tilt angle of the handheld device is updated from the first tilt angle to the second tilt angle.
8. The pointing method of claim 7, wherein whether the tilt angle of the handheld device is updated from the first tilt angle to the second tilt angle is determined according to whether the reference point substantially moves.
9. The pointing method of claim 7, wherein whether the reference point substantially moves is determined by determining whether the image position of the reference point formed in the image frames captured by the handheld device moves.
10. The pointing method of claim 7, wherein whether to update the tilt angle of the handheld device from the first tilt angle to the second tilt angle is determined according to whether an offset of the reference point, calculated between its image frame positions in any two consecutive image frames, is smaller than a predetermined offset threshold.
11. The pointing method of claim 7, wherein whether to update the tilt angle of the handheld device from the first tilt angle to the second tilt angle is determined according to whether a velocity of the reference point, calculated between its image frame positions in any two consecutive image frames, is smaller than a predetermined velocity threshold.
12. The pointing method of claim 7, wherein whether to update the tilt angle of the handheld device from the first tilt angle to the second tilt angle is determined according to whether an acceleration of the reference point, calculated between its image frame positions in any two consecutive image frames, is equal to the gravitational acceleration of the handheld device; wherein the acceleration vector of the handheld device is generated according to accelerations sensed along multiple axes of the handheld device.
13. A handheld device, comprising:
an image capturing unit for capturing a plurality of image frames containing a reference point; and
a processing unit, coupled to the image capturing unit, for executing a positioning method comprising:
obtaining a first image frame position of the reference point;
calculating a first pointing coordinate according to the first image frame position and a first tilt angle representing a rotation angle of the handheld device;
calculating a second pointing coordinate according to the first image frame position and a second tilt angle generated according to movement of the first image frame position of the reference point;
calculating a first offset vector according to the first pointing coordinate and the second pointing coordinate, and generating an offset unit compensation vector;
obtaining a second image frame position of the reference point;
calculating a third pointing coordinate according to the second image frame position and the second tilt angle; and
outputting a pointing coordinate according to the third pointing coordinate and the offset unit compensation vector.
14. The handheld device of claim 13, wherein the processing unit drives the image capturing unit to capture a first image frame containing the reference point, and the first pointing coordinate and the second pointing coordinate are calculated from the first image frame according to the first tilt angle and the second tilt angle, respectively.
15. The handheld device of claim 13, wherein the processing unit calculates the offset unit compensation vector according to the first offset vector and a predetermined number of corrections.
CN201711124233.2A 2013-12-18 2013-12-18 Handheld device and positioning method thereof Active CN107754310B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711124233.2A CN107754310B (en) 2013-12-18 2013-12-18 Handheld device and positioning method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310699986.1A CN104731373B (en) 2013-12-18 2013-12-18 Hand-held indicator device and its cursor positioning method
CN201711124233.2A CN107754310B (en) 2013-12-18 2013-12-18 Handheld device and positioning method thereof

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201310699986.1A Division CN104731373B (en) 2013-12-18 2013-12-18 Hand-held indicator device and its cursor positioning method

Publications (2)

Publication Number Publication Date
CN107754310A CN107754310A (en) 2018-03-06
CN107754310B true CN107754310B (en) 2020-09-15

Family

ID=53455340

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201310699986.1A Expired - Fee Related CN104731373B (en) 2013-12-18 2013-12-18 Hand-held indicator device and its cursor positioning method
CN201711124233.2A Active CN107754310B (en) 2013-12-18 2013-12-18 Handheld device and positioning method thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201310699986.1A Expired - Fee Related CN104731373B (en) 2013-12-18 2013-12-18 Hand-held indicator device and its cursor positioning method

Country Status (1)

Country Link
CN (2) CN104731373B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106920225A (en) * 2015-12-24 2017-07-04 宝成工业股份有限公司 The position finding and detection method of lasting pincers
CN107426935B (en) * 2017-09-11 2023-05-16 北京小米移动软件有限公司 Frame structure, electronic equipment shell and electronic equipment
CN109683775B (en) * 2018-12-12 2021-07-06 歌尔科技有限公司 Projection-based interaction method, projection equipment and storage medium
CN110044309B (en) * 2019-04-08 2021-07-16 天津字节跳动科技有限公司 Measuring method and device
CN112214117A (en) * 2019-07-10 2021-01-12 周海涛 Image processing method and chip of air mouse, air mouse and system
CN112148139B (en) * 2020-09-28 2022-07-26 联想(北京)有限公司 Gesture recognition method and computer readable storage medium
CN112799576A (en) * 2021-02-22 2021-05-14 Vidaa美国公司 Virtual mouse moving method and display device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200638975A (en) * 2005-05-10 2006-11-16 Pixart Imaging Inc Orientation point orientating method of orientation device and device thereof
CN1973316A (en) * 2004-04-30 2007-05-30 希尔克瑞斯特实验室公司 Free space pointing devices with tilt compensation and improved usability
TW200829007A (en) * 2006-12-28 2008-07-01 Pixart Imaging Inc Cursor controlling method and apparatus using the same
CN101482782A (en) * 2009-02-06 2009-07-15 袁鸿军 Cursor positioning system and method
CN101881617A (en) * 2009-05-06 2010-11-10 鼎亿数码科技(上海)有限公司 Gyro space-location method
WO2013019071A1 (en) * 2011-08-01 2013-02-07 Lg Innotek Co., Ltd. The method for generating pointer movement value and pointing device using the same
TW201342130A (en) * 2012-04-06 2013-10-16 Pixart Imaging Inc Image positioning method and interactive imaging system using the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101398721A (en) * 2007-09-26 2009-04-01 昆盈企业股份有限公司 Control method for moving speed of cursor of air mouse
KR101893601B1 (en) * 2011-07-22 2018-08-31 삼성전자 주식회사 Input apparatus of display apparatus, display system and control method thereof
TW201305854A (en) * 2011-07-26 2013-02-01 Chip Goal Electronics Corp Remote controllable image display system, controller, and processing method therefor

Also Published As

Publication number Publication date
CN107754310A (en) 2018-03-06
CN104731373B (en) 2017-12-15
CN104731373A (en) 2015-06-24

Similar Documents

Publication Publication Date Title
CN107754310B (en) Handheld device and positioning method thereof
US9250799B2 (en) Control method for information input device, information input device, program therefor, and information storage medium therefor
US8957909B2 (en) System and method for compensating for drift in a display of a user interface state
US20180234673A1 (en) Online compensation of thermal distortions in a stereo depth camera
JP5463790B2 (en) Operation input system, control device, handheld device, and operation input method
EP1764141A1 (en) Video game program and video game system
US10379627B2 (en) Handheld device and positioning method thereof
US9838573B2 (en) Method for guiding controller to move to within recognizable range of multimedia apparatus, the multimedia apparatus, and target tracking apparatus thereof
KR20210010437A (en) Power management for optical positioning devices
JP6204686B2 (en) Information processing program, information processing system, information processing apparatus, and information processing execution method
US10067576B2 (en) Handheld pointer device and tilt angle adjustment method thereof
TWI522848B (en) Pointer device and pointer positioning method thereof
US20180059811A1 (en) Display control device, display control method, and recording medium
EP3032380A1 (en) Image projection apparatus, and system employing interactive input-output capability
US20070103463A1 (en) Simulation apparatus
JP5973788B2 (en) Information processing program, information processing apparatus, information processing system, and information processing method
US20140132787A1 (en) Motion Detection Device and Motion Detection Method Having Rotation Calibration Function
JP6500159B1 (en) Program, information processing apparatus, information processing system, information processing method, and head mounted display
KR20050063469A (en) Three dimensional input device using geo-magnetic sensor and data processing method therefor
JP6185301B2 (en) Information processing program, information processing apparatus, information processing system, and method for calculating indicated position
KR101547512B1 (en) Method and system for performing fine pointing by using display pattern
JP6533946B1 (en) Program, information processing apparatus, information processing system, information processing method and head mounted display
TW201445359A (en) Handheld pointer device and tilt angle adjustment method thereof
JP2014021001A (en) Position measurement device, position measurement method, and position measurement program
CN114757248A (en) Method for using mobile phone as motion sensing sensor, storage medium and mobile phone

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant