CN107728881B - Cursor positioning method of handheld pointing device - Google Patents


Info

Publication number
CN107728881B
CN107728881B (application CN201710954368.5A)
Authority
CN
China
Prior art keywords
coordinate
cursor
pointing
time
handheld
Prior art date
Legal status
Active
Application number
CN201710954368.5A
Other languages
Chinese (zh)
Other versions
CN107728881A (en)
Inventor
程瀚平
黄昭荐
梁家钧
Current Assignee
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN201710954368.5A
Publication of CN107728881A
Application granted
Publication of CN107728881B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03541 Mouse/trackball convertible devices, in which the same ball is used to track the 2D relative movement
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

The invention provides a cursor positioning method for a handheld pointing device, comprising the following steps: generating a first pointing coordinate at a first time and fixing the display position of the cursor at the first pointing coordinate; generating a second pointing coordinate at a second time, the second time being after the first time; calculating a first movement vector between the first pointing coordinate and the second pointing coordinate; generating a unit compensation vector according to the first movement vector; generating a third pointing coordinate at a third time, the third time being after the second time; calculating a second movement vector between the second pointing coordinate and the third pointing coordinate; and calculating the display position of the cursor at the third time according to the first pointing coordinate, the unit compensation vector and the second movement vector. The invention improves the convenience and stability of operation for the user.

Description

Cursor positioning method of handheld pointing device
RELATED APPLICATIONS
The present application is a divisional application of Chinese patent application No. 201310309281.4, filed on July 22, 2013.
Technical Field
The present invention relates to a positioning method of a pointing device, and more particularly, to a cursor positioning method of a handheld pointing device.
Background
A conventional handheld remote control device calculates its pointing coordinate by analyzing, within a built-in image sensing area, the position of a reference light source arranged near a display device. The handheld remote control device transmits the calculated pointing coordinate to a host, so that the host can correspondingly control an object displayed on the display device, such as a cursor, to perform related software operations.
When using the handheld remote control device, the user usually holds it in mid-air to control a cursor displayed on the screen of the display device and thereby perform related software operations. However, when the device is held in the air, hand shake or vibration inevitably moves it, so that the position of the cursor displayed on the display device becomes unstable and the user's operating feel suffers.
Conventional handheld remote control devices therefore generally provide a lock-point positioning function: by detecting whether the user is holding the cursor steady, or through a lock-point key, the cursor can be fixed at its current display position on the display device. However, when the lock-point positioning mode is released, the device reverts to an absolute positioning mode and moves the cursor according to the currently calculated position of the reference point in the image sensing area. The cursor may therefore suddenly jump on the screen of the display device, which degrades the operating feel and inconveniences the user.
Disclosure of Invention
In view of this, an embodiment of the present invention provides a cursor positioning method for a handheld pointing device that actively switches the cursor positioning mode between an absolute positioning mode and a relative positioning mode when the handheld pointing device leaves a lock-point positioning mode, so as to avoid cursor jumping and improve the user's operating feel.
An embodiment of the invention provides a cursor positioning method for a handheld pointing device. First, a first digital image of a reference point is captured, and a first pointing coordinate is calculated according to the imaging position of the reference point in the first digital image. Next, a cursor parameter for controlling a cursor on a display device is generated according to the first pointing coordinate. Then, when the handheld pointing device enters a lock-point positioning mode, it records the first pointing coordinate and fixes the cursor at the first pointing coordinate, either by repeatedly outputting a fixed cursor parameter or by outputting no cursor parameter. When the handheld pointing device leaves the lock-point positioning mode, a second digital image of the reference point is captured, and a second pointing coordinate is calculated according to the imaging position of the reference point in the second digital image, so as to generate a movement vector between the first pointing coordinate and the second pointing coordinate. The handheld pointing device then calculates the cursor position according to the movement vector and the first pointing coordinate, so as to compensate the offset between the first and second pointing coordinates, and generates a cursor parameter for controlling the cursor movement according to the calculation result.
An embodiment of the invention further provides a cursor positioning method for a handheld pointing device, comprising the following steps. First, a first digital image of a reference point is captured, and a first pointing coordinate is calculated according to the imaging position of the reference point in the first digital image. Next, a cursor parameter for controlling a cursor on a display device is generated according to the first pointing coordinate. Then, when the handheld pointing device enters a lock-point positioning mode, it records the first pointing coordinate and fixes the cursor on the display device, either by repeatedly outputting a fixed cursor parameter or by outputting no cursor parameter. When the handheld pointing device leaves the lock-point positioning mode and subsequently moves, the cursor position is calculated using the first pointing coordinate as a starting point together with the pointing coordinate displacement generated by the movement of the device. Cursor parameters for controlling the cursor movement are then generated according to the calculation result.
An embodiment of the invention further provides a cursor positioning method for a handheld pointing device, comprising the following steps. First, a first pointing coordinate is generated at a first time, and the display position of the cursor is fixed at the first pointing coordinate. Next, a second pointing coordinate is generated at a second time, the second time being after the first time. A first movement vector between the first pointing coordinate and the second pointing coordinate is then calculated, and a unit compensation vector is generated according to the first movement vector. A third pointing coordinate is generated at a third time, the third time being after the second time, and a second movement vector between the second pointing coordinate and the third pointing coordinate is calculated. Finally, the display position of the cursor at the third time is calculated according to the first pointing coordinate, the unit compensation vector and the second movement vector.
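The three-step calculation above can be sketched as follows. This is only an illustrative reading of the claim language: the function names, the equal-split choice for the unit compensation vector, and the sample numbers are assumptions, not from the patent text.

```python
def unit_compensation_vector(p1, p2, n_steps):
    """Illustrative: split the gap between the locked coordinate p1 and the
    first post-unlock pointing coordinate p2 into n_steps equal parts."""
    return ((p2[0] - p1[0]) / n_steps, (p2[1] - p1[1]) / n_steps)

def cursor_at_third_time(p1, unit_comp, move_vec):
    """Cursor display position at the third time: the locked coordinate p1
    plus one unit of compensation plus the second movement vector
    (third pointing coordinate minus second pointing coordinate)."""
    return (p1[0] + unit_comp[0] + move_vec[0],
            p1[1] + unit_comp[1] + move_vec[1])

# Example: locked at (100, 100); first coordinate after unlock is (140, 120),
# compensated over 4 frames; third coordinate is (150, 130).
p1, p2, p3 = (100, 100), (140, 120), (150, 130)
uc = unit_compensation_vector(p1, p2, 4)   # (10.0, 5.0)
mv = (p3[0] - p2[0], p3[1] - p2[1])        # (10, 10)
print(cursor_at_third_time(p1, uc, mv))    # (120.0, 115.0)
```

The cursor thus advances part of the way toward the absolute coordinate each frame instead of jumping there at once.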
An embodiment of the invention further provides a cursor positioning method for a handheld pointing device, comprising the following steps. First, a first pointing coordinate is generated at a first time to enter a lock-point positioning mode, and the display position of a cursor is fixed at the first pointing coordinate. Next, a second pointing coordinate is generated at a second time to leave the lock-point positioning mode, the second time being after the first time. A first movement vector between the first pointing coordinate and the second pointing coordinate is then calculated, and a unit compensation vector is generated according to the first movement vector. During subsequent movement after the second time, the cursor position is calculated using the first pointing coordinate as a starting point together with the unit compensation vector and the pointing coordinate displacement generated by the movement of the handheld pointing device.
In summary, the present invention provides a cursor positioning method for a handheld pointing device that calculates the movement of the device by detecting the imaging position of a reference point in a digital image, and determines whether the device enters or leaves a lock-point positioning mode. When the handheld pointing device leaves the lock-point positioning mode, the method corrects the pointing coordinate of the reference point in a relative positioning mode while calculating the cursor position. Once the pointing coordinate calibration procedure is completed, the handheld pointing device automatically switches back to the absolute positioning mode, thereby avoiding cursor jumping and improving the convenience and stability of operation for the user.
For a better understanding of the nature and technical content of the present invention, reference should be made to the following detailed description and the accompanying drawings, which are provided for illustration only and are not intended to limit the scope of the invention as defined by the appended claims.
Drawings
Fig. 1 is a functional block diagram of a handheld pointing device according to an embodiment of the present invention.
Fig. 2 is a schematic view illustrating a handheld pointing device applied to an interactive system according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating a cursor positioning method of a handheld pointing device according to an embodiment of the present invention.
Fig. 4A to 4C are schematic diagrams illustrating a position change of a reference point sensed when the handheld pointing device provided by the embodiment of the invention moves and a position change of a cursor on a display device.
Fig. 5 is a flowchart illustrating a method for positioning a cursor according to a relative mode of a handheld pointing device according to an embodiment of the present invention.
Fig. 6A is a schematic diagram of a position change of a reference point sensed when a handheld pointing device provided by an embodiment of the present invention moves and a position change of a cursor on a display device.
Fig. 6B is a schematic diagram of a position change of a reference point sensed when the handheld pointing device provided by the embodiment of the invention moves.
Fig. 6C and 6D are schematic diagrams of a position change of a reference point sensed when the handheld pointing device provided by the embodiment of the invention moves and a position change of a cursor on a display device screen.
Fig. 6E is a schematic diagram of a position change of a cursor on a display device screen according to an embodiment of the present invention.
Fig. 7 is a flowchart illustrating a method for positioning a cursor of a pointing device according to another embodiment of the present invention.
Fig. 8 is a flowchart illustrating a method for entering a lock point positioning mode by a handheld pointing device according to an embodiment of the present invention.
Fig. 9A and 9B are schematic diagrams respectively illustrating a change in the position movement of a sensing reference point when determining whether the handheld pointing device enters the lock point positioning mode according to an embodiment of the present invention.
Fig. 10 is a flowchart illustrating a boundary correction method according to an embodiment of the present invention.
Fig. 11A to 11D are schematic diagrams illustrating a position change of a reference point sensed when a handheld pointing device according to an embodiment of the present invention moves and a position change of a cursor on a display device.
Fig. 12 is a flowchart illustrating a method for positioning a cursor of a pointing device according to another embodiment of the present invention.
Fig. 13 is a flowchart illustrating a method for positioning a cursor of a pointing device according to another embodiment of the present invention.
Wherein the reference numerals are as follows:
10: hand-held pointing device
11: image extraction unit
12: processing unit
13: communication unit
14: input unit
15: memory cell
16: key unit
20: display device
21: reference point
23. 33: cursor
d. d 1: displacement of cursor
201: the first display side edge
203: the second display side edge
205: third display side
207: the fourth display side edge
F1, F2, F3, F4, F5, F6: digital image
111, 111a, 111b, 111c, 111d, 111e: Operating range
1111, 1111a to 1111c: Center point of operating range
113, 113', 113a, 113b, 113c: Reference point image
115, 117: Reference point image
(vector symbol, figure omitted): Movement vector of the pointing coordinate
(vector symbol, figure omitted): Pointing coordinate displacement vector
+: center point
Y_MAX: First-axis upper limit value
Y_MIN: First-axis lower limit value
X_MAX: Second-axis upper limit value
X_MIN: Second-axis lower limit value
DE1: First axial direction
DE2: Second axial direction
(x1, y1), (x1', y1'), (x2, y2), (x3, y3): Pointing coordinates
(xN, yN), (xN-1, yN-1): Pointing coordinates
(x1, yA), (x1, yB), (xC, y1), (xD, y1): Pointing coordinate positions
(symbol, figure omitted): Pointing coordinate
(symbol, figure omitted): Relative pointing coordinate
S301 to S319: flow of steps
S501 to S517: Flow of steps
S701 to S715: flow of steps
S801 to S811: flow of steps
S1001 to S1017: flow of steps
S1201 to S1219: flow of steps
S1301 to S1311: Flow of steps
Detailed Description
Hereinafter, the present invention will be described in detail through exemplary embodiments with reference to the accompanying drawings. The inventive concept may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. In the drawings, like reference numerals designate like elements.
[ embodiment of a hand-held pointing device ]
The handheld pointing device can be used to position a pointing point on a display device. Referring to fig. 1, fig. 1 is a functional block diagram of a handheld pointing device according to an embodiment of the present invention. The handheld pointing device 10 captures an image of at least one reference point, calculates its pointing coordinate according to the imaging position of the reference point in the captured image, and calculates relative movement information between the handheld pointing device 10 and the reference point (not shown) from the change of the pointing coordinate over consecutive times. The handheld pointing device 10 wirelessly transmits the corresponding pointing coordinate position or movement information to a display device 20, which uses it to display the execution state of a software program.
The pointing coordinate is the coordinate position, in the coordinate system defined by the display device 20, of the pointing point at which the handheld pointing device 10 points on the display device 20. Generally, the pointing point is the position at which an optical axis of the handheld pointing device 10 is aimed on the display device 20, or the position on the display device 20 corresponding to the center of the image frame captured by an image extraction unit of the handheld pointing device 10. Since calculating the pointing coordinate of the handheld pointing device 10 from the relative position of the reference point and the display device 20 and from the imaging position of the reference point on the sensor is prior art, it is not further described herein.
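The mapping from the reference point's imaging position to a screen coordinate is device-specific and not spelled out in the text; the following linear, axis-inverted mapping is purely an illustrative model.

```python
def pointing_coordinate(ref_img_pos, img_size, screen_size):
    """Map the reference point's imaging position in the sensor frame to a
    screen coordinate. The axes are inverted because moving the device
    right/up shifts the reference point left/down in the sensed image
    (illustrative assumption; real devices need calibration)."""
    ix, iy = ref_img_pos
    iw, ih = img_size
    sw, sh = screen_size
    return ((iw - ix) * sw / iw, (ih - iy) * sh / ih)

# A reference point imaged at the sensor centre maps to the screen centre.
print(pointing_coordinate((320, 240), (640, 480), (1920, 1080)))  # (960.0, 540.0)
```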
In brief, when the handheld pointing device 10 operates in the absolute positioning mode, it first captures a first digital image of the reference point position and calculates a first pointing coordinate of the handheld pointing device 10 relative to the display device 20 according to the imaging position of the reference point in the first digital image. The handheld pointing device 10 generates cursor parameters for controlling the cursor on the display device 20 according to the first pointing coordinate. When the handheld pointing device 10 enters the lock-point positioning mode, it records the first pointing coordinate at the moment of locking and fixes a cursor (not shown) on the display device 20 at the display position corresponding to that coordinate.
In this embodiment, when the handheld pointing device 10 leaves the lock-point positioning mode, it first enters a relative positioning mode. Specifically, upon entering the relative positioning mode, the handheld pointing device 10 captures a second digital image of the reference point and calculates a second pointing coordinate of the handheld pointing device 10 relative to the display device 20 according to the imaging position of the reference point in the second digital image. The handheld pointing device 10 calculates a movement vector of the reference point from the recorded first pointing coordinate and the calculated second pointing coordinate. It then calculates the cursor position according to the movement vector and the recorded first pointing coordinate, so as to compensate the offset between the first and second pointing coordinates, and generates cursor parameters for controlling the cursor movement according to the calculation result.
Further, the handheld pointing device 10 performs cursor positioning calibration within a predetermined calibration time or number of calibration steps, so that the cursor moves from the position corresponding to the recorded first pointing coordinate to the position corresponding to the current pointing coordinate. When the calibration is complete, the handheld pointing device 10 leaves the relative positioning mode and re-enters the absolute positioning mode until it next enters the lock-point positioning mode.
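The gradual hand-back from relative positioning to absolute positioning might be sketched as below. The frame-by-frame equal compensation strategy and all names are assumptions for illustration; the patent only requires that the offset be compensated within the calibration window.

```python
def relative_mode_cursor(p_locked, coords, n_corr):
    """Cursor positions for the frames after leaving lock-point mode.
    coords[0] is the first pointing coordinate after unlock; the offset
    p_locked -> coords[0] is paid back in n_corr equal steps, after which
    the cursor coincides with the absolute pointing coordinate."""
    p2 = coords[0]
    unit = ((p2[0] - p_locked[0]) / n_corr, (p2[1] - p_locked[1]) / n_corr)
    out = []
    for k, p in enumerate(coords, start=1):
        step = min(k, n_corr)                 # stop compensating after n_corr frames
        move = (p[0] - p2[0], p[1] - p2[1])   # displacement since unlock
        out.append((p_locked[0] + step * unit[0] + move[0],
                    p_locked[1] + step * unit[1] + move[1]))
    return out

# After 4 correction frames the cursor equals the absolute coordinate again,
# so switching back to absolute mode causes no jump.
path = relative_mode_cursor((0, 0), [(8, 4), (10, 6), (12, 8), (16, 8)], 4)
print(path[-1])  # (16.0, 8.0) == the current pointing coordinate
```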
In short, in this embodiment the handheld pointing device 10 can actively switch among the absolute positioning mode, the relative positioning mode and the lock-point positioning mode, and calculates the cursor position on the screen of the display device 20 according to the current operation mode.
Further, the handheld pointing device 10 includes an image extraction unit 11, a processing unit 12, a communication unit 13, an input unit 14, a storage unit 15 and a key unit 16. The image extraction unit 11, the communication unit 13, the input unit 14, the storage unit 15 and the key unit 16 are each coupled to the processing unit 12.
The image extraction unit 11 captures images of a reference point position when the handheld pointing device 10 points toward the reference point, and sequentially generates a plurality of digital images (image frames). The reference point may be disposed near the display device 20 so that the handheld pointing device 10 can determine its pointing position, and further its moving direction and amount relative to the reference point.
Specifically, the image extraction unit 11 can filter out light other than a specific wavelength through a filter unit (not shown), so that it senses only the light of that wavelength emitted from the reference point. The image extraction unit 11 senses the light generated by the reference point at a predetermined image sampling frequency (image frame rate) to sequentially generate a plurality of digital images containing the reference point.
The processing unit 12 receives the digital images generated by the image extraction unit 11 and calculates the imaging position of the reference point in each digital image. According to the current operation mode of the handheld pointing device 10, the processing unit 12 calculates and controls the cursor position on the display device 20 in the absolute positioning mode, the relative positioning mode or the lock-point positioning mode.
The input unit 14 allows the user of the handheld pointing device 10 to set the image sampling frequency and the calibration parameters of the cursor in the relative positioning mode, such as the number of cursor-position calibration steps. For example, the user may set the image sampling frequency of the reference point according to a preset calibration time, and then determine the number of cursor calibration steps from the chosen image sampling frequency. The image sampling frequency may correspond to the frame update frequency of the display device 20; that is, it may be set according to the display device 20.
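One plausible relationship between these settings is sketched below; the exact formula is an assumption, since the text only says the calibration count can be derived from the sampling frequency and the calibration time.

```python
def calibration_count(frame_rate_hz, calib_time_s):
    """Number of image frames -- and hence cursor correction steps -- that
    fit in the chosen calibration window (assumed relationship)."""
    return max(1, round(frame_rate_hz * calib_time_s))

# e.g. a 200 Hz image sampling frequency and a 0.05 s calibration
# window give 10 correction steps.
print(calibration_count(200, 0.05))  # 10
```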
In practice, the input unit 14 may be a key interface (keypad), an optical finger navigation device or a button that activates the display device 20 to show a setting interface, through which the user sets the preset calibration time, the image sampling frequency and/or the number of cursor calibration steps. If the handheld pointing device 10 has its own display screen (not shown), these settings can be displayed on that screen, which may also be a touch screen.
The storage unit 15 may store the first pointing coordinate, the second pointing coordinate, the movement vector and the cursor parameters. It may also store the preset calibration time, the image sampling frequency and the number of cursor calibration steps according to the operational requirements of the handheld pointing device 10.
The key unit 16 allows the user to selectively press it to control the handheld pointing device 10 to enter or leave the lock-point positioning mode.
It should be noted that if the user presses the key unit 16 to put the handheld pointing device 10 into the lock-point positioning mode, the device can be configured so that it does not leave the lock-point positioning mode merely because the pointing coordinate displacement generated by its movement exceeds a predetermined value; the lock state is released only when the user operates the key unit 16 again.
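That release rule might look as follows. The Euclidean displacement metric, threshold value and names are illustrative assumptions; the patent only distinguishes key-engaged locks from motion-releasable ones.

```python
def leaves_lock_mode(prev, curr, threshold, locked_by_key):
    """Whether the device should leave lock-point mode. When the lock was
    engaged via the key unit, only the key releases it; otherwise a
    pointing coordinate displacement above the threshold releases it."""
    if locked_by_key:
        return False  # only pressing the key unit again releases the lock
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    return (dx * dx + dy * dy) ** 0.5 > threshold

# Displacement of 5 exceeds the threshold of 4 only when the lock
# was not engaged via the key unit.
print(leaves_lock_mode((0, 0), (3, 4), 4.0, False))  # True
print(leaves_lock_mode((0, 0), (3, 4), 4.0, True))   # False
```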
In practice, the key unit 16 may be disposed on the handheld pointing device 10 and may be a touch switch, a push switch or a rocker switch. For example, when the key unit 16 is a touch switch, the user can touch it once to put the handheld pointing device 10 into the lock-point positioning mode, and touch it again to leave the lock-point positioning mode.
When the handheld pointing device 10 operates in the absolute positioning mode, the processing unit 12 calculates the imaging position of the reference point in a first digital image among the digital images, and the handheld pointing device 10 generates a first pointing coordinate from that imaging position. The processing unit 12 then calculates the cursor position in the absolute positioning mode: it generates a cursor parameter for controlling the cursor on the display device 20 according to the calculated first pointing coordinate, and transmits it wirelessly to the display device 20 through the communication unit 13 so as to control the cursor position on the screen. Calculating the absolute pointing coordinate on the display screen of the display device 20 from the imaging positions of one or more reference points in the captured image is prior art and not a main improvement of the present invention, so it is not further described herein.
When the handheld pointing device 10 enters the lock point positioning mode, the processing unit 12 records the first pointing coordinate captured before the lock point and fixes the cursor on the display device 20. For example, the processing unit 12 may continuously output, through the communication unit 13, the cursor parameter corresponding to the recorded first pointing coordinate. Alternatively, the processing unit 12 may fix the cursor on the display device 20 by not outputting any cursor parameter, or by outputting a relative movement amount of zero.
When the handheld pointing device 10 leaves the lock point positioning mode and enters the relative positioning mode, the processing unit 12 immediately starts the pointing coordinate calibration procedure, so that the handheld pointing device 10 calculates the cursor positioning in the relative positioning mode. During the subsequent movement of the handheld pointing device 10 within the preset correction time or correction count, the processing unit 12 performs the positioning calculation of the cursor by using the recorded first pointing coordinate as a starting point, together with the motion vector and the pointing coordinate displacement generated by the movement of the handheld pointing device 10. The pointing coordinate displacement refers to the displacement between two pointing coordinates calculated by the processing unit 12 of the handheld pointing device 10 at consecutive times.
It should be noted that, in the present embodiment, the image extraction unit 11 may be implemented by a charge-coupled device (CCD) image sensor or a Complementary Metal Oxide Semiconductor (CMOS) image sensor, and those skilled in the art can design the image extraction unit according to actual use cases, which is not limited herein.
In the embodiment, the processing unit 12 may be implemented by program code compiled for a processor chip such as a microcontroller or an embedded controller, but the embodiment is not limited thereto. The storage unit 15 may be implemented by a volatile or non-volatile memory chip, such as a flash memory chip, a read-only memory chip, or a random access memory chip, but the embodiment is not limited thereto. The communication unit 13 may transmit the motion vector information to the display device 20 by Bluetooth transmission, but the embodiment is not limited thereto.
It should be noted that the types, physical architectures and/or embodiments of the image extraction unit 11, the processing unit 12, the communication unit 13, the input unit 14, the storage unit 15 and the key unit 16 are set according to the types, physical architectures and/or embodiments of the handheld pointing device 10, and the invention is not limited thereto.
A specific application of the handheld pointing device 10 is described below. Referring to fig. 2 and fig. 1 at the same time, fig. 2 is a schematic diagram illustrating an application of a handheld pointing device in an interactive system according to an embodiment of the present invention. The interactive system comprises a handheld pointing device 10 and a display device 20.
When the handheld pointing device 10 points toward the position of the reference point 21, the processing unit 12 drives the image extraction unit 11 to extract a plurality of digital images covering the reference point 21. The processing unit 12 then calculates the imaging position of the reference point 21 according to one of the digital images, so as to correspondingly calculate the pointing coordinate of the handheld pointing device 10 with respect to the display device 20. The processing unit 12 calculates the cursor positioning of the cursor 23 on the display device 20 according to the pointing coordinate and its operation mode. The handheld pointing device 10 wirelessly transmits the cursor parameter, generated according to the position change of the reference point 21, to the display device 20 through the communication unit 13. Accordingly, the handheld pointing device 10 can control the position at which the cursor 23 appears on the screen of the display device 20.
Incidentally, in the present embodiment, the display device 20 itself has a software and hardware structure capable of executing and displaying software programs. In practice, the interactive system may further include a host (not shown) according to practical application requirements, such as a game host or a computer host. The host can be used for reading and executing software programs, such as game software, for example, light gun games, baseball games, tennis games, and the like. The host computer can also display the execution state of the software program on the display device 20 for the user to browse and control.
It should be noted that, in the present embodiment, the handheld pointing device 10 may be, for example, a remote controller; the display device 20 may be, for example but not limited to, a projection display device, a game machine display screen, a television screen, or a computer display screen; and the reference point 21 may be implemented by a plurality of light emitting diodes with specific wavelengths, such as infrared light emitting diodes (IR LEDs), laser diodes, or ultraviolet light emitting diodes, arranged as reference points with various shapes.
In addition, only one reference point is used in the embodiment, but those skilled in the art can set the number of the reference points 21 according to design requirements, for example, 1, 2 or more. That is, fig. 2 is only used to illustrate the operation of the handheld pointing device 10, and is not intended to limit the present invention.
The present embodiment further provides a cursor positioning method for the handheld pointing device 10, so as to describe the operation of the handheld pointing device 10 more specifically. Please refer to fig. 3 and fig. 1, fig. 2, and fig. 4A to fig. 4C. Fig. 3 is a flowchart illustrating a cursor positioning method of the handheld pointing device 10 according to an embodiment of the present invention. Fig. 4A to 4C are schematic diagrams respectively illustrating a position change of a reference point sensed when the handheld pointing device provided by the embodiment of the invention moves and a position change of a cursor on a display device.
First, in step S301, the image extracting unit 11 extracts an image corresponding to the position of the reference point 21 when the handheld pointing device 10 points to the reference point 21, and generates a first digital image F1.
Next, in step S303, the processing unit 12 calculates, according to the first digital image F1, the imaging position of the reference point image 113 corresponding to the reference point 21 in the first digital image F1, and correspondingly generates a first pointing coordinate P1 of the handheld pointing device 10 with respect to the display device 20. The first pointing coordinate P1 indicates the pointing position vector of the handheld pointing device 10 with respect to the display device 20 in the extracted first digital image F1, and the first pointing coordinate P1 is (x1, y1).
In particular, the first pointing coordinate P1 is calculated as follows. First, the processing unit 12 defines, according to the center point "+" of the first digital image F1 and the imaging position of the reference point image 113 in the first digital image F1, the operation range 111 that is mapped to the display device 20 in the first digital image F1. The operation range 111 corresponds to the display device 20 at a predetermined display ratio, and is defined by the processing unit 12 in the first digital image F1 according to the predetermined display ratio, using the reference point image 113 as the origin. The processing unit 12 may further define a center point 1111 of the operation range 111, and may then use the center point 1111 of the operation range 111 as the origin, in cooperation with the rotation angle of the handheld pointing device 10, to calculate the pointing position vector of the center point "+" of the first digital image F1 within the operation range 111, thereby obtaining the first pointing coordinate P1.

It is noted that, when obtaining the first pointing coordinate P1, the center point 1111 does not necessarily have to be defined. The corresponding rotation angle can be calculated directly from the center point "+" of the first digital image F1 and the imaging position of the reference point image 113 in the first digital image F1, or from the imaging characteristics of the reference point image 113, and the first pointing coordinate P1 obtained therefrom. In this embodiment, the center point "+" is the center of the sensing array in the image extraction unit 11. In other words, the first pointing coordinate P1 indicates the pointing coordinate position, in the coordinate system of the display device 20, to which the center of the sensing array (i.e., the center point "+") corresponds in the first digital image F1. The rotation angle can also be calculated by the processing unit 12 according to an acceleration value output by an acceleration unit (not shown) built into the handheld pointing device 10. The acceleration unit may be, for example, a gravity sensor (G-sensor) or an accelerometer.
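As a rough illustration only (the patent expressly leaves the absolute-coordinate calculation to the prior art), the step above can be sketched as follows. The function name, the single-reference-point setup, and the de-rotation step are all illustrative assumptions, not the patented method:

```python
import math

def pointing_coordinate(ref_image_pos, sensor_center, angle_rad, scale):
    """Sketch: pointing coordinate of the sensing-array center relative to an
    operation range anchored at the reference point image.

    ref_image_pos: (x, y) imaging position of the reference point image.
    sensor_center: (x, y) center point "+" of the sensing array.
    angle_rad: rotation angle of the device (from image features or a G-sensor).
    scale: predetermined display ratio mapping sensor units to screen units.
    """
    # Vector from the reference point image (origin of the operation range)
    # to the sensing-array center.
    dx = sensor_center[0] - ref_image_pos[0]
    dy = sensor_center[1] - ref_image_pos[1]
    # De-rotate to cancel the roll of the handheld device.
    c, s = math.cos(-angle_rad), math.sin(-angle_rad)
    rx = c * dx - s * dy
    ry = s * dx + c * dy
    return (rx * scale, ry * scale)
```

With no rotation, the result is simply the scaled offset of the sensor center from the reference point image.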
Next, in step S305, the processing unit 12 calculates the cursor positioning in the absolute positioning mode according to the first pointing coordinate P1. The processing unit 12 then correspondingly generates, according to the calculation result of the cursor positioning, the cursor parameter for controlling the position of the cursor 23 on the display device 20.

It is worth mentioning that, as shown in fig. 4A, the first pointing coordinate P1 is located within the operation range 111 of the first digital image F1, so that the display device 20, upon receiving the cursor parameter, sets the display position of the cursor 23 on the screen according to the display ratio. Therefore, when the handheld pointing device 10 transmits the cursor parameter for controlling the cursor 23 and the preset display ratio to the display device 20 through the communication unit 13, the display device 20 calculates the display position of the cursor 23 on its screen according to the current display ratio (i.e., the resolution of the display device 20). Those skilled in the art should understand how the display device 20 calculates the position of the cursor 23 on its screen according to the current display ratio and the cursor parameter, and therefore the description thereof is omitted here.
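A minimal sketch of the receiving side, on the assumption that the cursor parameter arrives as a coordinate normalized to the operation range and that the "display ratio" reduces to the screen resolution (both assumptions — the patent leaves the exact wire format open):

```python
def cursor_screen_position(norm_coord, resolution):
    """Map a pointing coordinate normalized to [0, 1] within the operation
    range onto a concrete pixel position for the given screen resolution."""
    nx, ny = norm_coord
    w, h = resolution
    # Clamp so the cursor never leaves the visible screen.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    return (round(nx * (w - 1)), round(ny * (h - 1)))
```

The clamp means a pointing coordinate slightly outside the operation range pins the cursor to the nearest screen edge rather than letting it vanish.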
Subsequently, in step S307, the processing unit 12 determines whether the handheld pointing device 10 enters the lock point positioning mode. For example, the processing unit 12 can determine whether the handheld pointing device 10 enters the lock point positioning mode according to the displacement of the pointing coordinate of the reference point image 113 in the plurality of consecutive digital images or according to whether the key unit 16 is pressed.
When the processing unit 12 determines that the handheld pointing device 10 enters the lock point positioning mode, step S309 is executed. Otherwise, when the processing unit 12 determines that the handheld pointing device 10 is still operating in the absolute positioning mode, it returns to step S301 again.
In step S309, the processing unit 12 records the first pointing coordinate P1 in the storage unit 15. Meanwhile, the processing unit 12 may fixedly output, through the communication unit 13, the cursor parameter corresponding to the first pointing coordinate P1, may output no cursor parameter at all, or may output a movement parameter of (0, 0) for the cursor 23 to the display device 20, so that the cursor 23 is fixed on the display device 20. At this time, as shown in fig. 4B, even if the user moves the handheld pointing device 10 so that the reference point image 113' corresponding to the reference point 21 is displaced and the pointing position of the handheld pointing device 10 thereby moves from the first pointing coordinate P1 to a pointing coordinate P1', the handheld pointing device 10 will keep the cursor 23 fixed on the display device 20 as long as it is still in the lock point positioning mode. The above-mentioned P1' is calculated by the processing unit 12 by using the center point 1111' of the operation range 111' as the origin to compute the pointing position vector of the center point "+" of the first digital image F1 in the operation range 111', wherein the operation range 111' is defined according to the reference point image 113'.

In addition, it is understood that the processing unit 12 may also directly use the imaging position of the reference point image 113' as the origin to calculate the position vector (or coordinate) of the center point "+" of the first digital image F1 in the operation range 111'. For calculation purposes, it is only required that the pointing coordinates P1 and P1' be computed in the same manner, for example, all based on the reference point image in the current digital image as the origin, or all based on the center point of the operation range in the current digital image as the origin.
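The three "fix the cursor" strategies of step S309 can be sketched as a small output policy. The enum names and the report tuples are illustrative, not part of the patent:

```python
from enum import Enum, auto

class LockOutput(Enum):
    REPEAT_COORDINATE = auto()  # keep resending the cursor parameter for P1
    NO_OUTPUT = auto()          # stop sending cursor parameters entirely
    ZERO_DELTA = auto()         # send a relative movement of (0, 0)

def lock_mode_report(policy, locked_coord):
    """Cursor parameter (or None) emitted each frame while the handheld
    pointing device stays in the lock point positioning mode."""
    if policy is LockOutput.REPEAT_COORDINATE:
        return ("absolute", locked_coord)
    if policy is LockOutput.ZERO_DELTA:
        return ("relative", (0, 0))
    return None  # NO_OUTPUT: the display simply keeps the last cursor position
```

All three policies leave the on-screen cursor stationary; they differ only in what, if anything, travels over the communication unit.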
In step S311, the processing unit 12 determines whether the handheld pointing device 10 leaves the lock point positioning mode, either by calculating the displacement of the handheld pointing device 10 according to the imaging positions of the reference point 21 in consecutive digital images, or according to the pressing state of the key unit 16. When the processing unit 12 determines that the handheld pointing device 10 leaves the lock point positioning mode, step S313 is executed. Otherwise, when the processing unit 12 determines that the handheld pointing device 10 has not left the lock point positioning mode, it returns to step S311.
In step S313, the image extracting unit 11 of the handheld pointing device 10 extracts an image corresponding to the position of the reference point 21 and generates a second digital image F2, as shown in fig. 4C. The second digital picture F2 is extracted later than the first digital picture F1.
Next, in step S315, the processing unit 12 calculates, according to the second digital image F2, the imaging position of the reference point image 113a corresponding to the reference point 21 on the second digital image F2, and correspondingly calculates a second pointing coordinate P2. As shown in fig. 4C, the second pointing coordinate P2 indicates the pointing position vector of the center point "+" of the sensing array in the image extraction unit 11, mapped through the operation range 111a of the second digital image F2 onto the display device 20, and the second pointing coordinate P2 is (x2, y2). The second pointing coordinate P2 is calculated by the processing unit 12 by using the center point 1111a of the operation range 111a as the origin to compute the pointing position vector of the center point "+" of the second digital image F2 in the operation range 111a, wherein the operation range 111a is defined according to the reference point image 113a.
Subsequently, the processing unit 12 calculates and generates the motion vector M between the first pointing coordinate P1 and the second pointing coordinate P2. In other words, the motion vector M is the distance between the first pointing coordinate P1 and the second pointing coordinate P2. That is, the motion vector M represents the relative displacement of the handheld pointing device 10 with respect to the display device 20 between the first digital image F1 and the second digital image F2. The motion vector M may be obtained, for example, by subtracting the first pointing coordinate P1 from the second pointing coordinate P2, i.e., M = P2 - P1.
In step S317, the processing unit 12 calculates the cursor positioning of the cursor 23 according to the motion vector M and the first pointing coordinate P1, so as to compensate for the offset between the first pointing coordinate P1 and the second pointing coordinate P2.

Then, the processing unit 12 generates, according to the calculation result, the cursor parameter for controlling the movement of the cursor 23. The processing unit 12 wirelessly transmits the cursor parameter to the display device 20 by using the communication unit 13, to correspondingly control the movement of the cursor 23.
Incidentally, the reference point images 113, 113', and 113a are represented by dots as shown in fig. 4A to 4C, but they may also be represented by crosses or asterisks, and the embodiment is not limited thereto. In addition, if the interactive system in fig. 2 uses two or more reference points 21, the average coordinate of the positions of the reference point images in the digital image can be used as the position, in the digital image, of the reference point image 113 of this embodiment. Furthermore, the processing unit 12 of the handheld pointing device 10 can also set a preset imaging parameter and a preset imaging distance parameter of the reference point image 113 according to the reference points 21, to compensate the position calculation result of the reference point image 113.
For example, the processing unit 12 may calculate the rotation angle of the handheld pointing device 10 by using the imaging parameter, e.g., by comparing the angle formed by the line connecting the extracted reference point images against a reference line defined by the preset imaging parameter. The processing unit 12 of the handheld pointing device 10 can also use the imaging distance parameter, calculating the ratio between the measured reference point image spacing and the preset imaging distance parameter, to compensate the positioning deviation caused by differences in shooting distance. Those skilled in the art should know how to set the preset imaging parameter and the preset imaging distance parameter and how to use them to compensate the position calculation result of the reference point image 113, and therefore the detailed description thereof is omitted.
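On the stated (and here assumed) reading that the "imaging distance parameter" is the apparent spacing between two reference point images at a calibration distance, the distance compensation is a simple ratio, since apparent spacing shrinks roughly linearly with distance from the screen. A sketch, with illustrative names:

```python
def distance_scale(measured_spacing, preset_spacing):
    """Ratio by which on-sensor displacements are rescaled: the farther the
    device is from the screen, the smaller the measured spacing, and the
    larger the required scale factor."""
    if measured_spacing <= 0:
        raise ValueError("reference point images not resolved")
    return preset_spacing / measured_spacing

def compensate_displacement(disp, measured_spacing, preset_spacing):
    """Apply the distance compensation to a 2-D pointing displacement."""
    s = distance_scale(measured_spacing, preset_spacing)
    return (disp[0] * s, disp[1] * s)
```

For instance, a reference-point spacing measured at half the preset value doubles every displacement before it is turned into cursor motion.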
The following will further describe the details of the execution flow of the relative positioning mode in the cursor positioning method. Please refer to fig. 5 and fig. 1, fig. 2, and fig. 6A to fig. 6E. Fig. 5 is a flowchart illustrating a method for positioning a cursor according to a relative mode of a handheld pointing device according to an embodiment of the present invention. Fig. 6A is a schematic diagram of a position change of a reference point sensed when a handheld pointing device provided by an embodiment of the present invention moves and a position change of a cursor on a display device. Fig. 6B is a schematic diagram of a position change of a reference point sensed when the handheld pointing device provided by the embodiment of the invention moves. Fig. 6C to 6D are schematic diagrams respectively illustrating a position change of a reference point sensed when the handheld pointing device provided by the embodiment of the invention moves and a position change of a cursor on a display device. Fig. 6E is a schematic diagram of a position change of a cursor on a display device screen according to an embodiment of the present invention.
In step S501, when the processing unit 12 determines that the handheld pointing device 10 leaves the lock point positioning mode, the processing unit 12 immediately starts the pointing coordinate calibration procedure to enable the handheld pointing device 10 to enter the relative positioning mode.
In step S503, the processing unit 12 sets the correction count to N, the compensation vector to C, and the start point coordinate to S. The processing unit 12 temporarily stores the correction count N, the compensation vector C, and the start point coordinate S in the storage unit 15. The start point coordinate S is the first pointing coordinate P1 shown in fig. 6A, that is, as mentioned above, the pointing position vector of the center point "+" of the first digital image F1 in the operation range 111, calculated by the processing unit 12 using the center point 1111 of the operation range 111 as the origin, wherein the operation range 111 is defined according to the imaging position of the reference point image 113.
The processing unit 12 may set N according to the image sampling frequency or a preset time input by the user through the input unit 14. In one embodiment, if the user specifies that five digital images are used to complete the pointing coordinate calibration procedure, the processing unit 12 sets N to 5. In another embodiment, the user may set a preset calibration time of, for example, 5 seconds (i.e., the handheld pointing device 10 completes the pointing coordinate calibration procedure within 5 seconds) and an image sampling frequency of 5 digital images per second, in which case the processing unit 12 sets N to 25.
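The second way of choosing N above is simply the product of the calibration time and the image sampling rate; a one-line sketch (parameter names are illustrative), rounded up so the calibration never finishes early:

```python
import math

def correction_count(calibration_time_s, sampling_rate_hz):
    """N = calibration time x image sampling rate, rounded up."""
    return math.ceil(calibration_time_s * sampling_rate_hz)
```

With the example values from the text, a 5-second calibration at 5 images per second yields N = 25.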
In addition, the processing unit 12 obtains C by dividing the motion vector M by N, where C is calculated as follows:

C = M / N = (P2 - P1) / N    (1)

where C represents the compensation vector, M represents the motion vector, P1 represents the first pointing coordinate, P2 represents the second pointing coordinate, and N represents the correction count. As shown in formula (1), a larger N means that the value of C compensated each time is smaller, and a smaller N means that the value of C compensated each time is larger. It is worth mentioning that the second pointing coordinate P2 is calculated by the processing unit 12 using the center point 1111b of the operation range 111b shown in fig. 6B as the origin to compute the pointing position vector of the center point "+" of the extracted second digital image F2 in the operation range 111b, wherein the operation range 111b is defined according to the reference point image 113b.
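Formula (1) is a component-wise division; a minimal sketch using 2-D tuples (the patent does not prescribe a data type):

```python
def compensation_vector(p1, p2, n):
    """C = (P2 - P1) / N: the fraction of the offset accumulated in the lock
    point positioning mode that is folded into the cursor position at each
    of the N correction steps."""
    if n <= 0:
        raise ValueError("correction count N must be positive")
    return ((p2[0] - p1[0]) / n, (p2[1] - p1[1]) / n)
```

Doubling N halves each component of C, matching the remark after formula (1).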
In step S505, the processing unit 12 calculates the pointing coordinate displacement generated by the movement of the handheld pointing device 10. The processing unit 12 can extract a third digital image F3 and define the operation range 111c according to the imaging position of the reference point image 113c of the reference point 21 on the third digital image F3, as shown in fig. 6B. Then, the processing unit 12 calculates the pointing position vector of the center point "+" of the third digital image F3 in the operation range 111c by using the center point 1111c of the operation range 111c as the origin, to generate a third pointing coordinate P3. The third pointing coordinate P3 is the pointing coordinate position, in the coordinate system of the display device 20, to which the center of the sensing array in the image extraction unit 11 (i.e., the center point "+") corresponds in the third digital image F3; as shown in fig. 6B, the third pointing coordinate P3 is (x3, y3). Subsequently, the processing unit 12 calculates, from the second pointing coordinate P2 and the third pointing coordinate P3, the pointing coordinate displacement ΔP generated by the movement of the handheld pointing device 10, where the pointing coordinate displacement ΔP is the pointing coordinate displacement generated by the movement of the handheld pointing device 10 in the relative positioning mode. The third digital image F3 is extracted later than the second digital image F2.
Next, in step S507, the processing unit 12 sums the start point coordinate S, the pointing coordinate displacement ΔP of the handheld pointing device 10, and C to obtain the relative pointing coordinate S', as shown in fig. 6C. The relative pointing coordinate S' is calculated as follows:

S' = S + ΔP + C    (2)
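Formula (2) in code form, with 2-D tuples for all three terms (a sketch; the patent does not fix a representation):

```python
def relative_pointing_coordinate(start, delta_p, c):
    """S' = S + dP + C: the new cursor position is the previous position,
    plus the user's own movement this frame, plus one compensation step."""
    return (start[0] + delta_p[0] + c[0],
            start[1] + delta_p[1] + c[1])
```

If the device is held still (ΔP = 0), each frame moves the cursor by exactly C toward the current pointing position.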
in step S509, the processing unit 12 determines the relative pointing coordinate
Figure BDA00014336070600001611
Generates and outputs cursor parameters corresponding to the position of the control cursor 23 on the display device 20. The processing unit 12 outputs the cursor parameter to move the cursor 23 by a distance d1 corresponding to the original position, as shown in fig. 6C.
Meanwhile, in step S511 and step S513, the processing unit 12 sets the coordinates of the start point
Figure BDA00014336070600001612
As a relative pointing coordinate
Figure BDA00014336070600001613
And perform N-1 (i.e., decrement the correction times). Processing unit 12 and setting the relative pointing coordinate
Figure BDA00014336070600001614
And the decremented N is stored in the storage unit 15. The processing unit 12 then determines whether N is equal to zero in step S515, i.e., whether the pointing coordinate calibration procedure is completed.
If the processing unit 12 determines that N is equal to zero, i.e., the pointing coordinate calibration procedure is completed, step S517 is executed. Conversely, if the processing unit 12 determines that N is not equal to zero, i.e., the pointing coordinate calibration procedure is not completed, the processing unit 12 returns to step S505. That is, a fourth digital image F4 is extracted, the fourth pointing coordinate P4 in the fourth digital image F4 is used to calculate the pointing coordinate displacement ΔP generated by the movement of the handheld pointing device 10 and a new relative pointing coordinate S', N is decremented again, and so on, so that the cursor 23 is translated by a distance d1 to the pointing position corresponding to the new relative pointing coordinate S', as shown in fig. 6D.
More specifically, from formula (2), the processing unit 12 may derive formula (3) to calculate the relative pointing coordinate:

S'_i = S'_(i-1) + ΔP_i + C    (3)

where S'_(i-1) is the relative pointing coordinate calculated the previous time; ΔP_i indicates the pointing coordinate displacement of the handheld pointing device 10 at the current time, e.g., the displacement between the third pointing coordinate P3 and the fourth pointing coordinate P4; and C is a fixed value.

Then, the processing unit 12 re-executes steps S505 to S515 to sequentially extract the remaining N - 2 digital images (not shown) and calculate the corresponding relative pointing coordinates, and so on, until N equals zero.
When the processing unit 12 completes the pointing coordinate calibration procedure, as shown in fig. 6E, the cursor 23 displayed on the display device 20 has moved, over N steps, from the position corresponding to the first pointing coordinate P1 to the current pointing position of the handheld pointing device 10, with the cursor 23 translated by the calculated distance at each step. In other words, the cursor 23 displayed on the display device 20 is translated, step by step, by a distance d from the position corresponding to the first pointing coordinate P1 until it reaches the pointing position of the handheld pointing device 10 corresponding to the reference point 21 on the Nth digital image.
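Putting steps S503 to S515 together: if the user holds the device still after unlocking (ΔP = 0 at every step), the N compensation steps alone must carry the cursor from the locked coordinate P1 exactly to the coordinate P2 where the device now points. A sketch of the whole loop (tuple representation and function name are illustrative):

```python
def calibrate(p1, p2, n, displacements):
    """Run the pointing coordinate calibration procedure of steps S503-S515.

    p1: pointing coordinate recorded at the lock point (start point S).
    p2: pointing coordinate when the device leaves the lock point mode.
    n: correction count N.
    displacements: per-step pointing coordinate displacements dP (length n).
    Returns the relative pointing coordinate after the N correction steps.
    """
    c = ((p2[0] - p1[0]) / n, (p2[1] - p1[1]) / n)  # formula (1)
    s = p1
    for dp in displacements:                        # formula (3), N times
        s = (s[0] + dp[0] + c[0], s[1] + dp[1] + c[1])
    return s

# Device held still after unlocking: the cursor glides from P1 to P2 in N steps.
still = calibrate((0.0, 0.0), (10.0, 5.0), 5, [(0.0, 0.0)] * 5)
```

If the user keeps moving during calibration, each step adds the genuine movement ΔP on top of C, so the cursor tracks the hand while the locked-mode offset is paid off gradually rather than in one jump.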
Then, in step S517, the processing unit 12 causes the handheld pointing device 10 to leave the relative positioning mode and enter the absolute positioning mode, and the processing unit 12 calculates the cursor positioning in the absolute positioning mode.
That is, in the relative positioning mode, the handheld pointing device 10 performs the positioning calculation of the cursor 23 for its subsequent movement by using the first pointing coordinate P1 as the starting point, in cooperation with the motion vector M between the first pointing coordinate P1 and the second pointing coordinate P2 and the pointing coordinate displacements ΔP generated by the movement of the handheld pointing device 10 during the calibration. Accordingly, after the handheld pointing device 10 leaves the lock point positioning mode, the cursor 23 can move smoothly to the position pointed at by the handheld pointing device 10 within the preset correction time or correction count, so as to avoid a jump of the cursor while maintaining directivity.
It should be noted that, in the relative positioning mode, the handheld pointing device 10 may also omit from the correction calculation the pointing coordinate displacement ΔP generated by the movement of the handheld pointing device 10 during the calibration. Specifically, the handheld pointing device 10 can perform the correction calculation for its subsequent movement based only on the first pointing coordinate P1 as the starting point, in cooperation with the motion vector M between the first pointing coordinate P1 and the second pointing coordinate P2 (i.e., the pointing coordinate displacement generated by the movement of the handheld pointing device 10 up to the present).
In practice, by implementing program code corresponding to the cursor positioning method of fig. 3 and the relative positioning calculation method of fig. 5 on a microcontroller or an embedded controller, the processing unit 12 can execute these methods during operation, but the embodiment is not limited thereto.
In addition, fig. 3 is only used to describe a cursor positioning method of the handheld pointing device 10 and is not intended to limit the invention. Similarly, fig. 5 is only used to describe a specific way of calculating the relative positioning of the handheld pointing device 10 and is not intended to limit the present invention. The operation range corresponding to the display device 20 is drawn for illustrative purposes only and is not meant to limit the invention, as long as the operation range is related to the center point "+" of the sensing array in the image extraction unit 11. It should be noted that fig. 4A to 4C and fig. 6A to 6E are only used to describe the operation mode of the handheld pointing device 10 and the cursor positioning calculation mode with reference to fig. 3 and fig. 5, respectively, and are not intended to limit the present invention.
[ Another embodiment of a Cursor positioning method for a hand-held pointing device ]
From the above embodiments, the present invention can also be summarized as a cursor positioning method, which can be applied to the handheld pointing device in the interactive system of the above embodiments. Referring to fig. 7 and fig. 1 and 2 at the same time, fig. 7 is a schematic flow chart illustrating a method for positioning a cursor of a handheld pointing device according to another embodiment of the present invention.
The cursor positioning method implemented in the handheld pointing device 10 shown in FIG. 7 can be implemented in firmware programming and executed by the processing unit 12 of the handheld pointing device 10.
First, in step S701, the image extracting unit 11 of the handheld pointing device 10 extracts a first digital image of a reference point 21 when the handheld pointing device 10 points to a position of the reference point 21.
Next, in step S703, the processing unit 12 calculates a first pointing coordinate of the handheld pointing device 10 pointing to the display device 20 according to the imaging position of the reference point 21 on the first digital image.
Then, in step S705, the processing unit 12 calculates the cursor positioning in the absolute positioning mode, and generates the cursor parameter corresponding to the control cursor 23 on the display device 20 according to the first pointing coordinate. The detailed calculation method for controlling the cursor position is the same as the foregoing embodiment, and therefore, the detailed description is omitted.
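The absolute positioning step can be illustrated with a minimal sketch, assuming the pointing coordinate uses the operation-range center as its origin and the cursor parameter is simply a scaled pixel position on the display (the function name, the linear mapping, and the example resolutions are assumptions, not taken from the patent):

```python
def absolute_cursor_position(pointing, op_range, screen):
    """Map a pointing coordinate (origin at the operation-range center)
    to a screen pixel position by scaling the operation range onto the
    display range."""
    px, py = pointing
    rw, rh = op_range   # operation-range width/height in sensor units (assumed)
    sw, sh = screen     # display resolution in pixels (assumed)
    # shift the origin from the range center to the top-left corner, then scale
    sx = (px + rw / 2) * sw / rw
    sy = (py + rh / 2) * sh / rh
    return (sx, sy)

# A pointing coordinate at the range center maps to the screen center.
print(absolute_cursor_position((0.0, 0.0), (100.0, 100.0), (1920, 1080)))
```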
Next, in step S707, the processing unit 12 determines whether to enter the lock point positioning mode. As mentioned above, the processing unit 12 can determine whether the handheld pointing device 10 enters the lock point positioning mode according to the displacement amounts of the reference point image 113 in the plurality of consecutive digital images or according to whether the key unit 16 of the handheld pointing device 10 is pressed.
When the processing unit 12 determines that the handheld pointing device 10 enters the lock point positioning mode, step S709 is executed. Otherwise, when the processing unit 12 determines that the handheld pointing device 10 is still operating in the absolute positioning mode, it returns to step S701 again.
In step S709, the processing unit 12 records the first pointing coordinate in the storage unit 15. Meanwhile, the processing unit 12 fixedly outputs the cursor parameter corresponding to the first pointing coordinate, or does not output any cursor parameter, to the display device 20 through the communication unit 13, so that the cursor 23 is fixed on the display device 20.
In step S711, the processing unit 12 can determine whether the handheld pointing device 10 leaves the lock point positioning mode according to the displacement in the continuous digital images or according to the pressing state of the key unit 16. When the processing unit 12 determines that the handheld pointing device 10 leaves the lock point positioning mode, step S713 is executed. Otherwise, when the processing unit 12 of the handheld pointing device 10 determines that the handheld pointing device 10 has not left the lock point positioning mode, the process returns to step S711.
In step S713, when the handheld pointing device 10 leaves the lock point positioning mode, the processing unit 12 performs a positioning calculation of the cursor 23 based on the first pointing coordinate as a starting point and in cooperation with a pointing coordinate displacement amount generated by the movement of the handheld pointing device 10 when the handheld pointing device 10 moves subsequently.
Subsequently, the processing unit 12 correspondingly generates cursor parameters for controlling the movement of the cursor 23 according to the calculation result. The processing unit 12 wirelessly transmits the cursor parameter to the display device 20 through the communication unit 13 to correspondingly control the action of the cursor 23 on the screen of the display device 20.
Fig. 7 is only used to describe a cursor positioning method of the handheld pointing device 10 and is not intended to limit the present invention. One skilled in the art can select a determination method for entering or leaving the lock point positioning mode according to actual operation requirements. Step S713 may be executed using the cursor positioning calculation method of the relative positioning mode in the above embodiment, so that the positioning of the cursor 23 is corrected in the relative positioning manner within the preset correction time.
[ embodiment of method for determining entry of hand-held pointing device into lock point positioning mode ]
From the above embodiments, the present invention can also be summarized as a method for determining that the handheld pointing device enters the lock point positioning mode, and the method can be applied to the handheld pointing device in the interactive system of the above embodiments. Referring to fig. 8 and fig. 1 and fig. 9A to 9B at the same time, fig. 8 is a schematic flow chart illustrating a method for entering a lock point positioning mode of a handheld pointing device according to an embodiment of the present invention. Fig. 9A to 9B are schematic diagrams respectively illustrating a sensing reference point position moving change when determining whether the handheld pointing device enters the lock point positioning mode according to an embodiment of the present invention.
First, in step S801, the processing unit 12 of the handheld pointing device 10 determines whether the key unit 16 is pressed by the user. That is, the processing unit 12 determines whether the user of the handheld pointing device 10 forces the handheld pointing device 10 into the lock point positioning mode using the key unit 16.
When the processing unit 12 determines that the handheld pointing device 10 enters the lock point positioning mode (e.g., the key unit 16 is pressed by the user), step S811 is performed. On the contrary, when the processing unit 12 determines that the key unit 16 is not pressed by the user, step S803 is executed.
In step S803, the processing unit 12 drives the image extracting unit 11 to extract the image corresponding to the reference point position according to a predetermined image sampling frequency, and sequentially generates a plurality of digital images.
Subsequently, in step S805 and as shown in the digital image F5 of fig. 9A, the processing unit 12 of the handheld pointing device 10 calculates the displacement change of the handheld pointing device 10 according to the imaging position of the reference point in the digital images, i.e., calculates the motion vector V between the pointing coordinates.

In step S807, the processing unit 12 determines, according to the calculated motion vector V, whether the displacement of the pointing coordinate generated by the movement of the handheld pointing device 10 is smaller than a predetermined displacement threshold (e.g., zero). When the processing unit 12 determines that the displacement of the pointing coordinate generated by the movement of the handheld pointing device 10 is smaller than the preset displacement threshold, step S809 is executed. Otherwise, when the processing unit 12 determines that the displacement of the pointing coordinate generated by the movement of the handheld pointing device 10 is greater than the preset displacement threshold, the process returns to step S801.
In step S809, the processing unit 12 determines, according to the calculated motion vector V, whether the direction in which the handheld pointing device 10 continuously moves has changed. As shown in figs. 9A and 9B, according to the imaging position of the reference point image 115 in the digital images, the processing unit 12 can calculate the pointing coordinate of the center point "+" of the sensing array of the image extracting unit 11 within the operation range 111d mapped in the digital image F5 (corresponding to the display device), and the pointing coordinate within the operation range 111e defined in the subsequently extracted digital image F6; the pointing coordinate is shown to move from right to left, i.e., the handheld pointing device 10 moves continuously from left to right. The digital image F6 is extracted later than the digital image F5. Thus, the processing unit 12 can determine that the direction of the continuous movement of the handheld pointing device 10 has changed, and then executes step S811. On the contrary, if the processing unit 12 determines that the direction of the continuous movement of the handheld pointing device 10 is unchanged, step S801 is executed.
In step S811, the processing unit 12 makes the handheld pointing device 10 enter the lock point positioning mode to fix the cursor on the screen of the display device 20.
It should be noted that after the handheld pointing device 10 enters the lock point positioning mode, the processing unit 12 can determine the pressing state of the key unit 16 to make the handheld pointing device 10 leave the lock point positioning mode. Alternatively, the processing unit 12 may determine whether the displacement amount of the pointing coordinate generated by the movement of the handheld pointing device 10 is greater than a preset displacement threshold by determining the displacement change of the reference point 21. The processing unit 12 can determine that the displacement of the pointing coordinate generated by the movement of the handheld pointing device 10 is greater than the preset displacement threshold, so that the handheld pointing device 10 leaves the lock point positioning mode.
In addition, fig. 8 is only used to describe a method for determining the handheld pointing device 10 to enter the lock point positioning mode, and therefore, fig. 8 is not intended to limit the present invention. One of ordinary skill in the art can select a method for determining to enter the lock point positioning mode according to actual operation requirements. In other words, in practice, the steps S807 (i.e. the amount of displacement change of the handheld pointing device 10) and S809 (i.e. the moving direction of the handheld pointing device 10) can be selected to be executed simultaneously or only one of them according to the actual operation requirement. In addition, the preset displacement threshold may be set according to the application of the actual handheld pointing device 10, and the embodiment is not limited thereto.
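A minimal sketch of this entry decision, assuming 2-D motion vectors and using a dot-product sign test for "direction changed" (an assumed implementation detail — the patent does not specify how the direction comparison is computed; the function name and threshold value are also assumptions). As noted above, the patent allows executing either test (S807 or S809) alone or both together:

```python
def should_enter_lock_mode(key_pressed, v_prev, v_curr, threshold=1.0):
    """Decide whether to enter the lock point positioning mode."""
    if key_pressed:                   # forced entry via the key unit (S801)
        return True
    mag = (v_curr[0] ** 2 + v_curr[1] ** 2) ** 0.5
    small = mag < threshold           # S807: displacement below the threshold
    # S809: direction reversal, detected here via a negative dot product
    reversed_dir = v_prev[0] * v_curr[0] + v_prev[1] * v_curr[1] < 0
    return small and reversed_dir     # the patent also allows either test alone
```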
[ Another embodiment of the relative mode position calculation for a hand-held pointing device ]
The present invention further provides a method for determining whether the reference point exceeds an operation range according to the calculated pointing coordinate, and the method can be applied to the handheld pointing device in the interactive system of the above embodiment. Please refer to fig. 10 and fig. 1 and fig. 11A to 11D simultaneously. Fig. 10 is a schematic flow chart illustrating a boundary correction method according to an embodiment of the invention. Fig. 11A to 11D are schematic diagrams respectively illustrating a position change of a reference point sensed when the handheld pointing device provided by the embodiment of the invention moves and a position change of a cursor on a display device.
The method for determining whether the reference point is beyond an operation range, which is executed in the handheld pointing device 10 shown in fig. 10, can be implemented in a firmware programming manner and executed by the processing unit 12 of the handheld pointing device 10. The processing unit 12 may be a processing chip such as a microcontroller or an embedded controller disposed on the handheld pointing device 10, but the embodiment is not limited thereto.
The operation range 111 shown in fig. 11A to 11D in the first digital image F1 corresponds to the display device 20 at a predetermined display scale. As shown in fig. 11A to 11D, the operating range 111 is an area bounded between the first axis upper limit value Y _ MAX in the first axial direction DE1 and the first axis lower limit value Y _ MIN in the first axial direction DE1 of the display device 20, and bounded between the second axis upper limit value X _ MAX in the second axial direction DE2 and the second axis lower limit value X _ MIN in the second axial direction DE2 of the display device 20.
The first-axis upper limit value Y_MAX corresponds to the first display side 201 of the display device 20. The first-axis lower limit value Y_MIN corresponds to the second display side 203 of the display device 20. The second-axis upper limit value X_MAX corresponds to the third display side 205 of the display device 20. The second-axis lower limit value X_MIN corresponds to the fourth display side 207 of the display device 20.
The method of fig. 10 can be applied to the implementations of the methods of figs. 3 and 5, which calculate the first pointing coordinate P1 of the handheld pointing device 10 according to the imaging position of the reference point image 117 on the first digital image F1, using the center point 1111 of the operation range 111 as the origin, and then determine whether the first pointing coordinate P1 exceeds the operation range 111 corresponding to the display range of the display device 20. In the present embodiment, the first pointing coordinate P1 includes a first axis coordinate and a second axis coordinate. The first axis coordinate is located in the first axial direction DE1, and the second axis coordinate is located in the second axial direction DE2.
First, in step S1001, the processing unit 12 of the handheld pointing device 10 determines, according to the first pointing coordinate P1, whether the first pointing coordinate P1 is outside the operation range 111 in the first digital image F1. That is, the processing unit 12 refers to the first pointing coordinate P1 to determine whether the center (i.e., the center point "+") of the sensing array in the image capturing unit 11 is located outside the operation range 111 corresponding to the display device 20 in the first digital image F1.

When the processing unit 12 determines that the first pointing coordinate P1 is outside the operation range 111, step S1005 is executed. On the contrary, when the processing unit 12 determines that the first pointing coordinate P1 is within the operation range 111 of the first digital image F1, step S1003 is executed.
In step S1003, the processing unit 12 generates, according to the first pointing coordinate P1, the cursor parameter corresponding to the position of the control cursor 33 on the display device 20. The processing unit 12 then outputs, through the communication unit 13, the cursor parameter corresponding to the first pointing coordinate P1, so that the cursor 33 is located at the corresponding position on the screen of the display device 20.
In step S1005, the processing unit 12 determines, according to the first pointing coordinate P1, whether the first axis coordinate of the first pointing coordinate P1 is greater than the first-axis upper limit value Y_MAX of the operation range 111. When the processing unit 12 determines that the first axis coordinate of the first pointing coordinate P1 (e.g., y1 of fig. 11A) is greater than the first-axis upper limit value Y_MAX of the operation range 111, step S1007 is executed. On the contrary, when the processing unit 12 determines that the first axis coordinate of the first pointing coordinate P1 is smaller than the first-axis upper limit value Y_MAX of the operation range 111, step S1009 is executed.
In step S1007, as shown in fig. 11A, the processing unit 12 outputs the cursor parameter through the communication unit 13 such that the cursor 33 is fixed, in the first axial direction DE1, at the first display side 201 of the display device 20, while the position of the cursor 33 in the second axial direction DE2 follows the second axis coordinate of the first pointing coordinate P1 (e.g., the second axis coordinate x1 in fig. 11A). Subsequently, the processing unit 12 executes step S1013.
In step S1009, the processing unit 12 determines, according to the first pointing coordinate P1, whether the first axis coordinate of the first pointing coordinate P1 is less than the first-axis lower limit value Y_MIN of the operation range 111. When the processing unit 12 determines that the first axis coordinate of the first pointing coordinate P1 (e.g., y1 of fig. 11B) is smaller than the first-axis lower limit value Y_MIN of the operation range 111, step S1011 is executed. On the contrary, when the processing unit 12 determines that the first axis coordinate of the first pointing coordinate P1 is greater than the first-axis lower limit value Y_MIN of the operation range 111, step S1013 is executed.
In step S1011, as shown in fig. 11B, the processing unit 12 outputs the cursor parameter through the communication unit 13 such that the cursor 33 is fixed, in the first axial direction DE1, at the second display side 203, while the position of the cursor 33 in the second axial direction DE2 follows the second axis coordinate of the first pointing coordinate P1 (e.g., the second axis coordinate in fig. 11B). Subsequently, the processing unit 12 executes step S1013.
In step S1013, the processing unit 12 determines, according to the first pointing coordinate P1, whether the second axis coordinate of the first pointing coordinate P1 is greater than the second-axis upper limit value X_MAX of the operation range 111. When the processing unit 12 determines that the second axis coordinate of the first pointing coordinate P1 (e.g., x1 of fig. 11C) is greater than the second-axis upper limit value X_MAX of the operation range 111, step S1015 is executed. On the contrary, when the processing unit 12 determines that the second axis coordinate of the first pointing coordinate P1 is smaller than the second-axis upper limit value X_MAX of the operation range 111, step S1017 is executed.
In step S1015, as shown in fig. 11C, the processing unit 12 outputs the cursor parameter through the communication unit 13 such that the cursor 33 is fixed, in the second axial direction DE2, at the third display side 205, while the position of the cursor 33 in the first axial direction DE1 follows the first axis coordinate of the first pointing coordinate P1 (e.g., the first axis coordinate y1 in fig. 11C).
In step S1017, when the processing unit 12 determines that the second axis coordinate of the first pointing coordinate P1 (e.g., x1 of fig. 11D) is below the second-axis lower limit value X_MIN of the operation range 111, the processing unit 12 outputs the cursor parameter through the communication unit 13 such that the cursor 33 is fixed, in the second axial direction DE2, at the fourth display side 207, while the position of the cursor 33 in the first axial direction DE1 follows the first axis coordinate of the first pointing coordinate P1 (e.g., the first axis coordinate y1 in fig. 11D).
In short, the method of fig. 10 is suitable for determining whether the cursor is located outside the display range of the display device 20 after calculating the pointing coordinate according to the digital image. In addition, the method of fig. 10 completes the boundary coordinate calibration procedure by fixing the cursor 33 on a boundary when the cursor 33 may exceed the boundary of the display range of the corresponding display device 20.
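The boundary correction of steps S1005 to S1017 amounts to clamping each axis of the pointing coordinate independently to the operation-range limits; a minimal sketch (the function name and numeric limits are assumptions for illustration):

```python
def clamp_to_operation_range(x1, y1, x_min, x_max, y_min, y_max):
    """Clamp a pointing coordinate to the operation-range boundaries so
    the cursor stays fixed on the corresponding display side when the
    coordinate exceeds a limit (cf. steps S1005 to S1017)."""
    y = min(max(y1, y_min), y_max)   # first axis: top/bottom display sides
    x = min(max(x1, x_min), x_max)   # second axis: left/right display sides
    return (x, y)

# A coordinate above the first-axis upper limit is pinned to that boundary,
# while the in-range second-axis coordinate passes through unchanged.
print(clamp_to_operation_range(5.0, 12.0, -10.0, 10.0, -10.0, 10.0))
```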
It should be noted that, in the embodiment, only the first pointing coordinate is used to describe the cursor boundary correction method, but actually, the processing unit of the handheld pointing device may obtain the pointing coordinates (e.g., the second pointing coordinate and the third pointing coordinate) corresponding to the reference point position according to the digital image (e.g., the second digital image and the third digital image), and then correspondingly detect whether the calculated pointing coordinates exceed the operation range in the digital image by using the method of fig. 10, and perform the boundary correction when the calculated pointing coordinates exceed the operation range.
Fig. 10 is only used to describe a method of performing boundary correction for the handheld pointing device 10, and therefore, fig. 10 is not intended to limit the present invention. One skilled in the art can also set the operation range 111 of the first digital image according to the display range of the display device 20, i.e. the first upper limit value Y _ MAX, the first lower limit value Y _ MIN, the second upper limit value X _ MAX, and the second lower limit value X _ MIN.
[ still another embodiment of a cursor positioning method of a handheld pointing device ]
From the above embodiments, the present invention can also be summarized as a cursor positioning method, which can be applied to the handheld pointing device in the interactive system of the above embodiments. Referring to fig. 12 and fig. 1 and fig. 2 at the same time, fig. 12 is a schematic flowchart illustrating a method for positioning a cursor of a handheld pointing device according to another embodiment of the present invention.
The cursor positioning method implemented in the handheld pointing device 10 shown in FIG. 12 can be implemented in firmware programming and executed by the processing unit 12 of the handheld pointing device 10.
In step S1201, the image extracting unit 11 of the handheld pointing device 10 extracts the first digital image of the reference point 21 at the first time, so that the processing unit 12 calculates the first pointing coordinate of the handheld pointing device 10 pointing to the display device 20 according to the imaging position of the reference point 21 on the first digital image. The processing unit 12 fixes the display position of the cursor 23 at the first pointing coordinate according to the first pointing coordinate, so that the handheld pointing device 10 enters the lock point positioning mode.
Further, the processing unit 12 may generate a cursor parameter corresponding to the position of the control cursor 23 on the display device 20 according to the first pointing coordinate. More specifically, the processing unit 12 may fix the cursor 23 at the first pointing coordinate by fixedly outputting the cursor parameter corresponding to the first pointing coordinate or not outputting any cursor parameter to the display device 20 through the communication unit 13. The detailed calculation and positioning manner of the cursor 23 is the same as that of the foregoing embodiment, and therefore the detailed description is omitted.
In step S1203, the processing unit 12 then causes the handheld pointing device 10 to leave the lock point positioning mode at a second time. The processing unit 12 drives the image extracting unit 11 to extract the second digital image of the reference point 21 at the second time, so as to calculate a second pointing coordinate of the handheld pointing device 10 pointing to the display device 20 according to the imaging position of the reference point 21 on the second digital image. The second time is after the first time, i.e., the second digital image is extracted later than the first digital image.
In step S1205, the processing unit 12 calculates a first motion vector between the first pointing coordinate and the second pointing coordinate according to the two coordinates.
Next, in step S1207, the processing unit 12 generates a unit compensation vector according to the first motion vector calculation. The processing unit 12 may calculate the unit compensation vector according to a preset number of corrections or correction time as described in the previous embodiments. For example, the handheld pointing device 10 may generate a unit compensation vector by dividing the first motion vector by the number of corrections or the correction time. The number of times or the time for correction may be set according to the image sampling frequency or the preset time.
Next, in step S1209, the processing unit 12 drives the image extracting unit 11 at a third time to extract a third digital image of the reference point 21, so as to calculate a third pointing coordinate of the handheld pointing device 10 pointing to the display device 20 according to the imaging position of the reference point 21 in the third digital image. The processing unit 12 starts the aforementioned pointing coordinate calibration procedure at the third time and calculates the positioning of the cursor 23 in the relative positioning mode. The third time is after the second time, that is, the extraction time of the third digital image is later than that of the second digital image.
In step S1211, the processing unit 12 also calculates, at the third time, a second motion vector between the second pointing coordinate and the third pointing coordinate. The processing unit 12 may obtain the second motion vector by calculating the displacement between the second pointing coordinate and the third pointing coordinate.
In step S1213, the handheld pointing device 10 then calculates the display position of the cursor 23 on the display device 20 at the third time according to the first pointing coordinate, the unit compensation vector and the second motion vector. Specifically, the handheld pointing device 10 correspondingly generates the cursor parameter for controlling the movement of the cursor 23 according to the calculation result. The processing unit 12 wirelessly transmits the cursor parameter to the display device 20 through the communication unit 13 to control the display position of the cursor 23 on the screen of the display device 20 at the third time.
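Steps S1205 to S1213 can be sketched as follows, assuming the unit compensation vector is the first motion vector divided by the number of corrections and that exactly one compensation step has been applied at the third time (a simplified illustration under those assumptions, not the patent's firmware; names are hypothetical):

```python
def compensated_cursor_position(p1, p2, p3, corrections):
    """Compute the cursor position at the third time: the first motion
    vector (P2 - P1) is split into per-step unit compensation vectors,
    and one step is added to P1 along with the new movement (P3 - P2)."""
    v1 = (p2[0] - p1[0], p2[1] - p1[1])              # first motion vector
    u = (v1[0] / corrections, v1[1] / corrections)   # unit compensation vector
    v2 = (p3[0] - p2[0], p3[1] - p2[1])              # second motion vector
    return (p1[0] + u[0] + v2[0], p1[1] + u[1] + v2[1])
```

With this scheme the residual gap between the locked position P1 and the current pointing coordinate is paid back gradually over the configured number of corrections, which is what avoids a visible jump when leaving the lock point positioning mode.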
Then, the handheld pointing device 10 completes executing the relative positioning mode at the fourth time, and calculates the cursor positioning in the absolute positioning mode at the fourth time. In step S1215, the processing unit 12 extracts a fourth digital image of the reference point 21 at a fourth time, so as to calculate a fourth pointing coordinate of the handheld pointing device 10 pointing to the display device 20 according to the imaging position of the reference point 21 in the fourth digital image. The fourth time is after the third time, that is, the fourth digital image is extracted later than the third digital image. The time length between the fourth time and the third time can be configured according to the correction times or the correction time. The specific implementation manner of the relative positioning mode is the same as that of the foregoing embodiment, and therefore, the description thereof is omitted.
In step S1217, the processing unit 12 calculates a display position of the cursor on the display device 20 at a fourth time according to the fourth pointing coordinate.
In step S1219, the processing unit 12 generates and outputs a cursor parameter corresponding to the display position of the control cursor 23 at the fourth time through the communication unit 13 according to the calculation result. The handheld pointing device 10 wirelessly transmits the cursor parameter to the display device 20 to correspond to the display position of the control cursor 23 on the display screen of the display device 20 at the fourth time.
It should be noted that the processing unit 12 records the calculated first pointing coordinate, second pointing coordinate, third pointing coordinate, fourth pointing coordinate, first motion vector, second motion vector and unit compensation vector in the storage unit 15, respectively. In addition, one skilled in the art can also incorporate the method of fig. 10 for determining whether the reference point 21 exceeds the operation range into the processing unit 12 by way of a firmware design, according to actual operation requirements. That is, when calculating the first, second, third, and fourth pointing coordinates, the processing unit 12 can automatically determine whether the reference point 21 exceeds the operation range according to the first axis coordinate and the second axis coordinate of each pointing coordinate, so as to prevent the cursor 23 from exceeding the screen of the display device 20. It should be noted that fig. 12 is only used to describe a cursor positioning method of the handheld pointing device 10 and is not used to limit the present invention.
[ still another embodiment of a cursor positioning method for a handheld pointing device ]
From the above embodiments, the present invention can also be summarized as a cursor positioning method, which can be applied to the handheld pointing device in the interactive system of the above embodiments. Referring to fig. 13 and fig. 1 and 2 at the same time, fig. 13 is a schematic flow chart illustrating a method for positioning a cursor of a handheld pointing device according to still another embodiment of the present invention.
The cursor positioning method implemented in the handheld pointing device 10 shown in FIG. 13 can be implemented in firmware programming and executed by the processing unit 12 of the handheld pointing device 10.
In step S1301, the image extracting unit 11 of the handheld pointing device 10 extracts the first digital image corresponding to the reference point 21 at the first time, so as to calculate and generate a first pointing coordinate of the handheld pointing device 10 pointing to the display device 20 according to the imaging position of the reference point 21 in the first digital image. The handheld pointing device 10 fixes the display position of the cursor 23 at the first pointing coordinate according to the first pointing coordinate, so as to enter the lock point positioning mode.
In step S1303, the image extracting unit 11 then extracts a second digital image of the reference point 21 at a second time, so as to calculate a second pointing coordinate of the handheld pointing device 10 pointing to the display device 20 according to the imaging position of the reference point 21 in the second digital image, and to move the handheld pointing device 10 away from the lock point positioning mode. The second time is after the first time, that is, the second digital image is extracted later than the first digital image.
In step S1305, the processing unit 11 of the handheld pointing device 10 calculates a first movement vector between the first directional coordinate and the second directional coordinate according to the first directional coordinate and the second directional coordinate.
Next, in step S1307, the processing unit 11 generates a unit compensation vector according to the first motion vector calculation. The handheld pointing device 10 can calculate the unit compensation vector according to the preset correction times or correction time as described in the previous embodiments. For example, the handheld pointing device 10 may generate a unit compensation vector by dividing the first motion vector by the number of corrections or the correction time.
Next, in step S1309, the processing unit 12 performs the positioning calculation of the cursor 23 during the subsequent movement after the second time, taking the first pointing coordinate as a starting point and combining the unit compensation vector with the pointing coordinate displacement generated by the movement of the handheld pointing device 10. Specifically, the processing unit 12 starts the aforementioned pointing coordinate correction procedure at a third time, so that the handheld pointing device 10 calculates the position of the cursor 23 in the relative positioning mode. The relative positioning mode is implemented in the same manner as in the foregoing embodiments, so the description is not repeated here.
In step S1311, the processing unit 12 generates, according to the calculation result, cursor parameters for controlling the movement of the cursor 23. The processing unit 12 wirelessly transmits the cursor parameters to the display device 20 through the communication unit 13, so as to correspondingly control the display position of the cursor 23 on the screen of the display device 20 at the third time.
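The positioning calculation of steps S1309 and S1311 can be sketched as follows. This is a minimal illustration, assuming 2-tuple coordinates; the function name and values are hypothetical and not part of the patent.

```python
# Sketch of one third-time positioning calculation: the cursor position is
# the locked starting point, plus the pointing-coordinate displacement
# produced by the device's movement, plus one unit compensation vector.

def cursor_position(start, displacement, unit_compensation):
    return (start[0] + displacement[0] + unit_compensation[0],
            start[1] + displacement[1] + unit_compensation[1])

# Example: cursor locked at (100, 60); the device moved by (3, 1) between
# the second and third times; the unit compensation vector is (2.0, -1.0).
print(cursor_position((100, 60), (3, 1), (2.0, -1.0)))  # -> (105.0, 60.0)
```

Repeating this calculation over successive updates gradually absorbs the drift accumulated while the cursor was locked, instead of letting the cursor jump all at once.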
It should be noted that FIG. 13 is only used to describe a cursor positioning method of the handheld pointing device 10 and is not intended to limit the invention.
[Possible effects of the embodiment]
In summary, the embodiments of the present invention provide a cursor positioning method for a handheld pointing device that detects the imaging position of a reference point in a digital image to calculate the movement of the handheld pointing device and to determine whether the handheld pointing device enters the lock point positioning mode. By actively switching the cursor positioning mode to the absolute positioning mode or the relative positioning mode when the handheld pointing device leaves the lock point positioning mode, the method avoids cursor jumping.
In addition, when the reference point moves outside the operation range of the digital image and the cursor may therefore exceed the display boundary of the display device, the cursor positioning method can correct the absolute positioning coordinate of the reference point at the boundary.
Therefore, the cursor positioning method provided by the invention can correct the detected absolute positioning coordinate of the reference point according to the operation mode of the handheld pointing device while maintaining the directivity of the handheld pointing device, thereby improving the convenience and stability of user operation.
The above description is merely an example of the present invention and is not intended to limit the scope of the claims of the present invention.

Claims (12)

1. A method for positioning a cursor of a handheld pointing device, comprising:
generating a first pointing coordinate at a first time, entering a lock point positioning mode, and correspondingly fixing a display position of a cursor at the first pointing coordinate;
generating a second pointing coordinate at a second time so as to leave the lock point positioning mode, wherein the second time is after the first time;
calculating a first motion vector between the first pointing coordinate and the second pointing coordinate;
generating a unit compensation vector according to the first motion vector;
generating a third pointing coordinate at a third time, wherein the third time is after the second time;
calculating a second motion vector between the third pointing coordinate and the second pointing coordinate; and
calculating the display position of the cursor at the third time according to the first pointing coordinate, the unit compensation vector and the second motion vector.
2. A cursor positioning method according to claim 1, further comprising:
generating a fourth pointing coordinate at a fourth time, wherein the fourth time is after the third time;
calculating the display position of the cursor at the fourth time according to the fourth pointing coordinate; and
generating a cursor parameter corresponding to the display position of the cursor at the fourth time.
3. The method of claim 1, wherein the step of calculating the display position of the cursor at the third time according to the first pointing coordinate, the unit compensation vector and the second motion vector comprises:
starting a pointing coordinate correction procedure at the third time so that the handheld pointing device calculates the cursor position in a relative positioning mode, wherein in the relative positioning mode, when the handheld pointing device moves subsequently, the positioning calculation of the cursor at the third time is performed based on the first pointing coordinate as a starting point in combination with the unit compensation vector and the second motion vector.
4. A cursor positioning method according to claim 3, wherein the step of performing the relative positioning mode comprises:
step a, setting a correction count N, the unit compensation vector C, and a starting point coordinate, wherein C is the first motion vector divided by N and the starting point coordinate is the first pointing coordinate;
step b, calculating the second motion vector according to the third pointing coordinate;
step c, calculating the sum of the starting point coordinate, the second motion vector and C to obtain a relative pointing coordinate;
step d, generating and outputting a cursor parameter corresponding to the display position of the cursor at the third time according to the relative pointing coordinate;
step e, setting the starting point coordinate to the relative pointing coordinate and setting N to N-1;
step f, judging whether N is equal to zero; and
step g, if N is not equal to zero, returning to step b.
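Steps a through g above can be sketched as the following loop. This is a hedged illustration only: the coordinate values are invented, and `read_displacement` stands in for deriving the second motion vector from the newest pointing coordinate (step b); it is not an interface defined by the patent.

```python
def relative_positioning(first_coord, first_motion_vector, n, read_displacement):
    """Steps a-g: spread the first motion vector over N corrections
    while also tracking the device's own per-update displacement."""
    # Step a: correction count N, unit compensation vector C, starting point.
    c = (first_motion_vector[0] / n, first_motion_vector[1] / n)
    start = first_coord
    positions = []
    while n != 0:                       # steps f and g: loop until N == 0
        dx, dy = read_displacement()    # step b: second motion vector
        # Step c: relative pointing coordinate = start + displacement + C.
        rel = (start[0] + dx + c[0], start[1] + dy + c[1])
        positions.append(rel)           # step d: output the cursor parameter
        start = rel                     # step e: new starting point
        n -= 1                          # step e: N <- N - 1
    return positions

# Example: a drift of (8, -4) is spread over 4 updates while the device
# itself stays still, so the cursor glides to the second pointing coordinate.
out = relative_positioning((100, 60), (8, -4), 4, lambda: (0, 0))
print(out[-1])  # -> (108.0, 56.0)
```

Note how, with zero device displacement, the loop converges exactly on the second pointing coordinate after N steps, which is the stated purpose of the unit compensation vector.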
5. The cursor positioning method of claim 1, wherein the step of fixing the display position of the cursor at the first time at the first pointing coordinate comprises:
causing the handheld pointing device to fixedly output a cursor parameter that controls the display position of the cursor to be at the first pointing coordinate, so that the cursor is fixed at the first pointing coordinate at the first time.
6. The cursor positioning method of claim 1, wherein the step of fixing the display position of the cursor at the first time at the first pointing coordinate comprises:
not outputting a cursor parameter corresponding to the display position of the cursor, so that the cursor remains fixed at the first pointing coordinate at the first time.
7. A cursor positioning method according to claim 1, wherein after the step of generating the first pointing coordinate, the method further comprises:
determining whether the first pointing coordinate exceeds an operation range in a first digital image;
wherein the first pointing coordinate comprises a first axis coordinate and a second axis coordinate, the operation range corresponds to a display device at a preset display proportion, and the operation range is the area between a first axis upper limit value and a first axis lower limit value and between a second axis upper limit value and a second axis lower limit value, wherein the first axis upper limit value corresponds to a first display side edge of the display device, the first axis lower limit value corresponds to a second display side edge of the display device, the second axis upper limit value corresponds to a third display side edge of the display device, and the second axis lower limit value corresponds to a fourth display side edge of the display device.
8. The cursor positioning method of claim 7, wherein the step of determining whether the first pointing coordinate exceeds the operation range comprises:
when the first axis coordinate is larger than the first axis upper limit value, the handheld pointing device outputs a cursor parameter for controlling the display position of the cursor so that the cursor is fixed at the first display side edge in the first axis moving direction;
when the first axis coordinate is smaller than the first axis lower limit value, the handheld pointing device outputs the cursor parameter so that the cursor is fixed at the second display side edge in the first axis moving direction;
when the second axis coordinate is larger than the second axis upper limit value, the handheld pointing device outputs the cursor parameter so that the cursor is fixed at the third display side edge in the second axis moving direction; and
when the second axis coordinate is smaller than the second axis lower limit value, the handheld pointing device outputs the cursor parameter so that the cursor is fixed at the fourth display side edge in the second axis moving direction.
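The four boundary cases above amount to clamping each axis coordinate to its limit values. The sketch below is a hedged illustration: the specific limit values (a 1920x1080 operation range) and function name are assumptions, not values given by the patent.

```python
def clamp_to_operation_range(coord, x_min, x_max, y_min, y_max):
    """Pin the cursor to the corresponding display side edge whenever an
    axis coordinate exceeds its upper or lower limit value."""
    x, y = coord
    x = x_max if x > x_max else (x_min if x < x_min else x)
    y = y_max if y > y_max else (y_min if y < y_min else y)
    return (x, y)

# Example: the reference point drifted past the right and bottom limits of
# an assumed 0..1920 x 0..1080 operation range.
print(clamp_to_operation_range((2000, -15), 0, 1920, 0, 1080))  # -> (1920, 0)
```

A coordinate already inside the operation range passes through unchanged, so the clamp only takes effect at the display boundaries.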
9. A method for positioning a cursor of a handheld pointing device, comprising:
generating a first pointing coordinate at a first time to enter a lock point positioning mode, and fixing a display position of a cursor at the first pointing coordinate;
generating a second pointing coordinate at a second time to leave the lock point positioning mode, wherein the second time is after the first time;
calculating a first motion vector between the first pointing coordinate and the second pointing coordinate;
generating a unit compensation vector according to the first motion vector; and
during subsequent movement after the second time, performing a positioning calculation of the cursor based on the first pointing coordinate as a starting point, according to the unit compensation vector and a pointing coordinate displacement generated by the movement of the handheld pointing device.
10. A cursor positioning method according to claim 9, wherein after the step of performing the positioning calculation of the cursor, the method further comprises:
correspondingly generating a cursor parameter for controlling the display position of the cursor according to the calculation result.
11. The cursor positioning method of claim 9, wherein the step of performing the positioning calculation of the cursor after the second time comprises:
starting a pointing coordinate correction procedure at a third time so that the handheld pointing device calculates the cursor position in a relative positioning mode, wherein in the relative positioning mode, when the handheld pointing device moves subsequently after the second time, the positioning calculation of the cursor at the third time is performed based on the first pointing coordinate as a starting point together with the pointing coordinate displacement generated by the movement of the handheld pointing device and the unit compensation vector, wherein the pointing coordinate displacement is the displacement between two pointing coordinates calculated by the handheld pointing device at consecutive times.
12. A cursor positioning method according to claim 11, wherein the step of performing the relative positioning mode comprises:
step a, setting a correction count N, the unit compensation vector C, and a starting point coordinate, wherein C is the first motion vector divided by N and the starting point coordinate is the first pointing coordinate;
step b, calculating the pointing coordinate displacement of the handheld pointing device;
step c, calculating the sum of the starting point coordinate, the pointing coordinate displacement of the handheld pointing device and C to obtain a relative pointing coordinate;
step d, generating and outputting, according to the relative pointing coordinate, a cursor parameter that correspondingly controls the position of the cursor on the display device;
step e, setting the starting point coordinate to the relative pointing coordinate and setting N to N-1;
step f, judging whether N is equal to zero; and
step g, if N is not equal to zero, returning to step b.
CN201710954368.5A 2013-07-22 2013-07-22 Cursor positioning method of handheld pointing device Active CN107728881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710954368.5A CN107728881B (en) 2013-07-22 2013-07-22 Cursor positioning method of handheld pointing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710954368.5A CN107728881B (en) 2013-07-22 2013-07-22 Cursor positioning method of handheld pointing device
CN201310309281.4A CN104331212B (en) 2013-07-22 2013-07-22 The cursor positioning method of hand-held indicator device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201310309281.4A Division CN104331212B (en) 2013-07-22 2013-07-22 The cursor positioning method of hand-held indicator device

Publications (2)

Publication Number Publication Date
CN107728881A CN107728881A (en) 2018-02-23
CN107728881B true CN107728881B (en) 2020-08-04

Family

ID=52405949

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201710954368.5A Active CN107728881B (en) 2013-07-22 2013-07-22 Cursor positioning method of handheld pointing device
CN201310309281.4A Active CN104331212B (en) 2013-07-22 2013-07-22 The cursor positioning method of hand-held indicator device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201310309281.4A Active CN104331212B (en) 2013-07-22 2013-07-22 The cursor positioning method of hand-held indicator device

Country Status (1)

Country Link
CN (2) CN107728881B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104768045B (en) * 2015-03-26 2018-01-16 百视通网络电视技术发展有限责任公司 Cursor control method and cursor control system based on television applications
CN107463320B (en) * 2017-08-25 2021-06-15 上海联影医疗科技股份有限公司 System correction method and device
CN110489027B (en) * 2019-07-05 2021-07-23 深圳市格上格创新科技有限公司 Handheld input device and display position control method and device of indication icon of handheld input device
CN112333489A (en) * 2020-08-20 2021-02-05 深圳Tcl新技术有限公司 Laser cursor monitoring method and device and storage medium
CN112631495A (en) * 2020-12-25 2021-04-09 深圳Tcl新技术有限公司 Interaction method based on laser induction, terminal equipment and computer readable storage medium
CN114760513A (en) * 2022-04-22 2022-07-15 海信电子科技(武汉)有限公司 Display device and cursor positioning method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101226448A (en) * 2008-01-21 2008-07-23 杜炎淦 Raster display screen sursor locating method and cursor telecontrol indicator
CN101452349A (en) * 2007-11-30 2009-06-10 原相科技股份有限公司 Cursor controller on image display apparatus, method and image system
CN201859402U (en) * 2010-10-18 2011-06-08 康佳集团股份有限公司 Mouse device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7796116B2 (en) * 2005-01-12 2010-09-14 Thinkoptics, Inc. Electronic equipment for handheld vision based absolute pointing system
US8913003B2 (en) * 2006-07-17 2014-12-16 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer using a projection marker system
CN101482782A (en) * 2009-02-06 2009-07-15 袁鸿军 Cursor positioning system and method
CN201716689U (en) * 2010-03-23 2011-01-19 精元电脑股份有限公司 Computer input device capable of setting macros


Also Published As

Publication number Publication date
CN104331212B (en) 2017-11-10
CN107728881A (en) 2018-02-23
CN104331212A (en) 2015-02-04

Similar Documents

Publication Publication Date Title
CN107728881B (en) Cursor positioning method of handheld pointing device
TWI489324B (en) Pointer positioning method of pointer device
US20210096651A1 (en) Vehicle systems and methods for interaction detection
US9348418B2 (en) Gesture recognizing and controlling method and device thereof
JP5802667B2 (en) Gesture input device and gesture input method
CN103379274B (en) Camera apparatus and its control method
KR102114377B1 (en) Method for previewing images captured by electronic device and the electronic device therefor
TWI543018B (en) An input device, an input method, and storage medium
JP6232730B2 (en) Projector and control method
JP5581817B2 (en) Control system, control device, handheld device, control method and program.
JP5258399B2 (en) Image projection apparatus and control method thereof
US20140043535A1 (en) Display apparatus, information processing system and recording medium
CN111417989B (en) Program, information processing method, information processing system, head-mounted display device, and information processing device
US7884802B2 (en) Information selecting apparatus and storage medium storing information selecting program
US20190318169A1 (en) Method for Generating Video Thumbnail on Electronic Device, and Electronic Device
JP5988549B2 (en) POSITION CALCULATION SYSTEM, POSITION CALCULATION DEVICE, POSITION CALCULATION PROGRAM, AND POSITION CALCULATION METHOD
US9229614B2 (en) Storage medium storing information processing program, information processing device, information processing system, and method for calculating specified position
US20150145957A1 (en) Three dimensional scanner and three dimensional scanning method thereof
JP5873515B2 (en) GAME PROGRAM AND INFORMATION PROCESSING DEVICE
KR101365083B1 (en) Interface device using motion recognition and control method thereof
US20140125634A1 (en) Information processing apparatus, information processing system, information processing method, program and information recording medium
JP6510134B1 (en) Estimation system and estimation apparatus
CN110780743A (en) VR (virtual reality) interaction method and VR equipment
TWI607343B (en) Information technology device input systems and associated methods
TWI522848B (en) Pointer device and pointer positioning method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant