US9804689B2 - Handheld pointer device and pointer positioning method thereof - Google Patents

Handheld pointer device and pointer positioning method thereof

Info

Publication number
US9804689B2
US9804689B2
Authority
US
United States
Prior art keywords
tilt angle
cursor
pointer device
reference point
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/536,769
Other versions
US20150054745A1 (en)
Inventor
Han-Ping CHENG
Chao-Chien Huang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/771,072 (US20130328772A1)
Priority claimed from TW102144801A (TWI522848B)
Priority claimed from US14/273,523 (US10067576B2)
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to US14/536,769 (US9804689B2)
Assigned to PIXART IMAGING INC. Assignors: CHENG, HAN-PING; HUANG, CHAO-CHIEN (assignment of assignors' interest; see document for details)
Publication of US20150054745A1
Priority to US15/687,525 (US10379627B2)
Application granted
Publication of US9804689B2
Legal status: Active
Adjusted expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices displaced or positioned by the user with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F 3/0383: Signal control means within the pointing device

Definitions

  • the present disclosure relates to a positioning method of a pointer device, and in particular to a pointer positioning method of a handheld pointer device.
  • a handheld pointer device is operable to compute its pointing coordinates by analyzing the image positions of at least one reference light source formed in captured images, and to transmit the computed pointing coordinates to a video game console for assisting the gaming process executed on the video game console.
  • handheld pointer devices have been widely used in many types of interactive gaming systems such as light gun games, baseball games, tennis games, and the like.
  • a handheld pointer device is typically equipped with at least one tilt sensing device for constantly detecting the instant rotation angle of the handheld pointer device and correspondingly updating the tilt angle used in calculation of pointing coordinates.
  • the relative movement of the handheld pointer device with respect to the position of the reference light source thus can be accurately computed and determined, thereby avoiding erroneous position determination of the reference point.
  • when the handheld pointer device updates the tilt angle presently used in the computation of the pointing coordinate, the handheld pointer device instantly computes the pointing coordinate using the newly updated tilt angle and the image position of the reference light source in the sensing area of the image sensor, and controls the movement of the cursor accordingly.
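The computation described above can be sketched in code. The following Python sketch is illustrative only and not taken from the patent: the sensor resolution, the screen resolution, the rotation convention, and the function name are all assumptions. The point it demonstrates is that the same image position yields different pointing coordinates under different tilt angles, which is why an abrupt tilt update can make the cursor jump.

```python
import math

def pointing_coordinate(ref_x, ref_y, tilt_deg,
                        sensor_w=128, sensor_h=96,
                        screen_w=1920, screen_h=1080):
    """Map the reference point's image position to a pointing coordinate,
    counter-rotating by the device's tilt angle (hypothetical sketch)."""
    # Offset of the reference point's image from the sensor centre.
    dx = ref_x - sensor_w / 2
    dy = ref_y - sensor_h / 2
    # Counter-rotate by the tilt angle so the frame is upright again.
    t = math.radians(tilt_deg)
    rx = dx * math.cos(t) - dy * math.sin(t)
    ry = dx * math.sin(t) + dy * math.cos(t)
    # Map sensor offsets back to screen coordinates; the pointer moves
    # opposite to the image of the reference point.
    px = screen_w / 2 - rx * (screen_w / sensor_w)
    py = screen_h / 2 - ry * (screen_h / sensor_h)
    return px, py
```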
  • as a result, the cursor displayed on the display apparatus may suddenly jump from one place to another, which reduces the user's operability and increases operational inconvenience.
  • an exemplary embodiment of the present disclosure provides a pointer positioning method for a handheld pointer device. The pointer positioning method causes the handheld pointer device to automatically compensate and correct the computed pointing coordinates based on the displacement generated when the handheld pointer device updates its tilt angle, thereby effectively preventing the cursor from suddenly jumping from one place to another and improving the user's operation of the handheld pointer device.
  • An exemplary embodiment of the present disclosure provides a pointer positioning method of a handheld pointer device, and the pointer positioning method includes the following steps.
  • the handheld pointer device updates a first tilt angle presently used to a second tilt angle and captures a first frame containing a reference point.
  • a first pointing coordinate is computed according to the image position of the reference point formed in the first frame and the first tilt angle.
  • a second pointing coordinate is computed according to the image position of the reference point formed in the first frame and the second tilt angle.
  • a second frame containing the reference point is captured.
  • a third pointing coordinate is then computed according to the image position of the reference point formed in the second frame and the second tilt angle.
  • a cursor position is then computed according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate, so as to correspondingly generate a cursor parameter controlling a display position of a cursor on a display apparatus.
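A minimal sketch of the compensation step in the bullets above, assuming the correction simply subtracts the jump vector between the first and second pointing coordinates; the function name and the exact correction rule are assumptions, not the patent's claimed formula:

```python
def compensated_cursor(p1, p2, p3):
    """Remove the jump introduced by the tilt-angle update.

    p1: pointing coordinate from the first frame, old tilt angle.
    p2: pointing coordinate from the same frame, new tilt angle.
    p3: pointing coordinate from the second frame, new tilt angle.
    """
    # The displacement caused purely by the tilt update, not by motion.
    jump = (p2[0] - p1[0], p2[1] - p1[1])
    # Subtract it so the cursor stays continuous with its old path.
    return (p3[0] - jump[0], p3[1] - jump[1])
```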
  • An exemplary embodiment of the present disclosure provides a pointer positioning method of a handheld pointer device, and the pointer positioning method includes the following steps.
  • the handheld pointer device updates a first tilt angle presently used to a second tilt angle.
  • An angle difference between the first tilt angle and the second tilt angle is subsequently computed.
  • a first pointing coordinate is computed according to the image position of the reference point formed in the first frame and the first tilt angle.
  • a second pointing coordinate is computed according to the image position of the reference point formed in the first frame and the second tilt angle.
  • the handheld pointer device is driven to compute a cursor position of a cursor during the subsequent movement of the handheld pointer device on the basis of a first displacement vector computed between the first and the second pointing coordinates, along with the pointing coordinates generated in accordance with the movement of the handheld pointer device.
  • the handheld pointer device further generates a cursor parameter and correspondingly controls a display position of the cursor on a display apparatus.
  • An exemplary embodiment of the present disclosure provides a pointer positioning method of a handheld pointer device, and the pointer positioning method includes the following steps.
  • the handheld pointer device is driven to update a first tilt angle presently used to a second tilt angle.
  • the handheld pointer device computes a first pointing coordinate and a second pointing coordinate according to the image position of the reference point in a first frame, using the first tilt angle and the second tilt angle, respectively.
  • the handheld pointer device is driven to compute a third pointing coordinate according to the image position of the reference point formed in a second frame captured and the second tilt angle.
  • the second frame is captured during a second time interval, which occurs after the first time interval in which the first frame is captured.
  • the cursor position of a cursor is then computed according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate to generate a cursor parameter for controlling a display position of the cursor on a display apparatus, accordingly.
  • the handheld pointer device includes an image capturing unit, an accelerometer unit, and a processing unit.
  • the image capturing unit is configured to operatively capture a plurality of images corresponding to the position of a reference point and sequentially generate a plurality of frames.
  • the accelerometer unit is configured to operatively detect a plurality of accelerations of the handheld pointer device over multiple axes for generating an acceleration vector.
  • the processing unit is coupled to the image capturing unit and the accelerometer unit.
  • the processing unit is configured to operatively compute a cursor position of a cursor according to the image positions of the reference points in the frames and a first tilt angle.
  • When the processing unit updates the first tilt angle presently used in the cursor position computation to a second tilt angle according to the plurality of accelerations detected, the processing unit operatively drives the image capturing unit to capture a first frame containing the reference point. The processing unit then respectively computes a first pointing coordinate and a second pointing coordinate using the first and the second tilt angles in coordination with the image position of the reference point formed in the first frame. The processing unit thereafter drives the image capturing unit to capture a second frame containing the reference point. Afterward, the processing unit computes the cursor position according to the image position of the reference point in the second frame, the first pointing coordinate, and the second pointing coordinate, to correspondingly generate a cursor parameter for controlling a display position of the cursor on a display apparatus.
  • An exemplary embodiment of the present disclosure provides a non-transitory computer-readable media, for storing a computer executable program for the aforementioned pointer positioning method.
  • the processor executes the aforementioned pointer positioning method.
  • exemplary embodiments provide a handheld pointer device and a pointer positioning method thereof.
  • the handheld pointer device and the pointer positioning method thereof are adapted for controlling the operation of a cursor displayed on a display apparatus.
  • the pointer positioning method operatively calibrates and corrects pointing coordinates during the computation of the cursor position after the handheld pointer device has updated its tilt angle, in such a way that the display position of the cursor is gradually adjusted toward the position at which the handheld pointer device actually points within a preset calibration time or a preset number of calibrations. Accordingly, the issue of the cursor suddenly jumping from one place to another after the tilt angle has been updated can be effectively avoided, thereby enhancing the stability of the handheld pointer device while increasing operational convenience for the user.
  • FIG. 1 is a diagram illustrating the operation of a handheld pointer device in an interactive system provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 2 is a block diagram of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 3 is a flowchart diagram illustrating a pointer positioning method of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
  • FIGS. 4A and 4B are diagrams respectively illustrating image positions of the reference point detected as the handheld pointer device moves provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 4C is a diagram illustrating the image positions of the reference point computed using different tilt angles provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 4D is a diagram illustrating the image position of the reference point detected as the handheld pointer device moves and the corresponding movement of the cursor displayed on a display apparatus provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 5 is a diagram illustrating the movement of the cursor displayed on a display apparatus provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 6 is a flowchart diagram illustrating a method for calibrating the cursor position after the update of tilt angle provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 7 is a diagram illustrating the movement of the cursor displayed on a display apparatus along with the movement of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 8 is a flowchart diagram illustrating a pointer positioning method of a handheld pointer device provided in accordance to another exemplary embodiment of the present disclosure.
  • FIG. 9 is a flowchart diagram illustrating a pointer positioning method of a handheld pointer device provided in accordance to another exemplary embodiment of the present disclosure.
  • a handheld pointer device of the present disclosure can be adapted for positioning a pointer (such as a cursor) on a display apparatus.
  • FIG. 1 shows a diagram illustrating the operation of a handheld pointer device in an interactive system provided in accordance to an exemplary embodiment of the present disclosure.
  • An interactive system of the instant embodiment includes a handheld pointer device 10 and a display apparatus 20 .
  • the display apparatus 20 is equipped with at least one reference point 21 , which is provided to the handheld pointer device 10 to use as reference for controlling the movement of a cursor 23 displayed on the display apparatus 20 .
  • the display apparatus 20 is configured to have the necessary software and hardware architectures for executing data and displaying software application.
  • the display apparatus 20 includes but not limited to a projection display, a game console display, a television, or a monitor of a computer system.
  • the interactive system can further include a host computer (not shown) such as video game console or a computer.
  • the host computer can be configured to operatively process the program codes associated with a software application (e.g., video games such as light gun games, baseball games, tennis games and the like) and execute the software application.
  • the host computer further can be configured to display the execution progress of the software application on the display apparatus 20 for the user to view and perform the correspondingly control operations.
  • the reference point 21 is placed near the display apparatus 20 and is provided to the handheld pointer device 10 for determining its pointing position, i.e., for determining the moving direction and the displacement of the handheld pointer device 10 relative to the reference point 21 .
  • the reference point 21 can be implemented by a plurality of light emitting diodes with specific wavelengths, such as infrared light emitting diodes (IR LED), laser diodes, or ultraviolet light emitting diodes, arranged in a regular or irregular shape. Moreover, the light emitting diodes may be configured to electrically connect to the display apparatus 20 or may be powered by an independent power source for lighting. It shall be noted that the number of the reference point is not limited to one as used in the instant embodiment. Those skilled in the art should be able to configure the exact number of the reference point 21 required to be one, two, or more than two according to the practical design and/or operational requirements. In other words, FIG. 1 is merely used to illustrate an operation of the handheld pointer device 10 , and the instant disclosure is not limited thereto.
  • the handheld pointer device 10 operatively drives an image capturing unit 11 installed thereon to capture images of the reference point 21 as the handheld pointer device 10 points toward the position of the reference point 21 and sequentially generates a plurality of frames containing the image of the reference point 21 .
  • the handheld pointer device 10 operatively computes a pointing coordinate generated as the handheld pointer device 10 points toward the display apparatus 20 according to an image position of the reference point 21 formed in one of the frames captured and the tilt angle presently used in the pointing coordinate calculation.
  • the handheld pointer device 10 computes the cursor position of the cursor 23 on the display apparatus 20 according to the pointing coordinate computed.
  • the handheld pointer device 10 further wirelessly transmits a cursor parameter generated based on the relative movement of the reference point 21 for controlling the display position of the cursor 23 to the display apparatus 20 .
  • the handheld pointer device 10 thus controls the movement of the cursor 23 displayed on the display apparatus 20 .
  • the handheld pointer device 10 may further determine whether to update a first tilt angle (i.e., the current rotation angle of the handheld pointer device 10 ) being presently used to a second tilt angle based on the movement of image positions of the reference point 21 in frames captured.
  • the handheld pointer device 10 may first determine whether the handheld pointer device 10 is in motion or at rest by determining whether or not the image position of the reference point 21 formed in consecutive frames has substantially moved. The handheld pointer device 10 subsequently determines whether to update the tilt angle presently used in computing the pointing coordinate according to the determination result.
  • the handheld pointer device 10 may also determine whether the handheld pointer device 10 is in motion or at rest by determining whether pointing coordinates computed based on the image positions of the reference point 21 formed in frames captured using the first tilt angle has substantially moved. The handheld pointer device 10 determines whether to update the tilt angle presently used in computing pointing coordinate according to the determination result thereafter.
  • the phrase "the reference point 21 has substantially moved" herein indicates that the reference point 21 has moved over a short period of time (e.g., a second, a millisecond, two adjacent frames, or multiple consecutive frames). Whether the reference point 21 has substantially moved can be determined by the positional displacement, the velocity, or the acceleration of the image position of the reference point 21 formed in consecutive captured frames, or by the displacement, the velocity, or the acceleration of the pointing coordinates computed based on the consecutive captured frames.
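As a hedged illustration, the displacement-based variant of this test might look like the following Python sketch; the pixel threshold and the function name are invented for illustration and are not values from the patent:

```python
def has_substantially_moved(positions, threshold=2.0):
    """Return True if the reference point's image position shifted more
    than `threshold` pixels between any two consecutive frames."""
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        # Euclidean displacement between consecutive image positions.
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > threshold:
            return True
    return False
```

Velocity- or acceleration-based variants would difference the positions once or twice over the frame interval before applying a threshold.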
  • the handheld pointer device 10 may use an inertial sensor to sense and compute the instant tilt angle of the handheld pointer device 10 .
  • the force exerted by the user onto the handheld pointer device 10 while the user operates the handheld pointer device 10 might affect the gravitational direction determination result detected by the inertial sensor.
  • the impact of the user on the handheld pointer device 10 while the user operates the handheld pointer device 10 must be removed or eliminated in order to accurately compute and update the tilt angle of the handheld pointer device 10 .
  • when the handheld pointer device 10 is determined to be at rest, the handheld pointer device 10 can be regarded as unaffected by the external force exerted thereon.
  • the handheld pointer device 10 thus can accurately sense and compute the instant rotation angle of the handheld pointer device 10 , and update the first tilt angle presently used by the handheld pointer device 10 to the second tilt angle.
  • When the handheld pointer device 10 determines to update the first tilt angle presently used to the second tilt angle, the handheld pointer device 10 operatively captures a first frame containing the reference point 21 . The handheld pointer device 10 computes a first pointing coordinate relative to the display apparatus 20 according to the image position of the reference point 21 formed in the first frame and the first tilt angle. The handheld pointer device 10 further computes the cursor position according to the first pointing coordinate to correspondingly generate the cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20 .
  • the handheld pointer device 10 then computes a second pointing coordinate relative to the display apparatus 20 according to the image position of the reference point 21 formed in the first frame and the second tilt angle. Thereafter, during the computation of a third pointing coordinate, the handheld pointer device 10 operatively determines whether or not to perform a cursor position calibration process to calibrate and correct the third pointing coordinate according to the displacement vector between the first and the second pointing coordinates (i.e., the displacement of the cursor 23 ).
  • when the handheld pointer device 10 determines that the displacement vector between the first and the second pointing coordinates is greater than or equal to a first predetermined threshold, the handheld pointer device 10 operatively corrects the third pointing coordinate to compensate for the offset generated as the handheld pointer device 10 updates its tilt angle. More specifically, the handheld pointer device 10 computes a cursor position according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate, so as to correspondingly generate the cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20 .
  • when the handheld pointer device 10 determines that the displacement vector between the first and the second pointing coordinates is less than the first predetermined threshold, the handheld pointer device 10 directly computes the cursor position according to the third pointing coordinate without applying any compensation, and correspondingly generates the cursor parameter to control the display position of the cursor 23 on the display apparatus 20 .
  • the handheld pointer device 10 can also determine whether to calibrate and correct the pointing coordinate computed using the updated tilt angle (i.e. the second tilt angle) according to the angle difference between the first tilt angle and the second tilt angle.
  • the handheld pointer device 10 calibrates and corrects the pointing coordinate (i.e., the third pointing coordinate) computed in the subsequent cursor position computation using the updated tilt angle when the angle difference between the first tilt angle presently used and the second tilt angle is computed to be larger than a preset angle.
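The two decision rules described above (displacement vector against the first predetermined threshold, and angle difference against the preset angle) can be combined in a small sketch. The threshold values below are illustrative placeholders, not values from the patent:

```python
def needs_calibration(p1, p2, angle1, angle2,
                      disp_threshold=5.0, angle_threshold=3.0):
    """Decide whether to run the cursor position calibration process
    after a tilt-angle update (thresholds are assumptions)."""
    # Displacement between the coordinates computed with the old and
    # new tilt angles from the same frame.
    disp = ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5
    # Calibrate when either the jump or the angle change is large.
    return disp >= disp_threshold or abs(angle2 - angle1) > angle_threshold
```

Smaller thresholds trigger calibration after almost every tilt update (higher directivity); larger thresholds skip calibration more often (lower computational cost), matching the trade-off described further below.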
  • When the handheld pointer device 10 has determined to compensate the pointing coordinates computed after updating the first tilt angle to the second tilt angle, the handheld pointer device 10 completes a cursor position calibration within a preset calibration time or a preset number of calibrations, causing the cursor 23 to smoothly move from the movement path that corresponds to the first tilt angle to the movement path that corresponds to the second tilt angle.
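One way to realize this gradual move within a preset number of calibrations is to shrink the residual jump linearly over successive cursor updates. This is a sketch of that idea only, not the patent's specified schedule; the step count and function name are assumptions:

```python
def calibration_offsets(jump, steps=8):
    """Offsets to subtract on each of the next `steps` cursor updates.

    The residual offset decays linearly from (almost) the full jump to
    zero, so the cursor glides to the new movement path instead of
    snapping. The step count stands in for the preset calibration time
    or preset number of calibrations."""
    return [(jump[0] * (1 - k / steps), jump[1] * (1 - k / steps))
            for k in range(1, steps + 1)]
```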
  • the handheld pointer device 10 thus can accurately compute the relative moving information of the handheld pointer device 10 with respect to the display apparatus 20 and precisely control the movement of the cursor 23 on the display apparatus 20 , while preventing the user's operation of the handheld pointer device 10 from being affected by the tilt angle update.
  • the handheld pointer device 10 is operable to determine whether to calibrate and correct the pointing coordinate computed using the updated tilt angle after each tilt angle update (i.e., configure the first predetermined threshold and the preset angle) as well as the associated calibration and compensation method (i.e., the amount of compensation in each calibration and the preset calibration time) according to the type of the software program executed by the display apparatus 20 as well as the resolution of the display apparatus 20 .
  • the handheld pointer device 10 can pre-store multiple sets of calibration parameters associated with different resolutions of the display apparatus 20 and types of software application.
  • the value of the first predetermined threshold or the value of the preset angle should be configured to be relatively small, such that the handheld pointer device 10 executes a calibration program and calibrates the pointing coordinate after each tilt angle update, so as to increase the directivity of the handheld pointer device 10 .
  • conversely, the value of the first predetermined threshold or the value of the preset angle should be configured to be relatively large, such that the handheld pointer device 10 does not have to calibrate the pointing coordinate after each tilt angle update, or does not have to execute the calibration program of the pointing coordinate after each tilt angle adjustment, thereby reducing the number of calibrations needed and, in turn, the computational complexity of the pointing coordinates.
  • the handheld pointer device 10 can automatically link with the display apparatus 20 at start up and access the type of the software application currently executed on the display apparatus 20 . Then, the handheld pointer device 10 operatively determines whether to calibrate and correct pointing coordinates computed by the handheld pointer device 10 after tilt angle update based on the type of the software program currently executed on the display apparatus 20 , and selects the appropriate calibration parameters. Accordingly, the applicability and operation convenience of the handheld pointer device 10 can be enhanced.
  • FIG. 2 shows a block diagram of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
  • the handheld pointer device 10 includes an image capturing unit 11 , an accelerometer unit 12 , a processing unit 13 , an input unit 14 , a memory unit 15 , and a communication unit 16 .
  • the image capturing unit 11 , the accelerometer unit 12 , the input unit 14 , the memory unit 15 , and the communication unit 16 are coupled to the processing unit 13 , respectively.
  • the accelerometer unit 12 can be integrated with the image capturing unit 11 .
  • the accelerometer unit 12 is electrically connected to the processing unit 13 through the image capturing unit 11 .
  • at least one of the image capturing unit 11 , the accelerometer unit 12 , the input unit 14 , the memory unit 15 , and the communication unit 16 may be configured to electrically connect to the processing unit 13 in series with another of these components.
  • the image capturing unit 11 is configured to operatively capture images containing the reference point 21 as the handheld pointer device 10 pointing toward the reference point 21 and sequentially generate a plurality of frames. Specifically, the image capturing unit 11 can be configured to operatively detect the light emitted from the reference point 21 according to a frame capturing rate (for example, 200 frames per second), and sequentially generates a plurality of frames containing the image of the reference point 21 .
  • An optical filter (not shown) can be used for filtering out light spectrum outside the specific light spectrum generated by the reference point 21 such that the image capturing unit 11 only detects the light having wavelength within the specific light spectrum generated by the reference point 21 .
  • the image capturing unit 11 can be implemented by a charge-coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor.
  • the accelerometer unit 12 is configured to detect a plurality of accelerations of the handheld pointer device 10 over multiple axes (e.g., X-axis, Y-axis, and Z-axis) of a space, and generate an acceleration vector, accordingly.
  • the accelerometer unit 12 in the instant embodiment includes but is not limited to a G-sensor or an accelerometer, and the accelerometer unit 12 can be built into the handheld pointer device 10 .
  • the accelerometer unit 12 may be implemented by an external device connected to the handheld pointer device 10 .
  • Those skilled in the art should be able to implement the accelerometer unit 12 according to the practical operation and/or design requirements, and the present disclosure is not limited to the example provided herein.
  • the processing unit 13 is configured to receive frames outputted by the image capturing unit 11 and compute an image position of the reference point 21 formed in one of the frames according to the respective frame among the frames captured.
  • the processing unit 13 operatively computes the pointing coordinate of the handheld pointer device 10 with respect to the position of the reference point 21 using the first tilt angle.
  • the processing unit 13 further computes the cursor position based on the pointing coordinate computed, so as to correspondingly generate the cursor parameter controlling the movement (i.e., the display position) of the cursor.
  • the processing unit 13 drives the communication unit 16 and wirelessly transmits the cursor parameter to the display apparatus 20 to correspondingly control the movement of the cursor 23 displayed on the display apparatus 20 in coordination with the execution of the software program on the display apparatus 20 .
  • the processing unit 13 can operatively determine whether the reference point 21 has moved based on frames captured, i.e., whether the image position of the reference point 21 has substantially moved.
  • when the processing unit 13 determines that the reference point 21 has not substantially moved, the processing unit 13 instantly reads the accelerations of the handheld pointer device 10 over multiple axes detected by the accelerometer unit 12 .
  • the processing unit 13 computes and updates the first tilt angle presently used to the second tilt angle according to the accelerations of the handheld pointer device 10 detected.
  • the processing unit 13 uses the second tilt angle updated and the image position of the reference point 21 formed in one of the frames to compute the pointing coordinate of the handheld pointer device 10 relative to the display apparatus 20 .
  • the processing unit 13 can compute the instant tilt angle of the handheld pointer device 10 using the accelerations of the handheld pointer device 10 over X-axis, Y-axis, and Z-axis detected by the accelerometer unit 12 and the included angles computed between any two axes, and update the first tilt angle to the second tilt angle, accordingly.
  • when the processing unit 13 determines that the position of the reference point 21 has substantially moved, the processing unit 13 does not update the first tilt angle presently used, as the processing unit 13 operatively determines that the accelerometer unit 12 is currently unable to make accurate acceleration measurements associated with the handheld pointer device 10 .
  • the processing unit 13 continues to use the first tilt angle and the image position of the reference point 21 formed in one of the frames to compute the pointing coordinate of the handheld pointer device 10 relative to the display apparatus 20 .
  • the processing unit 13 generates the cursor parameter for controlling the display position of the cursor 23 according to the pointing coordinate computed.
  • the processing unit 13 then drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 .
  • the plurality of frames generated by the image capturing unit 11 are rectangular.
  • the long side of a frame is configured to be parallel to the X-axis, and the short side of the frame is configured to be parallel to the Y-axis.
  • when the processing unit 13 determines that the reference point 21 has not substantially moved, the processing unit 13 reads the accelerations Vx, Vy, and Vz of the handheld pointer device 10 over the X-axis, Y-axis, and Z-axis of the three dimensional space depicted in FIG. 1 detected by the accelerometer unit 12 .
  • the accelerometer unit 12 operatively generates an acceleration vector V according to the detection result, and generates an acceleration sensing signal, accordingly.
  • the acceleration sensing signal represents the ratio of any two accelerations, such as the ratio of the acceleration Vx to the acceleration Vy.
  • the processing unit 13 computes the instant tilt angle of the handheld pointer device 10 according to the acceleration sensing signal received.
  • the processing unit 13 can compute the acceleration vector V and the included angles between any two of axes by using the following Eqs. (1) to (3) and obtain the instant tilt angle of the handheld pointer device 10 ,
  • Vx represents the acceleration of the handheld pointer device 10 over the X-axis detected by the accelerometer unit 12
  • Vy represents the acceleration of the handheld pointer device 10 over the Y-axis detected by the accelerometer unit 12
  • represents the gravitational acceleration computed according to the acceleration Vx and the acceleration Vy.
  • the processing unit 13 subsequently corrects the orientation of the frames based on the computation result of Eq. (1) and Eq. (2) using Eq. (4), so that the coordinate system of the frame corrected is the same as the coordinate system of the display apparatus 20 ,
  • [x′, y′]ᵀ = [[cos(θ), −sin(θ)], [sin(θ), cos(θ)]] · [x, y]ᵀ, (4) wherein x represents the X-axis coordinate of the image position of the reference point 21 formed in one of the frames; y represents the Y-axis coordinate of the image position of the reference point 21 formed in one of the frames; x′ represents the X-axis coordinate of the image position of the reference point 21 formed in one of the frames after orientation correction; y′ represents the Y-axis coordinate of the image position of the reference point 21 formed in one of the frames after orientation correction; and θ represents the instant tilt angle computed from Eq. (1) and Eq. (2).
  • the processing unit 13 further computes the pointing coordinate of the handheld pointer device 10 relative to the reference point 21 or the display apparatus 20 according to X-axis coordinate x′ and Y-axis coordinate y′ obtained after orientation correction.
  • the processing unit 13 computes the cursor position according to the pointing coordinate computed, so as to generate the cursor parameter to correspondingly control the movement of the cursor 23 on the display apparatus 20 .
  • the processing unit 13 wirelessly transmits the cursor parameter or the relative movement information of the handheld pointer device 10 to the display apparatus 20 via the communication unit 16 to correspondingly control the display position of the cursor 23 on the display apparatus 20 .
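As a rough illustration of the tilt-angle and orientation-correction computations above, the following Python sketch estimates a roll angle from the X- and Y-axis accelerations and applies the rotation of Eq. (4). The use of `atan2` for Eqs. (1)-(3) and the function names are assumptions for illustration, not the patent's exact formulation:

```python
import math

def tilt_angle(vx, vy):
    """Estimate the roll (tilt) angle of the device from the X- and Y-axis
    accelerations while the device is at rest, so that gravity is the only
    acceleration measured (an assumed form of Eqs. (1)-(3))."""
    return math.atan2(vx, vy)  # angle of the gravity vector in the sensor's X-Y plane

def correct_orientation(x, y, theta):
    """Rotate an image coordinate (x, y) by theta per Eq. (4), aligning the
    frame's coordinate system with that of the display apparatus."""
    x_p = math.cos(theta) * x - math.sin(theta) * y
    y_p = math.sin(theta) * x + math.cos(theta) * y
    return x_p, y_p
```

For example, a point on the frame's X-axis rotated by 90 degrees lands on the Y-axis, which is the behavior the orientation correction relies on.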
  • the accelerometer unit 12 of the handheld pointer device 10 in the present disclosure can also be configured to only detect accelerations over two dimensions, such as the acceleration Vx and the acceleration Vy.
  • the above described acceleration determination method for the handheld pointer device 10 is only an implementation and does not limit the scope of the present disclosure.
  • computing the pointing coordinate of the handheld pointer device 10 relative to the display apparatus 20 according to the image position of one or more reference points formed in the frames captured is a known technique in the art and is not the focus of the present disclosure; thus, further descriptions are hereby omitted.
  • the input unit 14 is configured to enable a user of the handheld pointer device 10 to configure the frame capturing rate and the calibration parameters, which include but are not limited to the calibration time, the number of calibrations, and the amount of compensation in each calibration.
  • the user of the handheld pointer device 10 may set the frame capturing rate according to a preset calibration time and configure the number of calibrations according to the predetermined frame capturing rate.
  • the user may also determine and set the number of calibrations based on the frame capturing rate configured.
  • the frame capturing rate may be configured according to the frame refresh rate of the display apparatus 20 .
  • the input unit 14 is configured to cause the display apparatus 20 to display a configuration or setting interface provided for the user to configure the calibration time, the frame capturing rate and/or the number of calibrations for correcting the cursor position.
  • the input unit 14 may be implemented by a keypad interface, an optical finger navigation component, or a button and the present disclosure is not limited thereto.
  • the handheld pointer device 10 has a display screen (not shown), and the display screen can be configured to show the calibration time, the frame capturing rate, the number of calibrations for correcting the cursor position, and the amount of compensation applied in each calibration.
  • the display screen of the handheld pointer device 10 may be a touch screen.
  • the memory unit 15 can be configured to store operation parameters of the handheld pointer device 10 including but not limited to the first pointing coordinate, the second pointing coordinate, the third pointing coordinate, the first tilt angle, the second tilt angle, the first predetermined threshold, the preset angle, and the cursor parameter.
  • the memory unit 15 can be also configured to store the calibration time, the frame capturing rate and the number of calibrations for the cursor according to the operation of the handheld pointer device 10 .
  • the processing unit 13 in the instant embodiment can be implemented by a processing chip such as a microcontroller or an embedded controller programmed with the necessary firmware; however, the present disclosure is not limited to the example provided herein.
  • the memory unit 15 can be implemented by a volatile memory chip or a nonvolatile memory chip including but not limited to a flash memory chip, a read-only memory chip, or a random access memory chip.
  • the communication unit 16 can be configured to utilize Bluetooth technology and transmit the relative movement information to the display apparatus 20 , but the present disclosure is not limited thereto
  • the internal components of the handheld pointer device 10 may be added, removed, adjusted or replaced according to the functional requirements or design requirements and the present disclosure is not limited thereto. That is, the exact type, exact structure and/or implementation method associated with the image capturing unit 11 , the accelerometer unit 12 , the processing unit 13 , the input unit 14 , the memory unit 15 , and the communication unit 16 may depend upon the practical structure and the exact implementation method adopted for the handheld pointer device 10 and the present disclosure is not limited thereto.
  • FIG. 3 shows a flowchart diagram illustrating a pointer positioning method of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 4A-4B are diagrams respectively illustrating image positions of the reference point detected as the handheld pointer device moves provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 4C shows a diagram illustrating the image positions of the reference point computed using different tilt angles provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 4D shows a diagram illustrating the image position of the reference point detected as the handheld pointer device moves and the corresponding movement of the cursor displayed on a display apparatus provided in accordance to an exemplary embodiment of the present disclosure.
  • Step S301: when the processing unit 13 of the handheld pointer device 10 updates the first tilt angle θ1 presently used to the second tilt angle θ2, the processing unit 13 drives the image capturing unit 11 to capture and generate the first frame F1 containing the reference point 21 .
  • the processing unit 13 can operatively determine whether to update the first tilt angle θ1 presently used in the cursor position computation to the second tilt angle θ2 by determining whether the image position of the reference point 21 formed corresponding to the position of the reference point 21 in the multiple consecutive images captured by the image capturing unit 11 has substantially moved.
  • the processing unit 13 can operatively determine whether to update the first tilt angle θ1 presently used in cursor position computation to the second tilt angle θ2 according to frames containing the reference point 21 captured and generated by the image capturing unit 11 .
  • the processing unit 13 operatively updates the first tilt angle θ1 presently used in the cursor position computation to the second tilt angle θ2 upon determining that the position displacement of the image position of the reference point 21 formed in any two consecutive frames is less than a predefined displacement threshold (e.g., 1 pixel). In another embodiment, the processing unit 13 operatively updates the first tilt angle θ1 presently used in the cursor position computation to the second tilt angle θ2 upon determining that the velocity of the image position of the reference point 21 formed in any two consecutive frames is less than a predefined velocity threshold (e.g., 1 pixel per unit time).
  • the processing unit 13 may operatively update the first tilt angle θ1 presently used in cursor position computation to the second tilt angle θ2, upon determining that the magnitude of the acceleration vector of the handheld pointer device 10 is equal to the gravitational acceleration (g) of the handheld pointer device 10 , wherein the acceleration vector is generated based on accelerations of the handheld pointer device 10 over multiple axes detected.
  • the processing unit 13 operatively reads accelerations of the handheld pointer device 10 over multiple axes (e.g., X-axis, Y-axis, and Z-axis) detected by the accelerometer unit 12 and correspondingly updates the first tilt angle θ1 presently used to the second tilt angle θ2 computed, upon determining that the position of the reference point 21 sensed has not substantially moved (i.e., the handheld pointer device 10 is at rest).
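The update conditions described above can be sketched as follows. The displacement threshold and gravity tolerance values are illustrative assumptions (the disclosure gives 1 pixel as an example displacement threshold and compares the acceleration magnitude with gravity):

```python
import math

def should_update_tilt(prev_pos, cur_pos, accel, g=9.8,
                       disp_threshold=1.0, g_tolerance=0.05):
    """Decide whether the tilt angle may be refreshed from the accelerometer.

    The device is treated as at rest when the reference-point image has moved
    less than disp_threshold pixels between consecutive frames, or when the
    magnitude of the acceleration vector matches gravity (the tolerance is an
    assumption; the patent text compares for equality)."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    displacement = math.hypot(dx, dy)
    accel_mag = math.sqrt(sum(a * a for a in accel))
    at_rest = displacement < disp_threshold
    gravity_only = abs(accel_mag - g) < g_tolerance * g
    return at_rest or gravity_only
```

When the function returns False, the current tilt angle is kept, matching the behavior where the accelerometer reading is considered unreliable while the device is moving.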
  • Step S303: the processing unit 13 computes the first pointing coordinate based on the image position of the reference point 21 formed in the first frame F1 and the first tilt angle θ1.
  • the first pointing coordinate represents the pointing vector of the handheld pointer device 10 relative to the display apparatus 20 in the first frame F1.
  • the first pointing coordinate is represented by (x1, y1).
  • the processing unit 13 further computes the cursor position of the cursor 23 according to the first pointing coordinate , so as to correspondingly generate the cursor parameter controlling the display position of the cursor 23 on the display apparatus 20 . Subsequently, the processing unit 13 drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 for correspondingly controlling the display position of the cursor 23 on the display apparatus 20 .
  • the processing unit 13 first defines an operating area 111 on the first frame F1 that corresponds to the display apparatus 20 according to the center point “+” of the first frame F1 and the image position of the reference point image 113 formed in the first frame F1.
  • the operating area 111 corresponds to the screen of the display apparatus 20 and is scaled with a predetermined display ratio.
  • the processing unit 13 defines the operating area 111 in the first frame F1 by using the image position of the reference point image 113 as the origin and scaled with the predetermined display ratio.
  • the processing unit 13 further defines the center 1111 of the operating area 111 in the first frame F1.
  • the processing unit 13 may set the center 1111 of the operating area 111 as the origin, apply Eqs. (1)-(4) along with the first tilt angle θ1, and compute the pointing vector of the center point “+” of the first frame F1 in the operating area 111 to obtain the first pointing coordinate .
  • the processing unit 13 may also obtain the first pointing coordinate by computing the rotation angle of the handheld pointer device 10 .
  • the rotation angle of the handheld pointer device 10 is computed directly according to the relationship between the center point “+” of the first frame F1 and the image position of the reference point image 113 in the first frame F1 or the image feature of the reference point image 113 .
  • the center point “+” in the instant embodiment represents the center of the image sensing array of the image capturing unit 11 .
  • the first pointing coordinate represents the pointing vector of the center of the image sensing array of the image capturing unit 11 (i.e., the center point “+”) in the first frame F1 with respect to the coordinate system of the display apparatus 20 defined therein.
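A hypothetical sketch of mapping the sensor-array center “+” into the operating area defined around the reference point image. The function name and the division by a display-ratio pair are illustrative assumptions standing in for the patent's predetermined display ratio and Eqs. (1)-(4):

```python
def pointing_coordinate(frame_center, ref_image_pos, display_ratio=(4.0, 3.0)):
    """Sketch of computing a pointing coordinate: an operating area is defined
    around the reference-point image (used as the origin) and scaled with a
    predetermined display ratio; the pointing coordinate is the frame center
    '+' expressed relative to that area. The scaling factors are assumptions."""
    # vector from the reference point image to the sensor-array center "+"
    dx = frame_center[0] - ref_image_pos[0]
    dy = frame_center[1] - ref_image_pos[1]
    # normalize into the operating area's coordinate system
    return (dx / display_ratio[0], dy / display_ratio[1])
```

Pointing straight at the reference point yields the origin, and offsets of the frame center map proportionally into display coordinates.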
  • Step S305: the processing unit 13 computes the second pointing coordinate based on the image position of the reference point 21 formed in the first frame F1 and the second tilt angle θ2.
  • the second pointing coordinate represents the pointing vector computed by mapping the center of the image sensing array of the image capturing unit 11 (i.e., the center point “+”) onto the operating area 111 a which corresponds to the screen of the display apparatus 20 defined in the first frame F1.
  • the second pointing coordinate is represented by (x2, y2).
  • the processing unit 13 uses the center 1111 a of the operating area 111 a as the origin and correspondingly computes, using the second tilt angle θ2, the pointing vector of the center point “+” of the first frame F1 in the operating area 111 a , so as to obtain the second pointing coordinate .
  • the operating area 111 a is defined based on the position of the reference point image 113 a.
  • the processing unit 13 subsequently computes the first displacement vector associated with the pointing coordinate computed for the same frame after the tilt angle adjustment using the first pointing coordinate and the second pointing coordinate .
  • the processing unit 13 then stores the first displacement vector S1 in the memory unit 15 .
  • Step S307: the processing unit 13 drives the image capturing unit 11 to capture and generate the second frame F2 containing the reference point 21 .
  • the processing unit 13 computes the third pointing coordinate based on the image position of the reference point 21 formed in the second frame F2 and the second tilt angle θ2.
  • the second frame F2 is captured at a later time than the first frame F1.
  • the third pointing coordinate represents the pointing vector computed by mapping the center of the image sensing array of the image capturing unit 11 (i.e., the center point “+”) onto the operating area 111 b which corresponds to the screen of the display apparatus 20 defined in the second frame F2.
  • the third pointing coordinate is represented by (x3, y3).
  • the operating area 111 b is defined based on the reference point image 113 b.
  • Step S309: the processing unit 13 determines whether an angle difference θd between the first tilt angle θ1 and the second tilt angle θ2 is smaller than a preset angle (e.g., 20 degrees).
  • Step S311: the processing unit 13 determines whether a first displacement vector between the first pointing coordinate and the second pointing coordinate is less than a first predetermined threshold (e.g., 10 pixels).
  • the processing unit 13 executes Step S313; otherwise, the processing unit 13 executes Step S315.
  • the first predetermined threshold can be configured according to the preset angle, e.g., set to a pixel value that corresponds to an angle difference of 20 degrees.
  • Step S313: the processing unit 13 computes the cursor position directly according to the third pointing coordinate . That is to say, when the processing unit 13 determines that both the angle difference θd is smaller than the preset angle and the first displacement vector is less than the first predetermined threshold, the processing unit 13 does not compensate the third pointing coordinate ; instead, the processing unit 13 directly computes the cursor position according to the third pointing coordinate .
  • Step S315: the processing unit 13 computes the cursor position according to the first pointing coordinate , the second pointing coordinate , and the third pointing coordinate . Particularly, the processing unit 13 first computes a compensated third pointing coordinate according to the third pointing coordinate and the first displacement vector . Afterward, the processing unit 13 computes the cursor position according to the compensated third pointing coordinate , so as to compensate the offset between the first pointing coordinate and the second pointing coordinate .
  • Step S317: the processing unit 13 generates the cursor parameter for correspondingly controlling the movement of the cursor 23 based on the computation result from either Step S313 or Step S315.
  • the processing unit 13 subsequently drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 to correspondingly control the movement (i.e., the display position) of the cursor 23 on the display apparatus 20
  • the display apparatus 20 will correspondingly display the cursor 23 on the display area of the screen shown thereon according to a display aspect ratio configured for the display apparatus 20 upon receiving the cursor parameter.
  • the display apparatus 20 operatively computes the display position of the cursor 23 and correspondingly positions the cursor 23 on the screen shown by the display apparatus 20 according to the current display aspect ratio (i.e., the resolution of the display apparatus 20 ).
  • the reference point images 113 , 113 a , and 113 b in the instant disclosure are each represented by a circle; however, the reference point images 113 , 113 a , and 113 b may also be represented by a cross-shaped or a star-shaped symbol.
  • the present disclosure is not limited to the examples illustrated in FIG. 4A-FIG. 4C.
  • the image positions of the reference point images 113 , 113 a and 113 b formed in the frames can be configured to be the average coordinate between/among the reference point images identified.
  • the processing unit 13 further can compensate the computation of the image positions of the reference point images according to the preset image-forming parameters and the preset image-forming distance, so as to accurately determine the position of the reference point image.
  • the processing unit 13 should be able to know the configuration of the preset image-forming parameters and the image forming distance as well as apply compensation to the image positions of the reference point images 113 , 113 a , and 113 b in the frame computed using the preset image-forming parameters and the image forming distance, and further details are hereby omitted.
  • FIG. 5 shows a diagram illustrating the movement of the cursor displayed on the display apparatus 20 provided in accordance to an exemplary embodiment of the present disclosure.
  • the display position of the cursor 23 a corresponds to the pointing coordinate computed by the handheld pointer device 10 at time TA using the first tilt angle θ1.
  • the display position of the cursor 23 b corresponds to the pointing coordinate computed by the handheld pointer device 10 at time TB using the first tilt angle θ1.
  • the display position of the cursor 23 c corresponds to the pointing coordinate computed by the handheld pointer device 10 at time TC using the first tilt angle θ1.
  • the handheld pointer device 10 updates the first tilt angle θ1 to the second tilt angle θ2.
  • the handheld pointer device 10 at the same time computes the first pointing coordinate and the second pointing coordinate using the first tilt angle θ1 and the second tilt angle θ2, respectively, to obtain the first displacement vector .
  • the display position of the cursor 23 d corresponds to the compensated third pointing coordinate computed by the handheld pointer device 10 at time TD according to the first displacement vector and the second tilt angle θ2.
  • the display position of the cursor 25 a corresponds to the pointing coordinate directly computed by the handheld pointer device 10 at time TC using the second tilt angle θ2 without any compensation. That is, when no compensation is applied to the pointing coordinate computed after the handheld pointer device 10 updates the tilt angle used, the display position of the cursor will be at the position corresponding to the cursor 25 a . More specifically, as shown in FIG. 5 , the display position of cursor 23 c will suddenly jump to the display position of cursor 25 a when no compensation is applied to the pointing coordinate after the tilt angle adjustment, which degrades the user's operability.
  • the handheld pointer device 10 of the instant embodiment is operable to determine whether or not to compensate the pointing coordinates after updating the first tilt angle θ1 to the second tilt angle θ2 (e.g., whether the cursor jumping issue is noticeable to the user). Moreover, when the handheld pointer device 10 determines to compensate the pointing coordinates computed after the tilt angle update, the handheld pointer device 10 compensates the pointing coordinates according to the displacement generated after the handheld pointer device 10 updates the first tilt angle θ1 to the second tilt angle θ2.
  • the instant embodiment further provides a cursor position calibration algorithm.
  • the cursor position calibration algorithm can cause the controlled cursor to translate smoothly from the current moving path to the movement path that corresponds to the actual movement path of the handheld pointer device 10 after the tilt angle update, within a preset calibration time or a preset number of calibrations, such that the cursor is prevented from jumping from one place to another while the directivity of the handheld pointer device 10 is maintained.
  • FIG. 6 shows a flowchart diagram illustrating a method for calibrating the cursor position after the tilt angle update provided in accordance to an exemplary embodiment of the present disclosure.
  • FIG. 7 shows a diagram illustrating the movement of the cursor displayed on the display apparatus along with the movement of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
  • Step S601: when the processing unit 13 updates the first tilt angle θ1 to the second tilt angle θ2, the processing unit 13 initiates a cursor position calibration program and causes the handheld pointer device 10 to operate in a cursor calibration mode.
  • Step S603: the processing unit 13 sets the number of calibrations as N, a compensation vector as C, and a calibration coordinate pc.
  • the calibration coordinate herein is the pointing coordinate that requires compensation, such as the third pointing coordinate computed based on the image position of the reference point in the second frame F2 and the second tilt angle θ2.
  • the processing unit 13 stores N, C, and the calibration coordinate in the memory unit 15 .
  • the processing unit 13 determines whether the first displacement vector is greater than a second predetermined threshold. When the processing unit 13 determines that the first displacement vector is greater than the second predetermined threshold, the processing unit 13 operatively sets N equal to the first displacement vector divided by C wherein C is a predetermined compensation value. On the contrary, when the processing unit 13 determines that the first displacement vector is less than the second predetermined threshold, the processing unit 13 operatively sets C equal to the first displacement vector divided by N wherein N is a predetermined number of calibrations.
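The branching between Eq. (6) and Eq. (7) described above can be sketched as follows. The default compensation step and default calibration count are illustrative assumptions (the text's own example uses C = 3 and N = 5):

```python
def calibration_parameters(s1_mag, threshold, default_c=3.0, default_n=5):
    """Sketch of Step S603's branching: a first displacement vector larger
    than the threshold keeps the compensation step C constant and derives the
    number of calibrations N (Eq. (7), N = |S1| / C); a smaller displacement
    keeps N constant and derives C (Eq. (6), C = |S1| / N). Returns (N, C)."""
    if s1_mag > threshold:
        n = max(1, round(s1_mag / default_c))  # Eq. (7): larger C, smaller N
        return n, default_c
    c = s1_mag / default_n                     # Eq. (6): fixed number of steps
    return default_n, c
```

This reflects the trade-off stated in the text: a large angle difference favors a constant per-step compensation spread over more frames, while a small one finishes within a fixed number of calibrations.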
  • the first predetermined threshold and the second predetermined threshold may be configured to be the same or different, depending upon the practical operational requirements of the handheld pointer device 10 and/or the type of the software application executed on the display apparatus 20 .
  • when the first displacement vector is determined to be greater than the second predetermined threshold, which indicates that the angle difference is relatively large and requires a larger compensation vector, the processing unit 13 automatically selects the constant compensation method and gradually compensates the pointing coordinates computed after the tilt angle update, to avoid the occurrence of cursor jumping negatively affecting the user operation.
  • the processing unit 13 quickly compensates and corrects the pointing coordinates computed within the preset number of calibrations.
  • the processing unit 13 determines to compute C according to the first displacement vector and N, the processing unit 13 can use Eq. (6) to compute C,
  • the processing unit 13 may set N according to the frame capturing rate or the preset calibration time configured by the user via the input unit 14 . For instance, when the user configures the handheld pointer device 10 to complete the cursor position calibration program within 5 frames based on the frame capturing rate, the processing unit 13 sets N to be 5 and computes C according to N and the first displacement vector . For another instance, when the user configures the preset calibration time to be 5 seconds (i.e., causes the handheld pointer device 10 to complete the cursor position calibration program within 5 seconds) and configures the frame capturing rate to be 5 frames per second, the processing unit 13 operatively sets N to be 25 and computes C according to N and the first displacement vector .
  • the processing unit 13 determines to compute N according to the first displacement vector and C, the processing unit 13 can use Eq. (7) to compute N,
  • C represents the compensation vector and C is a constant value
  • N represents the number of calibrations. According to Eq. (7), the larger the C is, the smaller the N is; the smaller the C is, the larger the N is.
  • the processing unit 13 can configure C according to the resolution of the display apparatus 20 provided by the user via the input unit 14 .
  • the processing unit 13 operatively sets C to be 3 and computes N according to C and the first displacement vector .
  • the user of the handheld pointer device 10 as described may also configure N and C based on the accuracy or precision needed by the software application executed on the display apparatus 20 through the user interface provided by the input unit 14 .
  • Step S605: the processing unit 13 determines whether to update the second tilt angle θ2 to a third tilt angle θ3.
  • the processing unit 13 executes Step S607; otherwise, the processing unit 13 executes Step S611.
  • Step S607: the processing unit 13 computes a second displacement vector generated due to the instant rotation of the handheld pointer device 10 .
  • the processing unit 13 drives the image capturing unit 11 to capture and generates a third frame F3.
  • the processing unit 13 computes a fourth pointing coordinate and a fifth pointing coordinate using the second tilt angle θ2 and the third tilt angle θ3, respectively, in coordination with the image position of the reference point formed in the third frame F3.
  • the processing unit 13 subsequently computes the second displacement vector according to the fourth pointing coordinate and the fifth pointing coordinate .
  • the third frame F3 is captured and generated at a later time than the second frame F2.
  • the processing unit 13 operatively determines whether the handheld pointer device 10 has generated a new rotation angle under the user's operation, and correspondingly compensates the cursor position computed according to the displacement generated after updating the second tilt angle θ2 to the third tilt angle θ3, thereby improving the directivity of the handheld pointer device 10 while at the same time resolving the cursor jumping issue.
  • Step S 609 when the processing unit 13 determines that the second tilt angle ⁇ 2 has updated to the third tilt angle ⁇ 3, the processing unit 13 computes the sum of the calibration coordinate , the second displacement vector , and C to generate a compensated pointing coordinate , e.g., the compensated third pointing coordinate .
  • the compensated pointing coordinate is computed using Eq. (8),
  • p′3(θ2) = p3(θ2) + [(p2(θ2) − p1(θ1)) + (p5(θ3) − p4(θ2))]/N, (8) wherein p′3(θ2) represents the compensated pointing coordinate; p3(θ2) represents the calibration coordinate (i.e., the third pointing coordinate); the bracketed sum divided by N corresponds to the compensation applied per calibration, with C denoting the compensation vector; p1(θ1) represents the first pointing coordinate; p2(θ2) represents the second pointing coordinate; p4(θ2) represents the fourth pointing coordinate; p5(θ3) represents the fifth pointing coordinate; and N represents the number of calibrations.
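A minimal sketch of this compensation, assuming Eq. (8) offsets the calibration coordinate by 1/N of the combined first and second displacement vectors; coordinates are plain (x, y) tuples and the function name is hypothetical:

```python
def compensate(p3, p1, p2, p4, p5, n):
    """Eq. (8) sketch: offset the calibration coordinate p3 by 1/N of
    the first displacement (p2 - p1) plus the second displacement
    (p5 - p4) generated by the tilt angle updates."""
    dx = ((p2[0] - p1[0]) + (p5[0] - p4[0])) / n
    dy = ((p2[1] - p1[1]) + (p5[1] - p4[1])) / n
    return (p3[0] + dx, p3[1] + dy)
```

For example, a 10-pixel first displacement plus a 2-pixel second displacement split over N = 4 calibrations moves the calibration coordinate 3 pixels per step.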
  • In Step S611, when the processing unit 13 determines that the second tilt angle θ2 has not been updated, i.e., no tilt angle update operation has been executed, the processing unit 13 computes the sum of the calibration coordinate and C to generate the compensated pointing coordinate, e.g., the compensated third pointing coordinate.
  • the compensated pointing coordinate is computed using Eq. (9), i.e., p′3(θ2) = p3(θ2) + C, (9) wherein p′3(θ2) represents the compensated pointing coordinate, p3(θ2) represents the calibration coordinate, and C represents the compensation vector.
  • In Step S613, the processing unit 13 generates and outputs the cursor parameter for controlling the display position of the cursor on the display apparatus 20 according to the compensated pointing coordinate computed.
  • the processing unit 13 drives the communication unit 16 to output the cursor parameter to the display apparatus 20 and causes the cursor to smoothly translate a distance d1 to the target position (i.e., the display position of the cursor 33 b shown in FIG. 7 ) from the current cursor position (i.e., the display position of the cursor 33 a shown in FIG. 7 ).
  • the display position of the cursor 33 a corresponds to the pointing coordinate computed by the handheld pointer device 10 using the first tilt angle θ1.
  • the display position of the cursor 35 a corresponds to the pointing coordinate computed by the handheld pointer device 10 using the second tilt angle θ2.
  • In Step S615, the processing unit 13 sets the calibration coordinate to be the newly computed pointing coordinate, e.g., a sixth pointing coordinate.
  • the sixth pointing coordinate is computed according to the image position of the reference point formed in a fourth frame F4 using the second tilt angle θ2 or the third tilt angle θ3 (e.g., when the handheld pointer device 10 has updated the second tilt angle θ2 to the third tilt angle θ3).
  • In Step S617, the processing unit 13 executes N−1 (i.e., decrements the number of calibrations by one).
  • the processing unit 13 stores the calibration coordinate and the decremented number of calibrations in the memory unit 15.
  • In Step S619, the processing unit 13 determines whether N is equal to zero, i.e., whether the cursor position calibration program has been completed.
  • When the processing unit 13 determines that N is equal to zero, i.e., the cursor position calibration program has been completed, the processing unit 13 executes Step S621. Conversely, when the processing unit 13 determines that N is not equal to zero, i.e., the cursor position calibration program has not been completed, the processing unit 13 returns to Step S605.
  • the processing unit 13 drives the image capturing unit 11 to capture a fifth frame F5 and performs the steps of computing a seventh pointing coordinate of the handheld pointer device 10 relative to the reference point according to the image position of the reference point formed in the fifth frame F5 and the second tilt angle θ2 or the third tilt angle θ3, setting the seventh pointing coordinate as the calibration coordinate, and computing the compensated pointing coordinate based on the calibration coordinate and C. As a result, the cursor displayed on the display apparatus 20 translates a distance d2 from the display position of the cursor 33 b to the display position of the cursor 33 c as illustrated in FIG. 7.
  • Steps S605~S619 are repeated to sequentially capture N−2 frames (not shown), continuing to compensate the computed pointing coordinates and compute the cursor position accordingly, until N is equal to zero.
  • the cursor displayed on the display apparatus 20 thereby smoothly moves N times from the display position that corresponds to the first pointing coordinate (e.g., the display position of the cursor 33 a) to the position currently pointed by the handheld pointer device 10. More specifically, the cursor displayed on the display apparatus 20 is translated smoothly from the display position (i.e., the display position of the cursor 33 a) that corresponds to the first pointing coordinate to the display position (i.e., the display position of the cursor 33 N) that corresponds to the position currently pointed by the handheld pointer device 10 relative to the display apparatus 20 in accordance with the distances d1, d2, d3, . . . , dN computed after the Nth frame.
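The N-step smooth translation above can be pictured with a short sketch. Linear interpolation is assumed here purely for illustration; the disclosure only requires that the cursor move in N compensated steps (distances d1..dN) rather than jump.

```python
def smooth_path(start, target, n):
    """Return the N intermediate display positions the cursor visits
    while translating from `start` to `target` in N equal steps."""
    return [
        (start[0] + (target[0] - start[0]) * k / n,
         start[1] + (target[1] - start[1]) * k / n)
        for k in range(1, n + 1)
    ]
```

The final element of the returned path is the target position, so after the Nth calibration the cursor rests exactly where the handheld pointer device points.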
  • In Step S621, the processing unit 13 computes the cursor position in the subsequent movement of the handheld pointer device 10 according to the image position of the reference point formed in one of the frames captured, along with the tilt angle presently used in cursor position computation, so as to improve the accuracy of the cursor control operation.
  • the processing unit 13 accumulates the displacement generated and correspondingly adjusts the amount of compensation applied, i.e., adjusts C, to maintain the directivity of the handheld pointer device 10.
  • the processing unit 13 further operatively determines whether or not to incorporate the displacement generated after the tilt angle update into the pointing coordinate compensation computation according to the angle difference before and after the tilt angle update and/or the magnitude of the displacement vector generated after the tilt angle update.
  • the processing unit 13 may also constantly communicate with the display apparatus 20 via the communication unit 16 during the operation of the handheld pointer device 10 , so as to obtain information associated with the software application executed on the display apparatus 20 including but not limited to the type and the execution progress of the software application, the frame refreshing rate, and the resolution required by the display apparatus 20 in the execution of the software application.
  • the processing unit 13 can operatively determine whether or not to execute the cursor position calibration program, as well as configure the calibration parameters for the cursor position calibration program, according to the information obtained from the display apparatus 20.
  • the calibration parameters for the cursor position calibration program in the instant embodiment include but are not limited to the predetermined threshold (such as the first and the second predetermined thresholds), the preset angle, the number of calibrations, the calibration time, and the amount of compensation in each calibration.
  • the pointer positioning method of FIG. 3 and the method of calibrating the cursor position after the tilt angle update can be implemented by writing the corresponding program codes into the processing unit 13 (such as a microcontroller or an embedded controller) via firmware design, with the program codes executed by the processing unit 13 during the operation of the handheld pointer device 10; however, the present disclosure is not limited thereto.
  • FIG. 3 is merely used for illustrating a pointer positioning method for the handheld pointer device 10 , and the present disclosure is not limited thereto.
  • FIG. 6 is merely used for illustrating an implementation method of the cursor position calibration algorithm and shall not be used to limit the present disclosure.
  • FIG. 4A~FIG. 4D are merely used to illustrate the computation of pointing coordinates and the relationship between the operating area (i.e., the display area) of the display apparatus 20 and the center of the image sensing array of the image capturing unit 11 (i.e., the center pointer “+”) and should not be used to limit the present disclosure.
  • FIG. 5 and FIG. 7 are merely used to illustrate the operation of the handheld pointer device 10 and the pointer positioning method in coordination with FIG. 3 and FIG. 6, respectively, and the present disclosure is not limited thereto.
  • FIG. 8 shows a flowchart diagram illustrating a pointer positioning method provided in accordance to another exemplary embodiment of the present disclosure.
  • the pointer positioning method of FIG. 8 can be implemented by programming the processing unit 13 via firmware design and executed by the processing unit 13 during the operation of the handheld pointer device 10 .
  • In Step S801, the processing unit 13 of the handheld pointer device 10 determines whether to update a first tilt angle presently used in the cursor position computation to a second tilt angle.
  • If so, the processing unit 13 executes Step S803; otherwise, the processing unit 13 returns to Step S801.
  • the processing unit 13 operatively determines whether the reference point 21 has substantially moved according to a plurality of frames generated by the image capturing unit 11, wherein the image capturing unit 11 captures images corresponding to the position of the reference point 21 and sequentially generates the plurality of frames.
  • the processing unit 13 of the handheld pointer device 10 determines to update the first tilt angle to the second tilt angle upon determining that the reference point 21 has not substantially moved, i.e., the handheld pointer device 10 is at rest.
  • the handheld pointer device 10 may also determine whether to update the first tilt angle to the second tilt angle by determining whether the pointing coordinate computed based on the image position of the reference point 21 in the frames captured has substantially moved. For instance, when the processing unit 13 of the handheld pointer device 10 determines that the pointing coordinate computed based on the position of the reference point 21 has not substantially moved, the processing unit 13 updates the first tilt angle presently used in the cursor position computation to the second tilt angle.
  • In Step S803, the processing unit 13 operatively drives the image capturing unit 11 to capture and generate a first frame containing the reference point 21 after the processing unit 13 has updated the first tilt angle presently used in the cursor position computation to the second tilt angle.
  • In Step S805, the processing unit 13 computes an angle difference between the first tilt angle and the second tilt angle.
  • In Step S807, the processing unit 13 determines whether the angle difference between the first tilt angle and the second tilt angle is smaller than a preset angle, e.g., 20 degrees.
  • In Step S809, the processing unit 13 drives the image capturing unit 11 to capture and generate a second frame containing the reference point 21.
  • the second frame is captured and generated at a later time than the first frame.
  • the processing unit 13 directly computes the cursor position of the cursor 23 based on the image position of the reference point formed in the second frame and the second tilt angle. That is to say, when the angle difference between the first tilt angle and the second tilt angle is computed to be smaller than the preset angle, e.g., 20 degrees, the processing unit 13 determines that the cursor jumping phenomenon is not noticeable to the human eye and computes the cursor position of the cursor 23 directly based on the image position of the reference point formed in the second frame without applying any compensation.
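The gating decision of Steps S807/S809 reduces to a one-line check, sketched below with the 20-degree preset angle from the example above (the function name is hypothetical):

```python
def jump_is_noticeable(first_tilt, second_tilt, preset_angle=20.0):
    """True when the tilt-angle update is large enough that the cursor
    jump would be visible, so compensation is needed; otherwise the
    cursor position is computed directly without compensation."""
    return abs(second_tilt - first_tilt) >= preset_angle
```

A 10-degree update is computed directly, while a 35-degree update triggers the compensation path.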
  • In Step S811, the processing unit 13 computes a first pointing coordinate according to the image position of the reference point 21 formed in the first frame and the first tilt angle.
  • In Step S813, the processing unit 13 computes a second pointing coordinate according to the image position of the reference point 21 formed in the first frame and the second tilt angle.
  • the processing unit 13 stores the first and the second pointing coordinates in the memory unit 15.
  • the algorithm used for computing the first and the second pointing coordinates is essentially the same as described in the aforementioned embodiment, and further descriptions are hereby omitted.
  • In Step S815, the processing unit 13 computes the cursor position of the cursor 23 in the subsequent movement of the handheld pointer device 10 on the basis of the offset between the first pointing coordinate and the second pointing coordinate generated after the first tilt angle is updated to the second tilt angle, along with the movement of the handheld pointer device 10.
  • In Step S817, the processing unit 13 generates a cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20 according to the computational result from either Step S809 or Step S815.
  • the processing unit 13 drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 for correspondingly controlling the display position of the cursor 23 on the display apparatus 20.
  • FIG. 8 is merely used for illustrating another pointer positioning method for the handheld pointer device and the present disclosure is not limited thereto.
  • Those skilled in the art should be able to select the method for determining whether the handheld pointer device 10 is at rest according to the practical operation requirements of the handheld pointer device 10, such as by analyzing the displacement, the velocity, or the acceleration associated with the image position of the reference point 21 formed in a set of consecutive frames captured, by analyzing the displacement information of at least two pointing coordinates computed based on the image position of the reference point in a set of consecutive frames captured, or by analyzing the magnitude of an acceleration vector generated based on multiple accelerations of the handheld pointer device 10 detected over multiple axes, so as to determine whether to cause the handheld pointer device 10 to update the first tilt angle to the second tilt angle.
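One of the listed rest tests, checking that the pointing coordinate has not substantially moved over consecutive frames, might look like the following. The pixel tolerance is an assumed value, not one given by the disclosure:

```python
def is_at_rest(recent_coords, tolerance=1.0):
    """Deem the device at rest when the pointing coordinates computed
    over a set of consecutive frames all stay within `tolerance` pixels
    of one another on both axes."""
    xs = [c[0] for c in recent_coords]
    ys = [c[1] for c in recent_coords]
    return (max(xs) - min(xs)) <= tolerance and (max(ys) - min(ys)) <= tolerance
```

A device held steady (sub-pixel jitter) passes the test and permits the tilt angle update; a sweeping motion fails it.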
  • the method for calibrating the cursor position after the tilt angle update described in the aforementioned embodiment can be executed during the execution of Step S 815 .
  • FIG. 9 shows a flowchart diagram illustrating a pointer positioning method provided in accordance to another exemplary embodiment of the present disclosure.
  • the pointer positioning method of FIG. 9 can be implemented by programming the processing unit 13 via firmware design and executed by the processing unit 13 during the operation of the handheld pointer device 10 .
  • In Step S901, the processing unit 13 of the handheld pointer device 10 updates a first tilt angle presently used in the cursor position computation to a second tilt angle at a first time interval.
  • the processing unit 13 operatively reads accelerations of the handheld pointer device 10 over multiple axes (e.g., X-axis, Y-axis, and Z-axis) detected by the accelerometer unit 12.
  • the accelerometer unit 12 operatively generates an acceleration vector according to the accelerations of the handheld pointer device 10 detected and outputs the acceleration vector to the processing unit 13 in signal form (i.e., the acceleration sensing signal).
  • the processing unit 13 then computes the instant tilt angle of the handheld pointer device 10 using Eqs. (1)~(3) with the acceleration vector of the handheld pointer device 10 and the included angles computed between any two axes, and correspondingly updates the first tilt angle presently used to the second tilt angle.
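Eqs. (1)~(3) are not reproduced in this excerpt; a common way to recover a tilt (roll) angle from the gravity components sensed on two axes while the device is at rest is an arctangent of their ratio, sketched below purely as an assumption in place of the disclosure's formulas:

```python
import math

def tilt_angle_degrees(accel_x, accel_y):
    """Estimate the device roll about its pointing axis from the X and
    Y components of the sensed acceleration (gravity) vector.  A common
    formulation, assumed here; not the disclosure's Eqs. (1)-(3)."""
    return math.degrees(math.atan2(accel_x, accel_y))
```

When gravity falls entirely on the Y-axis the tilt is 0 degrees; entirely on the X-axis, 90 degrees.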
  • the processing unit 13 drives the image capturing unit 11 to capture and generate a first frame containing the reference point 21 .
  • In Step S903, the processing unit 13 computes a first pointing coordinate and a second pointing coordinate at the first time interval using the first tilt angle and the second tilt angle, respectively, in coordination with the image position of the reference point 21 formed in the first frame.
  • the processing unit 13 computes the cursor position according to the first pointing coordinate and generates the cursor parameter accordingly for controlling the display position of the cursor 23 on the display apparatus 20 .
  • the processing unit 13 drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 at the first time interval and causes the cursor 23 to be fixed at the first pointing coordinate.
  • Cursor position computation and positioning methods are essentially the same as described in the aforementioned embodiment, and further details are hereby omitted.
  • In Step S905, the processing unit 13 computes a first displacement vector between the first pointing coordinate and the second pointing coordinate.
  • In Step S907, the processing unit 13 generates a compensating vector per unit displacement according to the first displacement vector.
  • the processing unit 13 operatively determines whether to compute the compensating vector per unit displacement based on a predetermined number of calibrations or a constant amount of compensation per calibration according to an angle difference between the first and the second tilt angles and/or the first displacement vector.
  • When the processing unit 13 determines to complete the pointing coordinate calibration within the predetermined number of calibrations, the processing unit 13 computes the compensating vector per unit displacement by dividing the first displacement vector computed by the predetermined number of calibrations or by the calibration time.
  • Alternatively, the processing unit 13 may set the compensating vector per unit displacement based on the amount of compensation per calibration and compute the number of calibrations by dividing the first displacement vector by the compensating vector per unit displacement.
  • the processing unit 13 may set the number of calibrations or the calibration time according to a frame capturing rate or a predetermined time. In another embodiment, the processing unit 13 can also set the number of calibrations, the calibration time, and the amount of compensation per calibration based on the type of software application, e.g., the type of game software, executed by the display apparatus 20. The calibration parameter configuration method has been explained in detail in the above-described embodiments, and further descriptions are hereby omitted.
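The two configuration modes of Steps S905-S907 — a fixed number of calibrations versus a fixed amount of compensation per calibration — can be sketched as follows; the function and parameter names are illustrative, not taken from the disclosure:

```python
import math

def per_calibration_vector(displacement, n_calibrations=None, step_pixels=None):
    """Compensating vector applied per calibration: either the first
    displacement vector divided by a preset number of calibrations, or
    a unit vector along the displacement scaled to a preset step size."""
    dx, dy = displacement
    if n_calibrations:                         # fixed number of calibrations
        return (dx / n_calibrations, dy / n_calibrations)
    magnitude = math.hypot(dx, dy)
    if step_pixels and magnitude > 0:          # fixed compensation per calibration
        return (dx / magnitude * step_pixels, dy / magnitude * step_pixels)
    return (0.0, 0.0)
```

In the first mode the step size falls out of N; in the second mode N falls out of the step size, mirroring the two alternatives described above.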
  • In Step S909, the processing unit 13 drives the image capturing unit 11 to capture and generate a second frame containing the reference point 21 in a second time interval.
  • the processing unit 13 further computes a third pointing coordinate according to the image position of the reference point formed in the second frame and the second tilt angle.
  • the second time interval occurs after the first time interval. That is, the second frame is captured at a later time than the first frame.
  • In Step S911, the processing unit 13 initiates a cursor position calibration program in the second time interval and computes the cursor position according to the third pointing coordinate and the compensating vector per unit displacement. Particularly, the processing unit 13 implements the cursor position calibration method depicted in FIG. 6 and calibrates the third pointing coordinate.
  • In Step S913, the processing unit 13 computes the display position of the cursor 23 on the display apparatus 20 at the second time interval. More specifically, the processing unit 13 computes and generates the cursor parameter according to the third pointing coordinate for controlling the display position of the cursor 23 on the display apparatus 20. The processing unit 13 further drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 to control the display position of the cursor 23 on the display apparatus 20 at the second time interval.
  • In Step S915, the processing unit 13 drives the image capturing unit 11 to capture and generate a third frame containing the reference point at a third time interval.
  • the processing unit 13 then computes a fourth pointing coordinate according to the image position of the reference point 21 formed in the third frame and the second tilt angle.
  • the third time interval occurs after the second time interval. That is, the third frame is captured at a later time than the second frame.
  • the length of time between the second and the third time intervals can be designed based on the preset number of calibrations or the preset calibration time configured.
  • In Step S917, the processing unit 13 computes the display position of the cursor 23 on the display apparatus 20 at the third time interval according to the fourth pointing coordinate.
  • In Step S919, the processing unit 13 generates the cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20.
  • the processing unit 13 further drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 to control the display position of the cursor 23 on the display apparatus 20 at the third time interval.
  • the processing unit 13 may determine whether to calibrate and compensate pointing coordinates computed using the second tilt angle according to the first displacement vector and/or the angle difference between the first and the second tilt angles.
  • For instance, when the first displacement vector is smaller than a first predetermined threshold, e.g., 5 pixels, or the angle difference is smaller than a preset angle, e.g., 20 degrees, the processing unit 13 does not initiate the cursor position calibration program and computes the cursor position directly according to the third pointing coordinate. Thereafter, the processing unit 13 generates the cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20, accordingly.
  • the processing unit 13 in the instant embodiment can further store the first and the second tilt angles, the first pointing coordinate, the second pointing coordinate, the third pointing coordinate, the first displacement vector, and the compensating vector per unit displacement in the memory unit 15.
  • Those skilled in the art should be able to program the processing unit 13 via firmware design to utilize an algorithm for determining whether to update the first tilt angle to the second tilt angle in the first time interval. That is, the processing unit 13 can be programmed with the necessary program codes to determine whether the handheld pointer device 10 is in motion or at rest, e.g., whether the reference point 21 or the pointing coordinate associated with the position of the reference point 21 has substantially moved, so as to determine whether to update the tilt angle presently used by the handheld pointer device 10 in the cursor position computation.
  • FIG. 9 is merely used to describe a pointer positioning method for the handheld pointer device 10 and the present disclosure is not limited thereto.
  • the present disclosure also discloses a non-transitory computer-readable media for storing the computer executable program codes of the pointer positioning methods depicted in FIG. 3 , FIG. 8 , and FIG. 9 as well as the cursor position calibration method depicted in FIG. 6 .
  • the non-transitory computer-readable media may be a floppy disk, a hard disk, a compact disk (CD), a flash drive, a magnetic tape, accessible online storage database or any type of storage media having similar functionality known to those skilled in the art.
  • exemplary embodiments of the present disclosure provide a handheld pointer device and a pointer positioning method thereof.
  • the handheld pointer device and the pointer positioning method thereof can be adapted for controlling the operation of a cursor displayed on a display apparatus.
  • the pointer positioning method disclosed operatively calibrates and corrects pointing coordinates in the computation of the cursor position after the handheld pointer device has updated the tilt angle thereof, so that the display position of the cursor can be adjusted to move gradually to the correct position toward which the handheld pointer device actually points within a preset calibration time or a preset number of calibrations. Accordingly, the issue of the cursor suddenly jumping from one place to another after the tilt angle has been updated can be effectively avoided, thereby enhancing the stability of the handheld pointer device and, at the same time, the operation convenience for the user.
  • the pointer positioning method enables the handheld pointer device to actively determine whether to calibrate the pointing coordinate computed using the updated tilt angle, as well as the associated calibration and compensation method, based on the degree of precision required by the type of software application executed on the display apparatus and the resolution of the display apparatus, thereby enhancing the practicality and applicability of the handheld pointer device.

Abstract

A pointer positioning method for a handheld pointer device includes: capturing a first frame containing a reference point when the handheld pointer device updates a first tilt angle presently used to a second tilt angle; computing a first pointing coordinate according to the image position of the reference point in the first frame and the first tilt angle; computing a second pointing coordinate according to the image position of the reference point in the first frame and the second tilt angle; capturing a second frame containing the reference point to compute a third pointing coordinate according to the image position of the reference point in the second frame and the second tilt angle; and generating a cursor parameter for controlling a display position of a cursor on a display apparatus according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate.

Description

This U.S. Non-provisional application is a continuation-in-part of application Ser. No. 13/771,072, filed Feb. 19, 2013, now status pending, and entitled “Hand-Held Pointing Device”, and a continuation-in-part of application Ser. No. 14/273,523, filed May 8, 2014, now status pending, and entitled “Handheld Pointer Device And Tilt Angle Adjustment Method Thereof”. The disclosures of all of the foregoing applications are incorporated by reference herein in their entirety.
This U.S. Non-provisional application further claims the priority to Taiwan patent application No. 102144801, filed Dec. 6, 2013, entitled “Pointer Device And Pointer Positioning Method Thereof”. The entire specification of which is hereby incorporated by reference in its entirety for all purposes.
BACKGROUND
1. Technical Field
The present disclosure relates to a positioning method of a pointer device and, in particular, to a pointer positioning method of a handheld pointer device.
2. Description of Related Art
A handheld pointer device is operable to compute pointing coordinates thereof by analyzing image positions of at least one reference light source formed in captured images and to transmit the computed pointing coordinates to a video game console for assisting a gaming process executed on the video game console. Currently, handheld pointer devices have been widely used in many types of interactive gaming systems such as light gun games, baseball games, tennis games, and the like.
It is well known in the art that the distance between an image sensor installed on a handheld pointer device and a display apparatus, and the rotation angle of the image sensor while capturing the images, affect the computation of pointing coordinates thereafter. Hence, to enhance the user's operability, a handheld pointer device is typically equipped with at least one tilt sensing device for constantly detecting the instant rotation angle of the handheld pointer device and correspondingly updating the tilt angle used in the calculation of pointing coordinates. The relative movement of the handheld pointer device with respect to the position of the reference light source thus can be accurately computed and determined, thereby avoiding erroneous position determination of the reference point.
However, whenever the handheld pointer device updates the tilt angle presently used in the computation of the pointing coordinate, the handheld pointer device will instantly compute the pointing coordinate using the newly updated tilt angle and the computed image position of the reference light source in the sensing area of the image sensor, and control the movement of the cursor accordingly. As a result, the cursor displayed on the display apparatus may suddenly jump from one place to another, which reduces the user's operability and at the same time increases operation inconvenience.
SUMMARY
Accordingly, an exemplary embodiment of the present disclosure provides a pointer positioning method for a handheld pointer device, and the pointer positioning method can cause the handheld pointer device to automatically compensate and correct the computed pointing coordinates based on the displacement generated as the handheld pointer device updates the tilt angle thereof, thereby effectively avoiding the occurrence of the cursor suddenly jumping from one place to another and improving the user's operation of the handheld pointer device.
An exemplary embodiment of the present disclosure provides a pointer positioning method of a handheld pointer device, and the pointer positioning method includes the following steps. When the handheld pointer device updates a first tilt angle presently used to a second tilt angle, a first frame containing a reference point is captured. Subsequently, a first pointing coordinate is computed according to the image position of the reference point formed in the first frame and the first tilt angle. A second pointing coordinate is computed according to the image position of the reference point formed in the first frame and the second tilt angle. Afterward, a second frame containing the reference point is captured. A third pointing coordinate is then computed according to the image position of the reference point formed in the second frame and the second tilt angle. Thereafter, a cursor position is computed according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate, so as to correspondingly generate a cursor parameter for controlling a display position of a cursor on a display apparatus.
An exemplary embodiment of the present disclosure provides a pointer positioning method of a handheld pointer device, and the pointer positioning method includes the following steps. When the handheld pointer device updates a first tilt angle presently used to a second tilt angle, a first frame containing a reference point is captured. An angle difference between the first tilt angle and the second tilt angle is subsequently computed. When the angle difference is computed to be larger than a preset angle, a first pointing coordinate is computed according to the image position of the reference point formed in the first frame and the first tilt angle. At the same time, a second pointing coordinate is computed according to the image position of the reference point formed in the first frame and the second tilt angle. Then, the handheld pointer device is driven to compute a cursor position of a cursor in the subsequent movement of the handheld pointer device on the basis of a first displacement vector computed between the first and the second pointing coordinates, along with the pointing coordinate generated in accordance with the movement of the handheld pointer device. The handheld pointer device further generates a cursor parameter and correspondingly controls a display position of the cursor on a display apparatus.
An exemplary embodiment of the present disclosure provides a pointer positioning method of a handheld pointer device, and the pointer positioning method includes the following steps. At a first time interval, the handheld pointer device is driven to update a first tilt angle presently used to a second tilt angle. The handheld pointer device computes a first pointing coordinate and a second pointing coordinate according to the image position of the reference point in a first frame with the first tilt angle and the second tilt angle, respectively. At a second time interval, the handheld pointer device is driven to compute a third pointing coordinate according to the image position of the reference point formed in a second frame and the second tilt angle. The second time interval occurs after the first time interval. The cursor position of a cursor is then computed according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate to accordingly generate a cursor parameter for controlling a display position of the cursor on a display apparatus.
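The three-coordinate computation summarized above can be sketched in code. The following Python sketch is illustrative only; the function names and the simple subtraction-based compensation are assumptions, not part of the claimed method.

```python
import math

def compute_pointing_coordinate(image_pos, tilt_angle):
    # Rotate the reference-point image position by the tilt angle so the
    # frame's coordinate system matches that of the display apparatus.
    x, y = image_pos
    c, s = math.cos(tilt_angle), math.sin(tilt_angle)
    return (c * x - s * y, s * x + c * y)

def compute_cursor_position(first_frame_pos, second_frame_pos,
                            first_tilt, second_tilt):
    # First pointing coordinate: first frame with the first (old) tilt angle.
    p1 = compute_pointing_coordinate(first_frame_pos, first_tilt)
    # Second pointing coordinate: same frame with the second (updated) angle.
    p2 = compute_pointing_coordinate(first_frame_pos, second_tilt)
    # Third pointing coordinate: second frame with the updated angle.
    p3 = compute_pointing_coordinate(second_frame_pos, second_tilt)
    # The p1 -> p2 offset is the jump introduced by the tilt update;
    # removing it from p3 keeps the cursor continuous with its old path.
    return (p3[0] - (p2[0] - p1[0]), p3[1] - (p2[1] - p1[1]))
```

With an unchanged tilt angle the offset is zero and the cursor simply follows the third pointing coordinate; after a tilt update, the cursor stays continuous with its pre-update position instead of jumping.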
An exemplary embodiment of the present disclosure provides a handheld pointer device. The handheld pointer device includes an image capturing unit, an accelerometer unit, and a processing unit. The image capturing unit is configured to operatively capture a plurality of images corresponding to the position of a reference point and sequentially generate a plurality of frames. The accelerometer unit is configured to operatively detect a plurality of accelerations of the handheld pointer device over multiple axes for generating an acceleration vector. The processing unit is coupled to the image capturing unit and the accelerometer unit. The processing unit is configured to operatively compute a cursor position of a cursor according to the image positions of the reference points in the frames and a first tilt angle.
When the processing unit updates the first tilt angle presently used in the cursor position computation to a second tilt angle according to the plurality of accelerations detected, the processing unit operatively drives the image capturing unit to capture a first frame containing the reference point. The processing unit then respectively computes a first pointing coordinate and a second pointing coordinate using the first and the second tilt angles in coordination with the image position of the reference point formed in the first frame. The processing unit drives the image capturing unit to capture a second frame containing the reference point thereafter. Afterward, the processing unit computes the cursor position according to the image position of the reference point formed in the second frame, the first pointing coordinate, and the second pointing coordinate, to correspondingly generate a cursor parameter for controlling a display position of the cursor on a display apparatus.
An exemplary embodiment of the present disclosure provides a non-transitory computer-readable medium storing a computer executable program for the aforementioned pointer positioning method. When the non-transitory computer-readable medium is read by a processor, the processor executes the aforementioned pointer positioning method.
To sum up, exemplary embodiments provide a handheld pointer device and a pointer positioning method thereof. The handheld pointer device and the pointer positioning method thereof are adapted for controlling the operation of a cursor displayed on a display apparatus. The pointer positioning method operatively calibrates and corrects pointing coordinates during the computation of the cursor position after the handheld pointer device updates the tilt angle thereof, in such a way that the display position of the cursor can be adjusted to gradually move to the position toward which the handheld pointer device actually points within a preset calibration time or a preset number of calibrations. Accordingly, the issue of the cursor suddenly jumping from one place to another after the tilt angle has been updated can be effectively avoided, thereby enhancing the stability of the handheld pointer device while increasing operational convenience for the user.
In order to further understand the techniques, means, and effects of the present disclosure, the following detailed descriptions and appended drawings are hereby referred to, such that the purposes, features, and aspects of the present disclosure can be thoroughly and concretely appreciated; however, the appended drawings are merely provided for reference and illustration, without any intention to limit the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
FIG. 1 is a diagram illustrating an operation of a handheld pointer device in an interactive system provided in accordance to an exemplary embodiment of the present disclosure.
FIG. 2 is a block diagram of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
FIG. 3 is a flowchart diagram illustrating a pointer positioning method of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
FIGS. 4A-4B are diagrams respectively illustrating image positions of the reference point detected as the handheld pointer device moves provided in accordance to an exemplary embodiment of the present disclosure.
FIG. 4C is a diagram illustrating the image positions of the reference point computed using different tilt angles provided in accordance to an exemplary embodiment of the present disclosure.
FIG. 4D is a diagram illustrating the image position of the reference point detected as the handheld pointer device moves and the corresponding movement of the cursor displayed on a display apparatus provided in accordance to an exemplary embodiment of the present disclosure.
FIG. 5 is a diagram illustrating the movement of the cursor displayed on a display apparatus provided in accordance to an exemplary embodiment of the present disclosure.
FIG. 6 is a flowchart diagram illustrating a method for calibrating the cursor position after the update of tilt angle provided in accordance to an exemplary embodiment of the present disclosure.
FIG. 7 is a diagram illustrating the movement of the cursor displayed on a display apparatus along with the movement of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
FIG. 8 is a flowchart diagram illustrating a pointer positioning method of a handheld pointer device provided in accordance to another exemplary embodiment of the present disclosure.
FIG. 9 is a flowchart diagram illustrating a pointer positioning method of a handheld pointer device provided in accordance to another exemplary embodiment of the present disclosure.
DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
Reference will now be made in detail to the exemplary embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
(An Exemplary Embodiment of a Handheld Pointer Device)
A handheld pointer device of the present disclosure can be adapted for positioning a pointer (such as a cursor) on a display apparatus. Please refer to FIG. 1, which shows a diagram illustrating an operation of a handheld pointer device in an interactive system provided in accordance to an exemplary embodiment of the present disclosure. An interactive system of the instant embodiment includes a handheld pointer device 10 and a display apparatus 20. The display apparatus 20 is equipped with at least one reference point 21, which is provided to the handheld pointer device 10 to use as a reference for controlling the movement of a cursor 23 displayed on the display apparatus 20.
In the instant embodiment, the display apparatus 20 is configured to have the necessary software and hardware architectures for executing data and displaying a software application. The display apparatus 20 includes but is not limited to a projection display, a game console display, a television, or a monitor of a computer system. In practice, depending upon the practical operational requirements of the interactive system, the interactive system can further include a host computer (not shown) such as a video game console or a computer. The host computer can be configured to operatively process the program codes associated with a software application (e.g., video games such as light gun games, baseball games, tennis games, and the like) and execute the software application. The host computer can further be configured to display the execution progress of the software application on the display apparatus 20 for the user to view and perform the corresponding control operations.
The reference point 21 is placed near the display apparatus 20 and is provided to the handheld pointer device 10 for determining the pointing position thereof, i.e., determining the moving direction and the displacement of the handheld pointer device 10 relative to the reference point 21.
The reference point 21 can be implemented by a plurality of light emitting diodes with specific wavelength, such as infrared light emitting diodes (IR LED), laser diodes, or ultraviolet light emitting diodes, arranged in a regular or irregular shape. Moreover, the light emitting diodes may be configured to electrically connect to the display apparatus 20 or may be powered by an independent power source for lighting. It shall be noted that the number of the reference point is not limited to one as used in the instant embodiment. Those skilled in the art should be able to configure the exact number of the reference point 21 required to be one, two, or more than two according to the practical design and/or operational requirements. In other words, FIG. 1 is merely used to illustrate an operation of the handheld pointer device 10, and the instant disclosure is not limited thereto.
Briefly, the handheld pointer device 10 operatively drives an image capturing unit 11 installed thereon to capture images of the reference point 21 as the handheld pointer device 10 points toward the position of the reference point 21 and sequentially generates a plurality of frames containing the image of the reference point 21. The handheld pointer device 10 operatively computes a pointing coordinate generated as the handheld pointer device 10 points toward the display apparatus 20 according to an image position of the reference point 21 formed in one of the frames captured and the tilt angle presently used in the pointing coordinate calculation. Next, the handheld pointer device 10 computes the cursor position of the cursor 23 on the display apparatus 20 according to the pointing coordinate computed. The handheld pointer device 10 further wirelessly transmits a cursor parameter generated based on the relative movement of the reference point 21 for controlling the display position of the cursor 23 to the display apparatus 20. The handheld pointer device 10 thus controls the movement of the cursor 23 displayed on the display apparatus 20.
The handheld pointer device 10 may further determine whether to update a first tilt angle (i.e., the current rotation angle of the handheld pointer device 10) being presently used to a second tilt angle based on the movement of image positions of the reference point 21 in frames captured. In one embodiment, the handheld pointer device 10 may first determine whether the handheld pointer device 10 is in motion or at rest by determining whether or not the image position of the reference point 21 formed in consecutive frames has substantially moved. The handheld pointer device 10 subsequently determines whether to update the tilt angle presently used in computing the pointing coordinate according to the determination result. In another embodiment, the handheld pointer device 10 may also determine whether the handheld pointer device 10 is in motion or at rest by determining whether pointing coordinates computed based on the image positions of the reference point 21 formed in frames captured using the first tilt angle has substantially moved. The handheld pointer device 10 determines whether to update the tilt angle presently used in computing pointing coordinate according to the determination result thereafter.
It is worth noting that the phrase "the reference point 21 has substantially moved" herein indicates that the reference point 21 has moved over a short period of time (i.e., a second, a millisecond, two adjacent frames, or multiple consecutive frames). Whether the reference point 21 has substantially moved can be determined by the position displacement, the velocity, or the acceleration of the image position of the reference point 21 formed in consecutive frames captured, or by the displacement, the velocity, or the acceleration of pointing coordinates computed based on consecutive frames captured.
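As one possible reading of the displacement-based test above, the following sketch flags substantial movement from the per-frame displacement of the reference point's image position; the pixel threshold and the function name are assumptions for illustration.

```python
import math

def has_substantially_moved(positions, threshold=2.0):
    # positions: image positions (x, y) of the reference point in
    # consecutive frames. The 2-pixel threshold is an assumed value;
    # velocity- or acceleration-based variants work the same way.
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        if math.hypot(x1 - x0, y1 - y0) > threshold:
            return True
    return False
```

When this returns False, the device can be treated as at rest and the tilt angle update can proceed safely.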
In one embodiment, the handheld pointer device 10 may use an inertial sensor to sense and compute the instant tilt angle of the handheld pointer device 10. However, the force exerted by the user on the handheld pointer device 10 during operation might affect the gravitational direction determined by the inertial sensor. Hence, the impact of the user's operation on the handheld pointer device 10 must be removed or eliminated in order to accurately compute and update the tilt angle of the handheld pointer device 10. In particular, when it is determined that the handheld pointer device 10 being operated by the user has not substantially moved (i.e., the reference point 21 detected has not substantially moved), the handheld pointer device 10 can be regarded as unaffected by the external force exerted thereon. The handheld pointer device 10 thus can accurately sense and compute its instant rotation angle, and update the first tilt angle presently used by the handheld pointer device 10 to the second tilt angle.
When the handheld pointer device 10 determines to update the first tilt angle presently used to the second tilt angle, the handheld pointer device 10 operatively captures a first frame containing the reference point 21. The handheld pointer device 10 computes a first pointing coordinate relative to the display apparatus 20 according to the image position of the reference point 21 formed in the first frame and the first tilt angle. The handheld pointer device 10 further computes the cursor position according to the first pointing coordinate computed to correspondingly generate the cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20.
The handheld pointer device 10 then computes a second pointing coordinate relative to the display apparatus 20 according to the image position of the reference point 21 formed in the first frame and the second tilt angle. Thereafter, during the subsequent computation of a third pointing coordinate according to the image position of the reference point 21 formed in a second frame and the second tilt angle, the handheld pointer device 10 operatively determines whether or not to perform a cursor position calibration process to calibrate and correct the third pointing coordinate computed according to the displacement vector between the first and the second pointing coordinates (i.e., the displacement of the cursor 23).
When the handheld pointer device 10 determines that the displacement vector between the first and the second pointing coordinates is computed to be greater than or equal to a first predetermined threshold, the handheld pointer device 10 operatively corrects the third pointing coordinate for compensating the offset generated as the handheld pointer device 10 updates the tilt angle thereof. More specifically, the handheld pointer device 10 computes a cursor position according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate, so as to correspondingly generate the cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20. On the other hand, when the handheld pointer device 10 determines that the displacement vector between the first and the second pointing coordinates is computed to be less than the first predetermined threshold, the handheld pointer device 10 directly computes the cursor position according to the third pointing coordinate without applying any compensation, and correspondingly generates the cursor parameter to control the display position of the cursor 23 on the display apparatus 20.
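The threshold comparison just described might be sketched as follows; the threshold value and the simple vector compensation are assumptions for illustration, not the claimed implementation.

```python
import math

def cursor_from_pointing_coords(p1, p2, p3, first_threshold=5.0):
    # p1, p2: pointing coordinates from the same frame with the old and
    # updated tilt angles; p3: pointing coordinate from the next frame.
    # The 5-unit threshold is an assumed first predetermined threshold.
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    if math.hypot(dx, dy) >= first_threshold:
        # Large jump: compensate p3 by the tilt-update offset.
        return (p3[0] - dx, p3[1] - dy)
    # Small jump: use p3 directly, no compensation needed.
    return p3
```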
Accordingly, the occurrence of the cursor 23 suddenly jumping from one place to another after the operation of updating the tilt angle, which negatively affects the user's operation of the handheld pointer device 10, can be effectively prevented or eliminated.
In another embodiment, the handheld pointer device 10 can also determine whether to calibrate and correct the pointing coordinate computed using the updated tilt angle (i.e. the second tilt angle) according to the angle difference between the first tilt angle and the second tilt angle.
For instance, the handheld pointer device 10 calibrates and corrects the pointing coordinate (i.e., the third pointing coordinate) computed in the subsequent cursor position computation using the updated tilt angle when the angle difference between the first tilt angle presently used and the second tilt angle is computed to be larger than a preset angle.
When the handheld pointer device 10 has determined to compensate pointing coordinates computed after the operation of updating the first tilt angle to the second tilt angle, the handheld pointer device 10 completes a cursor position calibration within a preset calibration time or a preset number of calibrations and causes the cursor 23 to smoothly move from the movement path that corresponds to the first tilt angle to the movement path that corresponds to the second tilt angle. The handheld pointer device 10 thus can accurately compute the relative moving information of the handheld pointer device 10 with respect to the display apparatus 20 and precisely control the movement of the cursor 23 on the display apparatus 20, while at the same time preventing the user's operation of the handheld pointer device 10 from being affected by the tilt angle update.
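One way to realize this gradual movement is to decay the tilt-update offset linearly over the preset number of calibrations. The linear schedule below is an assumption, since the disclosure does not fix a particular interpolation scheme.

```python
def calibration_offsets(p1, p2, num_calibrations):
    # p1, p2: pointing coordinates computed with the old and updated tilt
    # angles. Returns the residual offset to apply at each calibration step,
    # decaying linearly so the cursor glides onto the new movement path.
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    offsets = []
    for k in range(num_calibrations):
        # Fraction of the offset still applied at step k; reaches 0 at the
        # final step, at which point the cursor follows the new path exactly.
        frac = 1.0 - (k + 1) / num_calibrations
        offsets.append((dx * frac, dy * frac))
    return offsets
```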
It is worth noting that the handheld pointer device 10 is operable to determine whether to calibrate and correct the pointing coordinate computed using the updated tilt angle after each tilt angle update (i.e., to configure the first predetermined threshold and the preset angle) as well as the associated calibration and compensation method (i.e., the amount of compensation in each calibration and the preset calibration time) according to the type of the software program executed by the display apparatus 20 as well as the resolution of the display apparatus 20.
To put it concretely, the handheld pointer device 10 can pre-store multiple sets of calibration parameters associated with different resolutions of the display apparatus 20 and different types of software applications.
For instance, when the type of the software application currently executed by the display apparatus 20 requires high precision (such as displaying rapid motion images), then the value of the first predetermined threshold or the value of the preset angle should be configured to be relatively small, such that the handheld pointer device 10 executes a calibration program and calibrates the pointing coordinate after each tilt angle update, so as to increase the directivity of the handheld pointer device 10.
For another instance, when the type of the software application currently executed by the display apparatus 20 does not require high precision (such as displaying still images), then the value of the first predetermined threshold or the value of the preset angle should be configured to be relatively large, such that the handheld pointer device 10 does not have to calibrate the pointing coordinate after each tilt angle update, or does not have to execute the calibration program of the pointing coordinate after each tilt angle adjustment, and reduces the number of calibrations needed, thereby reducing the computational complexity of the pointing coordinates.
In one implementation, the handheld pointer device 10 can automatically link with the display apparatus 20 at startup and access the type of the software application currently executed on the display apparatus 20. Then, the handheld pointer device 10 operatively determines whether to calibrate and correct pointing coordinates computed by the handheld pointer device 10 after a tilt angle update based on the type of the software program currently executed on the display apparatus 20, and selects the appropriate calibration parameters. Accordingly, the applicability and operational convenience of the handheld pointer device 10 can be enhanced.
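A calibration-parameter lookup keyed by application type might be organized as below. The profile names and every numeric value are hypothetical, chosen only to illustrate the small-threshold/large-threshold trade-off described above.

```python
# Hypothetical pre-stored calibration profiles; values are illustrative only.
CALIBRATION_PROFILES = {
    "high_precision": {"preset_angle_deg": 1.0,   # calibrate on small changes
                       "first_threshold_px": 2.0,
                       "num_calibrations": 10},
    "still_images":   {"preset_angle_deg": 10.0,  # tolerate larger changes
                       "first_threshold_px": 20.0,
                       "num_calibrations": 3},
}

def select_profile(app_type):
    # Pick calibration parameters for the application type reported by the
    # display apparatus, falling back to the high-precision profile.
    return CALIBRATION_PROFILES.get(app_type,
                                    CALIBRATION_PROFILES["high_precision"])
```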
More specifically, please refer to FIG. 2 in conjunction with FIG. 1, wherein FIG. 2 shows a block diagram of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure. The handheld pointer device 10 includes an image capturing unit 11, an accelerometer unit 12, a processing unit 13, an input unit 14, a memory unit 15, and a communication unit 16. The image capturing unit 11, the accelerometer unit 12, the input unit 14, the memory unit 15, and the communication unit 16 are coupled to the processing unit 13, respectively.
It is worth noting that, in another embodiment, the accelerometer unit 12 can be integrated with the image capturing unit 11. In particular, the accelerometer unit 12 is electrically connected to the processing unit 13 through the image capturing unit 11. Alternatively, in other embodiments, at least one of the image capturing unit 11, the accelerometer unit 12, the input unit 14, the memory unit 15, and the communication unit 16 and another component thereof may be configured to electrically connect to the processing unit 13 in series.
The image capturing unit 11 is configured to operatively capture images containing the reference point 21 as the handheld pointer device 10 points toward the reference point 21 and sequentially generate a plurality of frames. Specifically, the image capturing unit 11 can be configured to operatively detect the light emitted from the reference point 21 according to a frame capturing rate (for example, 200 frames per second), and sequentially generate a plurality of frames containing the image of the reference point 21.
An optical filter (not shown) can be used for filtering out light spectrum outside the specific light spectrum generated by the reference point 21 such that the image capturing unit 11 only detects the light having wavelength within the specific light spectrum generated by the reference point 21.
In the instant embodiment, the image capturing unit 11 can be implemented by a charge-coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor. Those skilled in the art should be able to design and implement the image capturing unit 11 according to practical operation requirements, and the instant embodiment is not limited to the example provided herein.
The accelerometer unit 12 is configured to detect a plurality of accelerations of the handheld pointer device 10 over multiple axes (e.g., X-axis, Y-axis, and Z-axis) of a space, and generate an acceleration vector, accordingly. The accelerometer unit 12 in the instant embodiment includes but is not limited to a G-sensor or an accelerometer, and the accelerometer unit 12 can be built into the handheld pointer device 10. Certainly, in other embodiments, the accelerometer unit 12 may be implemented by an external device connected to the handheld pointer device 10. Those skilled in the art should be able to implement the accelerometer unit 12 according to the practical operation and/or design requirements, and the present disclosure is not limited to the example provided herein.
The processing unit 13 is configured to receive frames outputted by the image capturing unit 11 and compute an image position of the reference point 21 formed in one of the frames according to the respective frame among the frames captured. The processing unit 13 operatively computes the pointing coordinate of the handheld pointer device 10 with respect to the position of the reference point 21 using the first tilt angle. The processing unit 13 further computes the cursor position based on the pointing coordinate computed, so as to correspondingly generate the cursor parameter controlling the movement (i.e., the display position) of the cursor. Thereafter, the processing unit 13 drives the communication unit 16 and wirelessly transmits the cursor parameter to the display apparatus 20 to correspondingly control the movement of the cursor 23 displayed on the display apparatus 20 in coordination with the execution of the software program on the display apparatus 20.
More specifically, the processing unit 13 can operatively determine whether the reference point 21 has moved based on frames captured, i.e., whether the image position of the reference point 21 has substantially moved. When the processing unit 13 determines that the reference point 21 has not substantially moved, the processing unit 13 instantly reads the accelerations of the handheld pointer device 10 over multiple axes detected by the accelerometer unit 12. The processing unit 13 computes and updates the first tilt angle presently used to the second tilt angle according to the accelerations of the handheld pointer device 10 detected. The processing unit 13 then uses the second tilt angle updated and the image position of the reference point 21 formed in one of the frames to compute the pointing coordinate of the handheld pointer device 10 relative to the display apparatus 20.
In one embodiment, the processing unit 13 can compute the instant tilt angle of the handheld pointer device 10 using the accelerations of the handheld pointer device 10 over X-axis, Y-axis, and Z-axis detected by the accelerometer unit 12 and the included angles computed between any two axes, and update the first tilt angle to the second tilt angle, accordingly.
On the contrary, when the processing unit 13 determines that the position of the reference point 21 has substantially moved, the processing unit 13 does not update the first tilt angle presently used, as the processing unit 13 operatively determines that the accelerometer unit 12 is currently unable to make accurate acceleration measurements associated with the handheld pointer device 10. The processing unit 13 continues to use the first tilt angle and the image position of the reference point 21 formed in one of the frames to compute the pointing coordinate of the handheld pointer device 10 relative to the display apparatus 20. The processing unit 13 generates the cursor parameter for controlling the display position of the cursor 23 according to the pointing coordinate computed. The processing unit 13 then drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20.
Algorithms used by the processing unit 13 for calculating the tilt angle (i.e., the first tilt angle, and the second tilt angle) of the handheld pointer device 10 will be briefly described in the following paragraphs.
In one embodiment, the plurality of frames generated by the image capturing unit 11 are rectangular in shape. The long side of a frame is configured to be parallel to the X-axis, and the short side of the frame is configured to be parallel to the Y-axis. When the processing unit 13 determines that the reference point 21 has not substantially moved, the processing unit 13 reads the accelerations Vx, Vy, and Vz of the handheld pointer device 10 over the X-axis, Y-axis, and Z-axis of the three-dimensional space depicted in FIG. 1 detected by the accelerometer unit 12. The accelerometer unit 12 operatively generates an acceleration vector V according to the detection result, and generates an acceleration sensing signal, accordingly. The acceleration sensing signal represents the ratio of any two accelerations, such as the ratio of the acceleration Vx to the acceleration Vy. The processing unit 13 computes the instant tilt angle of the handheld pointer device 10 according to the acceleration sensing signal received.
Specifically, the processing unit 13 can compute the acceleration vector V and the included angles between any two of axes by using the following Eqs. (1) to (3) and obtain the instant tilt angle of the handheld pointer device 10,
sin(θ) = Vx/|gxy|  (1)

cos(θ) = Vy/|gxy|  (2)

|gxy| = √(Vx² + Vy²),  (3)
wherein Vx represents the acceleration of the handheld pointer device 10 over the X-axis detected by the accelerometer unit 12; Vy represents the acceleration of the handheld pointer device 10 over the Y-axis detected by the accelerometer unit 12; |gxy| represents the gravitational acceleration computed according to the acceleration Vx and the acceleration Vy.
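Eqs. (1) to (3) can be combined in code with a two-argument arctangent, which recovers the angle whose sine and cosine satisfy Eq. (1) and Eq. (2). This sketch assumes the two accelerations are reported in consistent units; the function name is illustrative.

```python
import math

def instant_tilt_angle(vx, vy):
    # Eq. (3): |gxy| = sqrt(Vx^2 + Vy^2)
    gxy = math.hypot(vx, vy)
    if gxy == 0.0:
        raise ValueError("no gravity component in the X-Y plane")
    # atan2 yields the angle theta with sin(theta) = vx/|gxy| (Eq. 1)
    # and cos(theta) = vy/|gxy| (Eq. 2).
    return math.atan2(vx, vy)
```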
The processing unit 13 subsequently corrects the orientation of the frames based on the computation result of Eq. (1) and Eq. (2) using Eq. (4), so that the coordinate system of the frame corrected is the same as the coordinate system of the display apparatus 20,
x′ = x·cos(θ) − y·sin(θ), y′ = x·sin(θ) + y·cos(θ),  (4)
wherein x represents the X-axis coordinate of the image position of the reference point 21 formed in one of the frames; y represents the Y-axis coordinate of the image position of the reference point 21 formed in one of the frames; x′ represents the X-axis coordinate of the image position of the reference point 21 formed in one of the frames after orientation correction; y′ represents the Y-axis coordinate of the image position of the reference point 21 formed in one of the frames after orientation correction. The processing unit 13 further computes the pointing coordinate of the handheld pointer device 10 relative to the reference point 21 or the display apparatus 20 according to the X-axis coordinate x′ and the Y-axis coordinate y′ obtained after orientation correction.
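The orientation correction of Eq. (4) is a standard 2-D rotation of the image position; a minimal sketch (function name assumed):

```python
import math

def correct_orientation(x, y, theta):
    # Eq. (4): rotate the image position by theta so the frame's coordinate
    # system matches that of the display apparatus.
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y)
```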
Next, the processing unit 13 computes the cursor position according to the pointing coordinate computed, so as to generate the cursor parameter to correspondingly control the movement of the cursor 23 on the display apparatus 20. The processing unit 13 wirelessly transmits the cursor parameter or the relative movement information of the handheld pointer device 10 to the display apparatus 20 via the communication unit 16 to correspondingly control the display position of the cursor 23 on the display apparatus 20.
It is worth noting that those skilled in the art should understand that the accelerometer unit 12 of the handheld pointer device 10 in the present disclosure can also be configured to only detect accelerations over two dimensions, such as the acceleration Vx and the acceleration Vy. The above-described acceleration determination method for the handheld pointer device 10 is only an implementation and does not limit the scope of the present disclosure. Additionally, computing the pointing coordinate of the handheld pointer device 10 relative to the display apparatus 20 according to the image position of one or more reference points formed in frames captured is a known technique in the art and is not the focus of the present disclosure; thus, further descriptions are hereby omitted.
The input unit 14 is configured to enable a user of the handheld pointer device 10 to configure the frame capturing rate and the calibration parameters, which include but are not limited to the calibration time, the number of calibrations, and the amount of compensation applied in each calibration. For instance, the user of the handheld pointer device 10 may set the frame capturing rate according to a preset calibration time and configure the number of calibrations according to the predetermined frame capturing rate. For another instance, the user may also determine and set the number of calibrations based on the configured frame capturing rate. The frame capturing rate may be configured according to the frame refresh rate of the display apparatus 20.
The input unit 14 is configured to cause the display apparatus 20 to display a configuration or setting interface provided for the user to configure the calibration time, the frame capturing rate, and/or the number of calibrations for correcting the cursor position. In practice, the input unit 14 may be implemented by a keypad interface, an optical finger navigation component, or a button, and the present disclosure is not limited thereto. In one embodiment, the handheld pointer device 10 has a display screen (not shown), and the display screen can be configured to show the calibration time, the frame capturing rate, the number of calibrations for correcting the cursor position, and the amount of compensation applied in each calibration. The display screen of the handheld pointer device 10 may be a touch screen.
The memory unit 15 can be configured to store operation parameters of the handheld pointer device 10 including but not limited to the first pointing coordinate, the second pointing coordinate, the third pointing coordinate, the first tilt angle, the second tilt angle, the first predetermined threshold, the preset angle, and the cursor parameter. The memory unit 15 can be also configured to store the calibration time, the frame capturing rate and the number of calibrations for the cursor according to the operation of the handheld pointer device 10.
The processing unit 13 in the instant embodiment can be implemented by a processing chip such as a microcontroller or an embedded controller programmed with the necessary firmware; however, the present disclosure is not limited to the example provided herein. The memory unit 15 can be implemented by a volatile memory chip or a nonvolatile memory chip including but not limited to a flash memory chip, a read-only memory chip, or a random access memory chip. The communication unit 16 can be configured to utilize Bluetooth technology and transmit the relative movement information to the display apparatus 20, but the present disclosure is not limited thereto.
It should be noted that the internal components of the handheld pointer device 10 may be added, removed, adjusted, or replaced according to the functional requirements or design requirements, and the present disclosure is not limited thereto. That is, the exact type, exact structure, and/or implementation method associated with the image capturing unit 11, the accelerometer unit 12, the processing unit 13, the input unit 14, the memory unit 15, and the communication unit 16 may depend upon the practical structure and the exact implementation method adopted for the handheld pointer device 10, and the present disclosure is not limited thereto.
The instant embodiment further provides a pointer positioning method for the handheld pointer device 10 to illustrate the operation of the handheld pointer device 10 in more detail. Please refer to FIG. 3 in conjunction with FIG. 1, FIG. 2, and FIG. 4A˜FIG. 4D. FIG. 3 shows a flowchart diagram illustrating a pointer positioning method of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure. FIG. 4A˜FIG. 4B are diagrams respectively illustrating image positions of the reference point detected as the handheld pointer device moves, provided in accordance to an exemplary embodiment of the present disclosure. FIG. 4C shows a diagram illustrating the image positions of the reference point computed using different tilt angles, provided in accordance to an exemplary embodiment of the present disclosure. FIG. 4D shows a diagram illustrating the image position of the reference point detected as the handheld pointer device moves and the corresponding movement of the cursor displayed on a display apparatus, provided in accordance to an exemplary embodiment of the present disclosure.
Initially, in Step S301, when the processing unit 13 of the handheld pointer device 10 updates the first tilt angle θ1 presently used to the second tilt angle θ2, the processing unit 13 drives the image capturing unit 11 to capture and generate the first frame F1 containing the reference point 21.
It should be noted that the processing unit 13 can operatively determine whether to update the first tilt angle θ1 presently used in the cursor position computation to the second tilt angle θ2 by determining whether the image position of the reference point 21 formed in the multiple consecutive images captured by the image capturing unit 11 has substantially moved.
Specifically, the processing unit 13 can operatively determine whether to update the first tilt angle θ1 presently used in cursor position computation to the second tilt angle θ2 according to frames containing the reference point 21 captured and generated by the image capturing unit 11.
In one embodiment, the processing unit 13 operatively updates the first tilt angle θ1 presently used in the cursor position computation to the second tilt angle θ2 upon determining that the position displacement of the image position of the reference point 21 formed in any two consecutive frames is less than a predefined displacement threshold (e.g., 1 pixel). In another embodiment, the processing unit 13 operatively updates the first tilt angle θ1 presently used in the cursor position computation to the second tilt angle θ2 upon determining that the velocity of the image position of the reference point 21 formed in any two consecutive frames is less than a predefined velocity threshold (e.g., 1 pixel per unit time). In yet another embodiment, the processing unit 13 may operatively update the first tilt angle θ1 presently used in the cursor position computation to the second tilt angle θ2 upon determining that the magnitude of the acceleration vector of the handheld pointer device 10 is equal to the gravitational acceleration (g) of the handheld pointer device 10, wherein the acceleration vector is generated based on the accelerations of the handheld pointer device 10 detected over multiple axes.
In other words, the processing unit 13 operatively reads accelerations of the handheld pointer device 10 over multiple axes (e.g., X-axis, Y-axis, and Z-axis) detected by the accelerometer unit 12 and correspondingly updates the first tilt angle θ1 presently used to the computed second tilt angle θ2 upon determining that the sensed position of the reference point 21 has not substantially moved (i.e., the handheld pointer device 10 is at rest).
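The three at-rest conditions described above can be sketched as a single decision function. All names and threshold values are illustrative; the disclosure only requires that the device be effectively at rest (reference-point image not substantially moving, or net acceleration equal to gravity).

```python
def should_update_tilt(p_prev, p_curr, accel, dt=1.0,
                       disp_thresh=1.0, vel_thresh=1.0, g=9.8, g_tol=0.05):
    """Decide whether the tilt angle presently used for cursor computation
    may be replaced by a newly measured one. p_prev/p_curr are the
    reference-point image positions in two consecutive frames (pixels);
    accel is the multi-axis accelerometer reading. Thresholds follow the
    examples in the text (1 pixel, 1 pixel per unit time)."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    disp = (dx * dx + dy * dy) ** 0.5       # displacement between frames
    if disp < disp_thresh:                  # condition 1: < 1 pixel moved
        return True
    if disp / dt < vel_thresh:              # condition 2: < 1 pixel per unit time
        return True
    a_mag = sum(a * a for a in accel) ** 0.5
    return abs(a_mag - g) < g_tol           # condition 3: only gravity acting
```

A displacement of half a pixel, for instance, already satisfies the first condition regardless of the accelerometer reading.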
In Step S303, the processing unit 13 computes the first pointing coordinate p1(θ1) based on the image position of the reference point 21 formed in the first frame F1 and the first tilt angle θ1. As shown in FIG. 4A, the first pointing coordinate p1(θ1) represents the pointing vector of the handheld pointer device 10 relative to the display apparatus 20 in the first frame F1. The first pointing coordinate p1(θ1) is represented by (x1, y1).
The processing unit 13 further computes the cursor position of the cursor 23 according to the first pointing coordinate p1(θ1), so as to correspondingly generate the cursor parameter controlling the display position of the cursor 23 on the display apparatus 20. Subsequently, the processing unit 13 drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 for correspondingly controlling the display position of the cursor 23 on the display apparatus 20.
Incidentally, the computation of the first pointing coordinate p1(θ1) is described as follows. The processing unit 13 first defines an operating area 111 on the first frame F1 that corresponds to the display apparatus 20 according to the center point “+” of the first frame F1 and the image position of the reference point image 113 formed in the first frame F1. The operating area 111 corresponds to the screen of the display apparatus 20 and is scaled with a predetermined display ratio. The processing unit 13 defines the operating area 111 in the first frame F1 by using the image position of the reference point image 113 as the origin and scaling with the predetermined display ratio. The processing unit 13 further defines the center 1111 of the operating area 111 in the first frame F1. As such, the processing unit 13 may set the center 1111 of the operating area 111 as the origin, apply Eqs. (1)˜(4) along with the first tilt angle θ1, and compute the pointing vector of the center point “+” of the first frame F1 in the operating area 111 to obtain the first pointing coordinate p1(θ1).
It is worth noting that it is not necessary to define the center 1111 of the operating area to obtain the first pointing coordinate p1(θ1); the processing unit 13 may also obtain the first pointing coordinate p1(θ1) by computing the rotation angle of the handheld pointer device 10. The rotation angle of the handheld pointer device 10 is computed directly according to the relationship between the center point “+” of the first frame F1 and the image position of the reference point image 113 in the first frame F1, or according to the image feature of the reference point image 113.
The center point “+” in the instant embodiment represents the center of the image sensing array of the image capturing unit 11. Alternatively, the first pointing coordinate p1(θ1) represents the pointing vector of the center of the image sensing array of the image capturing unit 11 (i.e., the center point “+”) in the first frame F1 with respect to the coordinate system of the display apparatus 20 defined therein.
In Step S305, the processing unit 13 computes the second pointing coordinate p2(θ2) based on the image position of the reference point 21 formed in the first frame F1 and the second tilt angle θ2.
As shown in FIG. 4B, the second pointing coordinate p2(θ2) represents the pointing vector computed by mapping the center of the image sensing array of the image capturing unit 11 (i.e., the center point “+”) onto the operating area 111a, which corresponds to the screen of the display apparatus 20 defined in the first frame F1. The second pointing coordinate p2(θ2) is represented by (x2, y2). The processing unit 13 uses the center 1111a of the operating area 111a as the origin and correspondingly computes the pointing vector of the center point “+” of the first frame F1 in the operating area 111a using the second tilt angle θ2, so as to obtain the second pointing coordinate p2(θ2). The operating area 111a is defined based on the position of the reference point image 113a.
As shown in FIG. 4C, the processing unit 13 subsequently computes the first displacement vector S1 associated with the pointing coordinates computed for the same frame after the tilt angle adjustment, using the first pointing coordinate p1(θ1) and the second pointing coordinate p2(θ2). The processing unit 13 then stores the first displacement vector S1 in the memory unit 15.
In Step S307, the processing unit 13 drives the image capturing unit 11 to capture and generate the second frame F2 containing the reference point 21. The processing unit 13 computes the third pointing coordinate p3(θ2) based on the image position of the reference point 21 formed in the second frame F2 and the second tilt angle θ2. The second frame F2 is captured at a later time than the first frame F1. As shown in FIG. 4D, the third pointing coordinate p3(θ2) represents the pointing vector computed by mapping the center of the image sensing array of the image capturing unit 11 (i.e., the center point “+”) onto the operating area 111b, which corresponds to the screen of the display apparatus 20 defined in the second frame F2. The third pointing coordinate p3(θ2) is represented by (x3, y3). The operating area 111b is defined based on the reference point image 113b.
Subsequently, in Step S309, the processing unit 13 determines whether an angle difference θd between the first tilt angle θ1 and the second tilt angle θ2 is smaller than a preset angle (e.g., 20 degrees). When the processing unit 13 determines that the angle difference θd is smaller than the preset angle, the processing unit 13 executes Step S313; otherwise, the processing unit 13 executes Step S311.
In Step S311, the processing unit 13 determines whether the first displacement vector S1 between the first pointing coordinate p1(θ1) and the second pointing coordinate p2(θ2) is less than a first predetermined threshold (e.g., 10 pixels). When the processing unit 13 determines that the first displacement vector S1 is less than the first predetermined threshold (e.g., 10 pixels), the processing unit 13 executes Step S313; otherwise, the processing unit 13 executes Step S315. The first predetermined threshold can be configured according to the preset angle, e.g., set to a pixel value that corresponds to an angle difference of 20 degrees.
In Step S313, the processing unit 13 computes the cursor position directly according to the third pointing coordinate p3(θ2). That is to say, when the processing unit 13 determines that either the angle difference θd is smaller than the preset angle or the first displacement vector S1 is less than the first predetermined threshold, the processing unit 13 does not compensate the third pointing coordinate p3(θ2); instead, the processing unit 13 directly computes the cursor position according to the third pointing coordinate p3(θ2).
In Step S315, the processing unit 13 computes the cursor position according to the first pointing coordinate p1(θ1), the second pointing coordinate p2(θ2), and the third pointing coordinate p3(θ2). Particularly, the processing unit 13 first computes a compensated third pointing coordinate p3(θ2)′ according to the third pointing coordinate p3(θ2) and the first displacement vector S1. Afterward, the processing unit 13 computes the cursor position according to the compensated third pointing coordinate p3(θ2)′, thereby compensating for the offset between the first pointing coordinate p1(θ1) and the second pointing coordinate p2(θ2).
The compensated third pointing coordinate p3(θ2)′ can be computed using Eq. (5):

p3(θ2)′ = p3(θ2) + S1   (5)

wherein p3(θ2)′ represents the compensated third pointing coordinate; p3(θ2) represents the third pointing coordinate; and S1 represents the first displacement vector.
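Steps S309 through S315 can be sketched together as one decision function. The positive sign applied to S1 follows a reconstruction of Eq. (5) consistent with the gradual compensation in Eqs. (8) and (9); the names and threshold values (20 degrees, 10 pixels) follow the examples in the text and are otherwise illustrative.

```python
def compensate_pointing(p1, p2, p3, angle_diff, preset_angle=20.0,
                        disp_thresh=10.0):
    """Steps S309-S315 in sketch form. Skip compensation when the tilt
    change is small (angle difference below the preset angle, or the
    first displacement vector S1 = p2 - p1 shorter than the threshold);
    otherwise apply Eq. (5) as reconstructed: p3' = p3 + S1."""
    s1 = (p2[0] - p1[0], p2[1] - p1[1])       # first displacement vector
    s1_len = (s1[0] ** 2 + s1[1] ** 2) ** 0.5
    if angle_diff < preset_angle or s1_len < disp_thresh:
        return p3                              # Step S313: use p3 directly
    return (p3[0] + s1[0], p3[1] + s1[1])      # Step S315: Eq. (5)
```

For instance, a 5-degree angle difference leaves p3 untouched, while a 30-degree difference with a 50-pixel displacement shifts p3 by the full S1.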
In Step S317, the processing unit 13 generates the cursor parameter for correspondingly controlling the movement of the cursor 23 based on the computation result from either Step S313 or Step S315. The processing unit 13 subsequently drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 to correspondingly control the movement (i.e., the display position) of the cursor 23 on the display apparatus 20.
It is worth noting that, as shown in FIG. 4D, since the third pointing coordinate p3(θ2) lies within the operating area 111 of the first frame F1, the display apparatus 20 will correspondingly display the cursor 23 on the display area of its screen according to a display aspect ratio configured for the display apparatus 20 upon receiving the cursor parameter. Specifically, when the handheld pointer device 10 transmits the cursor parameter for controlling the display position of the cursor 23 along with the predetermined display ratio to the display apparatus 20 via the communication unit 16, the display apparatus 20 operatively computes the display position of the cursor 23 and correspondingly positions the cursor 23 on its screen according to the current display aspect ratio (i.e., the resolution of the display apparatus 20). Those skilled in the art should be able to infer the method of computing the display position of the cursor 23 on the screen according to the current display aspect ratio and the cursor parameter; hence, further descriptions are hereby omitted.
It is worth mentioning that, as shown in FIG. 4A˜FIG. 4C, the reference point images 113, 113a, and 113b in the instant disclosure are each represented by a circle; however, the reference point images 113, 113a, and 113b may also be represented by a cross-shaped or a star-shaped symbol. The present disclosure is not limited to the example illustrated in FIG. 4A˜FIG. 4C. Additionally, if the interactive system of FIG. 2 utilizes two or more reference points 21, then the image positions of the reference point images 113, 113a, and 113b formed in the frames can be configured to be the average coordinate among the reference point images identified. Moreover, the processing unit 13 can further compensate the computation of the image positions of the reference point images according to the preset image-forming parameters and the preset image-forming distance, so as to accurately determine the position of the reference point image. Those skilled in the art should know the configuration of the preset image-forming parameters and the image-forming distance, as well as how to apply compensation to the image positions of the reference point images 113, 113a, and 113b computed in the frames using the preset image-forming parameters and the image-forming distance; further details are hereby omitted.
Please refer to FIG. 5 in conjunction with FIG. 1 for a clearer understanding of the operation of the pointer positioning method for the handheld pointer device 10. FIG. 5 shows a diagram illustrating the movement of the cursor displayed on the display apparatus 20 provided in accordance to an exemplary embodiment of the present disclosure.
The display position of the cursor 23a corresponds to the pointing coordinate computed by the handheld pointer device 10 at time TA using the first tilt angle θ1. The display position of the cursor 23b corresponds to the pointing coordinate computed at time TB using the first tilt angle θ1. The display position of the cursor 23c corresponds to the pointing coordinate computed at time TC using the first tilt angle θ1. At time TC, the handheld pointer device 10 updates the first tilt angle θ1 to the second tilt angle θ2. At the same time, the handheld pointer device 10 computes the first pointing coordinate p1(θ1) and the second pointing coordinate p2(θ2) using the first tilt angle θ1 and the second tilt angle θ2, respectively, to obtain the first displacement vector S1. The display position of the cursor 23d corresponds to the compensated third pointing coordinate p3(θ2)′ computed by the handheld pointer device 10 at time TD according to the first displacement vector S1 and the second tilt angle θ2. The display position of the cursor 25a corresponds to the pointing coordinate directly computed by the handheld pointer device 10 at time TC using the second tilt angle θ2 without any compensation. That is, when no compensation is applied to the pointing coordinate computed after the handheld pointer device 10 updates the tilt angle used, the display position of the cursor will be at the position corresponding to the cursor 25a. More specifically, as shown in FIG. 5, the display position of the cursor 23c will suddenly jump to the display position of the cursor 25a when no compensation is applied to the pointing coordinate after the tilt angle adjustment, which degrades the user's operability.
Therefore, during the cursor position computation, the method described in the instant embodiment gradually compensates the pointing coordinates computed after the tilt angle update according to the displacement generated by the update, positioning the cursor at the display position of the cursor 23d, which is a distance d from the display position of the cursor 25a obtained without calibration and compensation. This effectively resolves the cursor jumping issue.
In short, the handheld pointer device 10 of the instant embodiment is operable to determine whether or not to compensate the pointing coordinates after updating the first tilt angle θ1 to the second tilt angle θ2 (e.g., whether the cursor jumping issue is noticeable to the user). Moreover, when the handheld pointer device 10 determines to compensate the pointing coordinates computed after the tilt angle update, the handheld pointer device 10 compensates the pointing coordinates according to the displacement generated after the handheld pointer device 10 updates the first tilt angle θ1 to the second tilt angle θ2.
To increase the cursor control precision and improve the user's operability, the instant embodiment further provides a cursor position calibration algorithm. The cursor position calibration algorithm can cause the controlled cursor to translate smoothly from the current moving path to the movement path that corresponds to the actual movement path of the handheld pointer device 10 after the tilt angle update, within a preset calibration time or a preset number of calibrations. In this way, the cursor is prevented from jumping from one place to another while the directivity of the handheld pointer device 10 is maintained.
Details on the implementation of the cursor position calibration algorithm are provided in the following paragraphs. Please refer to FIG. 6 and FIG. 7 in conjunction with FIG. 2. FIG. 6 shows a flowchart diagram illustrating a method for calibrating the cursor position after the tilt angle update provided in accordance to an exemplary embodiment of the present disclosure. FIG. 7 shows a diagram illustrating the movement of the cursor displayed on the display apparatus along with the movement of a handheld pointer device provided in accordance to an exemplary embodiment of the present disclosure.
In Step S601, when the processing unit 13 updates the first tilt angle θ1 to the second tilt angle θ2, the processing unit 13 initiates a cursor position calibration program and causes the handheld pointer device 10 to operate in a cursor calibration mode.
In Step S603, the processing unit 13 sets the number of calibrations as N, a compensation vector as C, and a calibration coordinate as pc. The calibration coordinate pc herein is the pointing coordinate that requires compensation, such as the third pointing coordinate p3(θ2) computed based on the image position of the reference point in the second frame F2 and the second tilt angle θ2. The processing unit 13 stores N, C, and the calibration coordinate pc in the memory unit 15.
More specifically, the processing unit 13 determines whether the first displacement vector S1 is greater than a second predetermined threshold. When the processing unit 13 determines that the first displacement vector S1 is greater than the second predetermined threshold, the processing unit 13 operatively sets N equal to the first displacement vector S1 divided by C, wherein C is a predetermined compensation value. On the contrary, when the processing unit 13 determines that the first displacement vector S1 is less than the second predetermined threshold, the processing unit 13 operatively sets C equal to the first displacement vector S1 divided by N, wherein N is a predetermined number of calibrations.
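The two branches of Step S603 can be sketched as follows. The second predetermined threshold and the default values of C and N are illustrative choices; only the branching rule (fix C and derive N for a large displacement, fix N and derive C for a small one) comes from the text.

```python
def set_calibration(s1_len, second_thresh=10.0, default_c=3.0, default_n=5):
    """Step S603 in sketch form. s1_len is the magnitude of the first
    displacement vector S1. With a large displacement, C is the
    predetermined compensation value and N = |S1| / C (Eq. (7));
    with a small displacement, N is the predetermined number of
    calibrations and C = |S1| / N (Eq. (6))."""
    if s1_len > second_thresh:
        c = default_c
        n = s1_len / c          # more calibration steps, constant step size
    else:
        n = default_n
        c = s1_len / n          # fixed step count, smaller step size
    return n, c
```

For example, a 30-pixel displacement with C fixed at 3 yields N = 10 steps, while a 5-pixel displacement spread over N = 5 steps yields C = 1 pixel per step.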
It is worth noting that the aforementioned first predetermined threshold and second predetermined threshold may be configured to be the same or different, depending upon the practical operational requirements of the handheld pointer device 10 and/or the type of the software application executed on the display apparatus 20.
Briefly, when the first displacement vector S1 is determined to be greater than the second predetermined threshold, indicating that the angle difference is relatively large and requires a larger compensation vector, the processing unit 13 automatically selects the constant compensation method and gradually compensates the pointing coordinates computed after the tilt angle update, so as to avoid the occurrence of cursor jumping negatively affecting the user operation. When the processing unit 13 determines that the first displacement vector S1 is less than the second predetermined threshold, indicating that the angle difference is relatively small, the processing unit 13 quickly compensates and corrects the pointing coordinates computed within the preset number of calibrations.
Particularly, when the processing unit 13 determines to compute C according to the first displacement vector S1 and N, the processing unit 13 can use Eq. (6) to compute C:

C = S1/N = (p2(θ2) − p1(θ1))/N   (6)

wherein C represents the compensation vector; S1 represents the first displacement vector; p1(θ1) represents the first pointing coordinate; p2(θ2) represents the second pointing coordinate; and N represents the number of calibrations, where N is a constant value. According to Eq. (6), the larger N is, the smaller C is per calibration; the smaller N is, the larger C is per calibration.
In one embodiment, the processing unit 13 may set N according to the frame capturing rate or the preset calibration time configured by the user via the input unit 14. For instance, when the user configures the handheld pointer device 10 to complete the cursor position calibration program within 5 frames based on the frame capturing rate, the processing unit 13 sets N to be 5 and computes C according to N and the first displacement vector S1. For another instance, when the user configures the preset calibration time to be 5 seconds (i.e., causes the handheld pointer device 10 to complete the cursor position calibration program within 5 seconds) and configures the frame capturing rate to be 5 frames per second, the processing unit 13 operatively sets N to be 25 and computes C according to N and the first displacement vector S1.
On the other hand, when the processing unit 13 determines to compute N according to the first displacement vector S1 and C, the processing unit 13 can use Eq. (7) to compute N:

N = S1/C = (p2(θ2) − p1(θ1))/C   (7)

wherein C represents the compensation vector, where C is a constant value; S1 represents the first displacement vector; p1(θ1) represents the first pointing coordinate; p2(θ2) represents the second pointing coordinate; and N represents the number of calibrations. According to Eq. (7), the larger C is, the smaller N is; the smaller C is, the larger N is.
In one embodiment, the processing unit 13 can configure C according to the resolution of the display apparatus 20 provided by the user via the input unit 14. For example, when the user configures the handheld pointer device 10 to correct a one-degree difference per calibration in accordance with the resolution of the display apparatus 20, and each degree corresponds to three pixels, the processing unit 13 operatively sets C to be 3 and computes N according to C and the first displacement vector S1.
The user of the handheld pointer device 10 as described may also configure N and C based on the accuracy or precision needed by the software application executed on the display apparatus 20 through the user interface provided by the input unit 14.
In Step S605, the processing unit 13 determines whether to update the second tilt angle θ2 to a third tilt angle θ3. When the processing unit 13 determines to update the second tilt angle θ2 to the third tilt angle θ3, the processing unit 13 executes Step S607; otherwise, the processing unit 13 executes Step S611.
In Step S607, the processing unit 13 computes a second displacement vector S2 generated due to the instant rotation of the handheld pointer device 10. Specifically, the processing unit 13 drives the image capturing unit 11 to capture and generate a third frame F3. The processing unit 13 computes a fourth pointing coordinate p4(θ2) and a fifth pointing coordinate p5(θ3) using the second tilt angle θ2 and the third tilt angle θ3, respectively, in coordination with the image position of the reference point formed in the third frame F3. The processing unit 13 subsequently computes the second displacement vector S2 according to the fourth pointing coordinate p4(θ2) and the fifth pointing coordinate p5(θ3). The third frame F3 is captured and generated at a later time than the second frame F2.
In other words, during the execution of the cursor position calibration program, the processing unit 13 operatively determines whether the handheld pointer device 10 has generated a new rotation angle under the user's operation, and correspondingly compensates the computed cursor position according to the displacement generated after updating the second tilt angle θ2 to the third tilt angle θ3, thereby improving the directivity of the handheld pointer device 10 while at the same time resolving the cursor jumping issue.
In Step S609, when the processing unit 13 determines that the second tilt angle θ2 has been updated to the third tilt angle θ3, the processing unit 13 computes the sum of the calibration coordinate p, the second displacement vector V2, and C to generate a compensated pointing coordinate p′, e.g., the compensated third pointing coordinate p3′(θ2). Particularly, the compensated pointing coordinate p′ is computed using Eq. (8),

p′ = p + C = p3(θ2) + (p2(θ2) - p1(θ1) + (p5(θ3) - p4(θ2)))/N,  (8)
wherein p′ represents the compensated pointing coordinate; p represents the calibration coordinate; C represents the compensation vector; p1(θ1) represents the first pointing coordinate; p2(θ2) represents the second pointing coordinate; p3(θ2) represents the third pointing coordinate; p4(θ2) represents the fourth pointing coordinate; p5(θ3) represents the fifth pointing coordinate; and N represents the number of calibrations.
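As a concrete check of the arithmetic in Eq. (8), the formula can be written out as a small function. The function and variable names are hypothetical; only the equation itself is taken from the text.

```python
def compensated_coordinate(p3, p1, p2, p4, p5, n):
    """Eq. (8): p' = p3(th2) + (p2(th2) - p1(th1) + (p5(th3) - p4(th2))) / N.
    The first difference is the displacement caused by the th1 -> th2 update;
    the second is the displacement caused by the th2 -> th3 update."""
    cx = (p2[0] - p1[0] + p5[0] - p4[0]) / n
    cy = (p2[1] - p1[1] + p5[1] - p4[1]) / n
    return (p3[0] + cx, p3[1] + cy)
```

With N > 1 only a fraction of the accumulated jump is applied per frame, which is what makes the cursor translate gradually rather than leap.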
In Step S611, when the processing unit 13 determines that the second tilt angle θ2 has not been updated, i.e., no tilt angle update operation has been executed, the processing unit 13 computes the sum of the calibration coordinate p and C to generate the compensated pointing coordinate p′, e.g., the compensated third pointing coordinate p3′(θ2). Particularly, the compensated pointing coordinate p′ is computed using Eq. (9),

p′ = p + C = p3(θ2) + (p2(θ2) - p1(θ1))/N,  (9)
wherein p′ represents the compensated pointing coordinate; p represents the calibration coordinate; C represents the compensation vector; p1(θ1) represents the first pointing coordinate; p2(θ2) represents the second pointing coordinate; p3(θ2) represents the third pointing coordinate; and N represents the number of calibrations.
In Step S613, the processing unit 13 generates and outputs the cursor parameter for controlling the display position of the cursor on the display apparatus 20 according to the computed compensated pointing coordinate p′. The processing unit 13 drives the communication unit 16 to output the cursor parameter to the display apparatus 20 and causes the cursor to smoothly translate a distance d1 from the current cursor position (i.e., the display position of the cursor 33 a shown in FIG. 7) to the target position (i.e., the display position of the cursor 33 b shown in FIG. 7). The display position of the cursor 33 a corresponds to the pointing coordinate computed by the handheld pointer device 10 using the first tilt angle θ1. The display position of the cursor 35 a corresponds to the pointing coordinate computed by the handheld pointer device 10 using the second tilt angle θ2.
In Step S615, the processing unit 13 sets the calibration coordinate p to be the newly computed pointing coordinate, e.g., a sixth pointing coordinate. The sixth pointing coordinate is computed according to the image position of the reference point formed in a fourth frame F4 using the second tilt angle θ2 or the third tilt angle θ3 (e.g., when the handheld pointer device 10 has updated the second tilt angle θ2 to the third tilt angle θ3). In Step S617, the processing unit 13 executes N−1, i.e., decrements the number of calibrations by one. The processing unit 13 stores the calibration coordinate p and the decremented number of calibrations in the memory unit 15. In Step S619, the processing unit 13 determines whether N is equal to zero, i.e., whether the cursor position calibration program has been completed.
When the processing unit 13 determines that N is equal to zero, i.e., the cursor position calibration program has been completed, the processing unit 13 executes Step S621. Conversely, when the processing unit 13 determines that N is not equal to zero, i.e., the cursor position calibration program has not been completed, the processing unit 13 returns to Step S605. Specifically, the processing unit 13 drives the image capturing unit 11 to capture a fifth frame F5 and performs the steps of computing a seventh pointing coordinate of the handheld pointer device 10 relative to the reference point according to the image position of the reference point formed in the fifth frame F5 and the second tilt angle θ2 or the third tilt angle θ3, setting the seventh pointing coordinate as the calibration coordinate, and computing the compensated pointing coordinate based on the calibration coordinate and C. As a result, the cursor displayed on the display apparatus 20 translates, or moves, a distance d2 from the display position of the cursor 33 b to the display position of the cursor 33 c as illustrated in FIG. 7.
Thereafter, the processing unit 13 re-executes Steps S605˜S619 and sequentially captures N−2 frames (not shown), continuing to compensate the computed pointing coordinates and compute the cursor position accordingly, until N is equal to zero.
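The iteration over Steps S605˜S619 described above can be condensed into the following sketch. The names are hypothetical, frame capture is replaced by a list of precomputed pointing coordinates, and the tilt-angle-update branch of Step S607 is omitted for brevity.

```python
def run_calibration(coords, c_vec, n):
    """Each pass takes the newest pointing coordinate as the calibration
    coordinate, adds the compensation vector C, decrements N, and stops
    when N reaches zero (Steps S611, S615, S617, S619)."""
    positions = []
    for p in coords:  # one pointing coordinate per captured frame
        if n == 0:
            break
        compensated = (p[0] + c_vec[0], p[1] + c_vec[1])
        positions.append(compensated)  # cursor parameter for this frame
        n -= 1
    return positions
```

Each emitted position corresponds to one of the per-frame distances d1, d2, ... in FIG. 7, so the cursor advances by one compensation step per captured frame until the calibration budget N is exhausted.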
When the processing unit 13 has completed the cursor position calibration program, as shown in FIG. 7, the cursor displayed on the display apparatus 20 smoothly moves N times from the display position that corresponds to the first pointing coordinate p1(θ1) (e.g., the display position of the cursor 33 a) to the position currently pointed at by the handheld pointer device 10. More specifically, the cursor displayed on the display apparatus 20 is translated smoothly from the display position (i.e., the display position of the cursor 33 a) that corresponds to the first pointing coordinate p1(θ1) to the display position (i.e., the display position of the cursor 33N) that corresponds to the position currently pointed at by the handheld pointer device 10 relative to the display apparatus 20, in accordance with the distances d1, d2, d3, . . . , dN computed through the Nth frame.
In Step S621, the processing unit 13 computes the cursor position in the subsequent movement of the handheld pointer device 10 according to the image position of the reference point formed in one of the captured frames, along with the tilt angle presently used in cursor position computation, so as to improve the accuracy of the cursor control operation.
It is worth noting that, whenever the handheld pointer device 10 updates its tilt angle during the cursor position calibration, the processing unit 13 accumulates the generated displacement and correspondingly adjusts the amount of compensation applied, i.e., adjusts C, to maintain the directivity of the handheld pointer device 10. The processing unit 13 further operatively determines whether to incorporate the displacement generated after the tilt angle update into the pointing coordinate compensation computation according to the angle difference before and after the tilt angle update and/or the magnitude of the displacement vector generated after the tilt angle update.
Additionally, the processing unit 13 may also constantly communicate with the display apparatus 20 via the communication unit 16 during the operation of the handheld pointer device 10, so as to obtain information associated with the software application executed on the display apparatus 20, including but not limited to the type and the execution progress of the software application, the frame refresh rate, and the resolution required by the display apparatus 20 in the execution of the software application. The processing unit 13 can operatively determine whether or not to execute the cursor position calibration program, as well as configure the calibration parameters for the cursor position calibration program, according to the information obtained from the display apparatus 20. The calibration parameters for the cursor position calibration program in the instant embodiment include but are not limited to the predetermined threshold (such as the first and the second predetermined thresholds), the preset angle, the number of calibrations, the calibration time, and the amount of compensation in each calibration.
In practice, the pointer positioning method of FIG. 3 and the method of calibrating the cursor position after the tilt angle update can be implemented by writing the corresponding program codes into the processing unit 13 (such as a microcontroller or an embedded controller) via firmware design, to be executed by the processing unit 13 during the operation of the handheld pointer device 10; however, the present disclosure is not limited thereto.
FIG. 3 is merely used for illustrating a pointer positioning method for the handheld pointer device 10, and the present disclosure is not limited thereto. Similarly, FIG. 6 is merely used for illustrating an implementation method of the cursor position calibration algorithm and shall not be used to limit the present disclosure. FIG. 4A˜FIG. 4D are merely used to illustrate the computation of pointing coordinates and the relationship between the operating area (i.e., the display area) of the display apparatus 20 and the center of the image sensing array of the image capturing unit 11 (i.e., the center pointer “+”), and should not be used to limit the present disclosure. FIG. 5 and FIG. 7 are merely used to illustrate the operation of the handheld pointer device 10 and the pointer positioning method in coordination with FIG. 3 and FIG. 6, respectively, and the present disclosure is not limited thereto.
(Another Exemplary Embodiment of a Handheld Pointer Device)
From the aforementioned exemplary embodiments, the present disclosure can generalize another pointer positioning method for the aforementioned handheld pointer device of the interactive system. Please refer to FIG. 8 in conjunction with FIG. 1 and FIG. 2. FIG. 8 shows a flowchart diagram illustrating a pointer positioning method provided in accordance to another exemplary embodiment of the present disclosure. The pointer positioning method of FIG. 8 can be implemented by programming the processing unit 13 via firmware design and executed by the processing unit 13 during the operation of the handheld pointer device 10.
In Step S801, the processing unit 13 of the handheld pointer device 10 determines whether to update a first tilt angle presently used in the cursor position computation to a second tilt angle. When the processing unit 13 determines to update the first tilt angle presently used in the cursor position computation to the second tilt angle, the processing unit 13 executes Step S803; otherwise, the processing unit 13 returns to Step S801.
Specifically, the processing unit 13 operatively determines whether the reference point 21 has substantially moved according to a plurality of frames generated by the image capturing unit 11, wherein the image capturing unit 11 captures images corresponding to the position of the reference point 21 and sequentially generates the plurality of frames. The processing unit 13 of the handheld pointer device 10 determines to update the first tilt angle to the second tilt angle upon determining that the reference point 21 has not substantially moved, i.e., the handheld pointer device 10 is at rest.
Incidentally, in other embodiments, the handheld pointer device 10 may also determine whether to update the first tilt angle to the second tilt angle by determining whether the pointing coordinate computed based on the image position of the reference point 21 in the captured frames has substantially moved. For instance, when the processing unit 13 of the handheld pointer device 10 determines that the pointing coordinate computed based on the position of the reference point 21 has not substantially moved, the processing unit 13 updates the first tilt angle presently used in the cursor position computation to the second tilt angle.
In Step S803, the processing unit 13 operatively drives the image capturing unit 11 to capture and generate a first frame containing the reference point 21 after the processing unit 13 has updated the first tilt angle presently used in the cursor position computation to the second tilt angle.
In Step S805, the processing unit 13 computes an angle difference between the first tilt angle and the second tilt angle.
In Step S807, the processing unit 13 determines whether the angle difference between the first tilt angle and the second tilt angle is smaller than a preset angle, e.g., 20 degrees. When the angle difference between the first tilt angle and the second tilt angle is computed to be smaller than the preset angle, the processing unit 13 executes Step S809; otherwise, the processing unit 13 executes Step S811.
In Step S809, the processing unit 13 drives the image capturing unit 11 to capture and generate a second frame containing the reference point 21. The second frame is captured and generated at a later time than the first frame. The processing unit 13 directly computes the cursor position of the cursor 23 based on the image position of the reference point formed in the second frame and the second tilt angle. That is to say, when the angle difference between the first tilt angle and the second tilt angle is computed to be smaller than the preset angle, e.g., 20 degrees, the processing unit 13 determines that the cursor jumping phenomenon is not noticeable to the human eye and computes the cursor position of the cursor 23 directly based on the image position of the reference point formed in the second frame without applying any compensation.
In Step S811, the processing unit 13 computes a first pointing coordinate according to the image position of the reference point 21 formed in the first frame and the first tilt angle. In Step S813, the processing unit 13 computes a second pointing coordinate according to the image position of the reference point 21 formed in the first frame and the second tilt angle. The processing unit 13 stores the first and the second pointing coordinates in the memory unit 15. The algorithms used for computing the first and the second pointing coordinates are essentially the same as described in the aforementioned embodiment, and further descriptions are hereby omitted.
In Step S815, the processing unit 13 computes the cursor position of the cursor 23 in the subsequent movement of the handheld pointer device 10 on the basis of the offset between the first pointing coordinate and the second pointing coordinate generated after the first tilt angle is updated to the second tilt angle along with the movement of the handheld pointer device 10.
In Step S817, the processing unit 13 generates a cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20 according to the computational result from either Step S809 or Step S815. The processing unit 13 drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 for correspondingly controlling the display position of the cursor 23 on the display apparatus 20.
FIG. 8 is merely used for illustrating another pointer positioning method for the handheld pointer device, and the present disclosure is not limited thereto. Those skilled in the art should be able to select a method for determining whether the handheld pointer device 10 is at rest according to the practical operation requirements of the handheld pointer device 10, so as to determine whether to cause the handheld pointer device 10 to update the first tilt angle to the second tilt angle. Suitable methods include analyzing the displacement, the velocity, or the acceleration associated with the image position of the reference point 21 formed in a set of consecutive captured frames; analyzing the displacement information of at least two pointing coordinates computed based on the image position of the reference point in a set of consecutive captured frames; or analyzing the magnitude of an acceleration vector generated based on multiple accelerations of the handheld pointer device 10 detected over multiple axes. Moreover, the method for calibrating the cursor position after the tilt angle update described in the aforementioned embodiment can be executed during the execution of Step S815.
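For instance, the first of the rest-detection options above, comparing the displacement of the reference point's image position across consecutive frames against a threshold, might be sketched as follows; the function name and the threshold value are illustrative assumptions, not values taken from the disclosure.

```python
import math

def is_at_rest(positions, displacement_threshold=2.0):
    """Return True when the reference point's image position moves less
    than the threshold (in pixels) between every pair of consecutive
    frames, i.e., the handheld pointer device can be treated as at rest."""
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        if math.hypot(x1 - x0, y1 - y0) >= displacement_threshold:
            return False
    return True
```

A velocity- or acceleration-based variant would apply the same comparison to the first or second differences of the position sequence instead.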
(Another Exemplary Embodiment of a Handheld Pointer Device)
From the aforementioned exemplary embodiments, the present disclosure can generalize another pointer positioning method for the aforementioned handheld pointer device of the interactive system. Please refer to FIG. 9 in conjunction with FIG. 1 and FIG. 2. FIG. 9 shows a flowchart diagram illustrating a pointer positioning method provided in accordance to another exemplary embodiment of the present disclosure. The pointer positioning method of FIG. 9 can be implemented by programming the processing unit 13 via firmware design and executed by the processing unit 13 during the operation of the handheld pointer device 10.
In Step S901, the processing unit 13 of the handheld pointer device 10 updates a first tilt angle presently used in the cursor position computation to a second tilt angle in a first time interval. To put it concretely, the processing unit 13 operatively reads the accelerations of the handheld pointer device 10 over multiple axes (e.g., the X-axis, Y-axis, and Z-axis) detected by the accelerometer unit 12. Particularly, the accelerometer unit 12 operatively generates an acceleration vector according to the detected accelerations of the handheld pointer device 10 and outputs the acceleration vector to the processing unit 13 in signal form (i.e., the acceleration sensing signal). The processing unit 13 then computes the instant tilt angle of the handheld pointer device 10 using Eqs. (1)˜(3) with the acceleration vector of the handheld pointer device 10 and the included angles computed between any two axes, and correspondingly updates the first tilt angle presently used to the second tilt angle.
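Eqs. (1)˜(3) are not reproduced in this passage. As an illustrative assumption only (not the patent's actual formulas), a tilt angle can be recovered from a three-axis gravity measurement along the following lines:

```python
import math

def tilt_angle_deg(ax, ay, az):
    """Estimate a tilt angle from a static three-axis acceleration vector
    (ax, ay, az): the angle between the X-axis component and the gravity
    component in the Y-Z plane. Illustrative only; the disclosure's
    Eqs. (1)-(3) are not reproduced here."""
    return math.degrees(math.atan2(ax, math.hypot(ay, az)))
```

With the device held still, the accelerometer reads only gravity, so the ratio of axis components encodes the device's orientation; this is why the tilt angle is updated when the device is determined to be at rest.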
In the first time interval, the processing unit 13 drives the image capturing unit 11 to capture and generate a first frame containing the reference point 21.
In Step S903, the processing unit 13 computes a first pointing coordinate and a second pointing coordinate in the first time interval using the first tilt angle and the second tilt angle, respectively, in coordination with the image position of the reference point 21 formed in the first frame.
At the same time, the processing unit 13 computes the cursor position according to the first pointing coordinate and generates the cursor parameter accordingly for controlling the display position of the cursor 23 on the display apparatus 20. The processing unit 13 drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 at the first time interval and causes the cursor 23 to be fixed at the first pointing coordinate. Cursor position computation and positioning methods are essentially the same as described in the aforementioned embodiment, and further details are hereby omitted.
In Step S905, the processing unit 13 computes a first displacement vector between the first pointing coordinate and the second pointing coordinate.
In Step S907, the processing unit 13 generates a compensating vector per unit displacement according to the first displacement vector. In one embodiment, the processing unit 13 operatively determines whether to compute the compensating vector per unit displacement based on a predetermined number of calibrations or on a constant amount of compensation per calibration, according to an angle difference between the first and the second tilt angles and/or the first displacement vector. When the processing unit 13 determines to complete the pointing coordinate calibration within the predetermined number of calibrations, the processing unit 13 computes the compensating vector per unit displacement by dividing the computed first displacement vector by the predetermined number of calibrations or by a calibration time. On the contrary, the processing unit 13 may set the compensating vector per unit displacement based on the amount of compensation per calibration and compute the number of calibrations by dividing the first displacement vector by the compensating vector per unit displacement.
It is worth noting that in one embodiment, the processing unit 13 may set the number of calibrations or the calibration time according to a frame capturing rate or a predetermined time. In another embodiment, the processing unit 13 can also set the number of calibrations, the calibration time, and the amount of compensation per calibration based on the type of software application, e.g., the type of game software, executed by the display apparatus 20. The calibration parameter configuration method has been explained in detail in the above-described embodiments, and further descriptions are hereby omitted.
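The two parameter-configuration strategies of Step S907, fixing N and deriving the per-step compensation, or fixing the per-step compensation and deriving N, can be sketched as follows. The threshold and default values are illustrative assumptions.

```python
import math

def calibration_parameters(displacement, second_threshold=20.0,
                           preset_n=8, preset_step=4.0):
    """For a large jump (|V| above the second threshold), hold the per-step
    compensation constant and derive N = ceil(|V| / step); for a small jump,
    hold N constant and derive C = V / N. Returns (N, C)."""
    magnitude = math.hypot(*displacement)
    if magnitude > second_threshold:
        n = max(1, math.ceil(magnitude / preset_step))
    else:
        n = preset_n
    c = (displacement[0] / n, displacement[1] / n)
    return n, c
```

Either way the product N × C recovers the full first displacement vector, so the cursor always ends at the true pointing position; the choice only trades calibration duration against per-frame step size.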
In Step S909, the processing unit 13 drives the image capturing unit 11 to capture and generate a second frame containing the reference point 21 in a second time interval. The processing unit 13 further computes a third pointing coordinate according to the image position of the reference point formed in the second frame and the second tilt angle. The second time interval occurs after the first time interval. That is, the second frame is captured at a later time than the first frame.
In Step S911, the processing unit 13 initiates a cursor position calibration program in the second time interval and computes the cursor position according to the third pointing coordinate and the compensating vector per unit displacement. Particularly, the processing unit 13 implements the cursor position calibration method depicted in FIG. 6 and calibrates the third pointing coordinate.
In Step S913, the processing unit 13 computes the display position of the cursor 23 on the display apparatus 20 at the second time interval. More specifically, the processing unit 13 computes and generates the cursor parameter according to the third pointing coordinate for controlling the display position of the cursor 23 on the display apparatus 20. The processing unit 13 further drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 to control the display position of the cursor 23 on the display apparatus 20 at the second time interval.
In Step S915, the processing unit 13 drives the image capturing unit 11 to capture and generate a third frame containing the reference point in a third time interval. The processing unit 13 then computes a fourth pointing coordinate according to the image position of the reference point 21 formed in the third frame and the second tilt angle. The third time interval occurs after the second time interval. That is, the third frame is captured at a later time than the second frame. The gap between the second and the third time intervals can be designed based on the configured preset number of calibrations or the configured preset calibration time.
In Step S917, the processing unit 13 computes the display position of the cursor 23 on the display apparatus 20 at the third time interval according to the fourth pointing coordinate. In Step S919, the processing unit 13 generates the cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20. The processing unit 13 further drives the communication unit 16 to wirelessly transmit the cursor parameter to the display apparatus 20 to control the display position of the cursor 23 on the display apparatus 20 at the third time interval.
It is worth noting that, during the second time interval, the processing unit 13 may determine whether to calibrate and compensate the pointing coordinates computed using the second tilt angle according to the first displacement vector and/or the angle difference between the first and the second tilt angles. In particular, when the first displacement vector between the first and the second pointing coordinates is less than a first predetermined threshold (e.g., 5 pixels) and/or the angle difference between the first and the second tilt angles is smaller than a preset angle (e.g., 20 degrees), the processing unit 13 does not initiate the cursor position calibration program and computes the cursor position directly according to the third pointing coordinate. Thereafter, the processing unit 13 generates the cursor parameter for controlling the display position of the cursor 23 on the display apparatus 20 accordingly.
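One reading of this gating condition (skip calibration whenever the jump would not be noticeable) can be sketched as follows, using the example values of 5 pixels and 20 degrees from the text; the function name and the "and" interpretation of the text's "and/or" are assumptions.

```python
import math

def should_calibrate(displacement, angle_diff_deg,
                     displacement_threshold=5.0, preset_angle_deg=20.0):
    """Initiate the cursor position calibration program only when the
    tilt angle update produced a noticeable jump: displacement of at
    least 5 pixels AND an angle change of at least 20 degrees
    (example thresholds from the text)."""
    return (math.hypot(*displacement) >= displacement_threshold
            and angle_diff_deg >= preset_angle_deg)
```

When this returns False, the processing unit computes the cursor position directly from the third pointing coordinate, exactly as described above.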
Additionally, the processing unit 13 in the instant embodiment can further store the first and the second tilt angles, the first pointing coordinate, the second pointing coordinate, the third pointing coordinate, the first displacement vector, and the compensating vector per unit displacement in the memory unit 15. Those skilled in the art should be able to program the processing unit 13, via firmware design, to utilize an algorithm for determining whether to update the first tilt angle to the second tilt angle in the first time interval. That is, the processing unit 13 can be programmed with the necessary program codes to determine whether the handheld pointer device 10 is in motion or at rest, e.g., whether the reference point 21 or the pointing coordinate associated with the position of the reference point 21 has substantially moved, so as to determine whether to update the tilt angle presently used by the handheld pointer device 10 in the cursor position computation.
It should be noted that FIG. 9 is merely used to describe a pointer positioning method for the handheld pointer device 10 and the present disclosure is not limited thereto.
Additionally, the present disclosure also discloses a non-transitory computer-readable medium for storing the computer-executable program codes of the pointer positioning methods depicted in FIG. 3, FIG. 8, and FIG. 9, as well as the cursor position calibration method depicted in FIG. 6. The non-transitory computer-readable medium may be a floppy disk, a hard disk, a compact disk (CD), a flash drive, a magnetic tape, an online-accessible storage database, or any type of storage media having similar functionality known to those skilled in the art.
In summary, exemplary embodiments of the present disclosure provide a handheld pointer device and a pointer positioning method thereof. The handheld pointer device and the pointer positioning method thereof can be adapted for controlling the operation of a cursor displayed on a display apparatus. The disclosed pointer positioning method operatively calibrates and corrects the pointing coordinates used in the computation of the cursor position after the handheld pointer device updates its tilt angle, so that the display position of the cursor can be adjusted to gradually move, within a preset calibration time or a preset number of calibrations, to the correct position toward which the handheld pointer device actually points. Accordingly, the issue of the cursor suddenly jumping from one place to another after the tilt angle has been updated can be effectively avoided, thereby enhancing the stability of the handheld pointer device and, at the same time, the operating convenience for the user.
Moreover, the pointer positioning method enables the handheld pointer device to actively determine whether to calibrate the pointing coordinate computed using the updated tilt angle, and to select the associated calibration and compensation method, based on the degree of precision required by the type of software application executed on the display apparatus and the resolution of the display apparatus, thereby enhancing the practicality and applicability of the handheld pointer device.
The above-mentioned descriptions represent merely the exemplary embodiments of the present disclosure, without any intention to limit the scope of the present disclosure thereto. Various equivalent changes, alterations, or modifications based on the claims of the present disclosure are all consequently viewed as being embraced by the scope of the present disclosure.

Claims (40)

What is claimed is:
1. A pointer positioning method of a handheld pointer device, comprising:
capturing a first frame containing a reference point when the handheld pointer device updates a first tilt angle presently used to a second tilt angle;
computing a first pointing coordinate according to the image position of the reference point formed in the first frame and the first tilt angle;
computing a second pointing coordinate according to the image position of the reference point formed in the first frame and the second tilt angle;
capturing a second frame containing the reference point and computing a third pointing coordinate according to the image position of the reference point formed in the second frame and the second tilt angle; and
computing a cursor position according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate and correspondingly generating a cursor parameter controlling a display position of a cursor on a display apparatus, wherein the step of computing the cursor position comprises:
generating the cursor parameter controlling the display position of the cursor on the display apparatus according to the third pointing coordinate when a first displacement vector between the first and the second pointing coordinates is computed to be less than a first predetermined threshold;
computing the cursor position according to the first displacement vector and the third pointing coordinate when the first displacement vector between the first and the second pointing coordinates is greater than the first predetermined threshold; and further comprising:
a) setting a number of calibrations as N, a compensation vector as C and a calibration coordinate, wherein the calibration coordinate is the third pointing coordinate;
b) determining whether the first displacement vector is greater than a second predetermined threshold;
c) setting N equal to the first displacement vector divided by C when determines that the first displacement vector is greater than the second predetermined threshold, wherein C is a predetermined compensation value; setting C equal to the first displacement vector divided by N when determines that the first displacement vector is less than the second predetermined threshold, wherein N is a predetermined number of calibrations;
d) computing the sum of the calibration coordinate and C to generate a compensated pointing coordinate;
e) generating the cursor parameter to correspondingly control the display position of the cursor on the display apparatus according to the compensated pointing coordinate;
f) executing N−1 and determining whether N is equal to zero; and
g) setting a fourth pointing coordinate to be the calibration coordinate and returning to step d) upon determined that N is not equal to zero; wherein the fourth pointing coordinate is computed according to the image position of the reference point formed in a third frame and the second tilt angle.
2. The pointer positioning method according to claim 1, wherein the step of computing the cursor position comprises:
generating the cursor parameter controlling the display position of the cursor on the display apparatus according to the third pointing coordinate when an angle difference between the first and the second tilt angles is computed to be smaller than a preset angle.
3. The pointer positioning method according to claim 2, wherein the preset angle is set according to the type of a software application executed on the display apparatus.
4. The pointer positioning method according to claim 1, wherein the step of computing the cursor position further comprises:
h) when determines that N is equal to zero, causes the handheld pointer device to compute the cursor position in the subsequent movement of the handheld pointer device according to the image position of the reference point formed in a succeeding frame captured and the second tilt angle.
5. The pointer positioning method according to claim 1, wherein steps before the step d) comprise:
i) determining whether to update the second tilt angle to a third tilt angle;
j) computing a second displacement vector generated as the handheld pointer device rotates when determines that the handheld pointer device has updated the second tilt angle presently used to the third tilt angle; and
k) computing the sum of the calibration coordinate, the second displacement vector, and C to generate the compensated pointing coordinate.
6. The pointer positioning method according to claim 1, wherein the predetermined number of calibrations is set according to a frame capturing rate of the handheld pointer device used for capturing frames containing the reference point.
7. The pointer positioning method according to claim 6, wherein the frame capturing rate is configured by a user according to a preset calibration time.
8. The pointer positioning method according to claim 1, wherein the first predetermined threshold is set according to the type of a software application executed on the display apparatus.
9. The pointer positioning method according to claim 1, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the reference point has not substantially moved.
10. The pointer positioning method according to claim 9, wherein the handheld pointer device determines whether the reference point has substantially moved by determining whether the image position of the reference point formed in the consecutive frames has moved.
11. The pointer positioning method according to claim 10, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the position displacement computed between the image position of the reference point formed in any two consecutive frames captured by the handheld pointer device is less than a predefined displacement threshold.
12. The pointer positioning method according to claim 10, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the velocity computed between the image position of the reference point formed in any two consecutive frames captured by the handheld pointer device is less than a predefined velocity threshold.
13. The pointer positioning method according to claim 1, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the magnitude of an acceleration vector of the handheld pointer device is equal to a gravitational acceleration of the handheld pointer device, wherein the acceleration vector is generated by the handheld pointer device according to accelerations of the handheld pointer device detected over the multiple axes.
14. The pointer positioning method according to claim 1, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the pointing coordinate, being computed based on the image position of the reference point formed in multiple consecutive frames using the first tilt angle, has not substantially moved.
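Claims 9 through 14 describe several conditions under which the device treats itself as stationary and safely refreshes the tilt angle: small inter-frame displacement of the reference point, low inter-frame velocity, and an acceleration magnitude equal to gravity. A hedged sketch of two of those checks follows; the function name, thresholds, and tolerance are illustrative, not taken from the patent.

```python
import math

def should_update_tilt_angle(img_positions, accel, g=9.81,
                             disp_thresh=2.0, accel_tol=0.2):
    """Illustrative stationarity check combining claim 11 (inter-frame
    displacement below a predefined threshold) with claim 13 (|acceleration|
    equal to gravity, here within a tolerance). Returns True when it appears
    safe to replace the first tilt angle with a fresh second tilt angle.
    """
    # claim 11: displacement of the reference point between consecutive frames
    for (x0, y0), (x1, y1) in zip(img_positions, img_positions[1:]):
        if math.hypot(x1 - x0, y1 - y0) >= disp_thresh:
            return False
    # claim 13: at rest, the accelerometer measures only gravity
    return abs(math.hypot(*accel) - g) < accel_tol
```

In practice a device would gate the tilt-angle refresh on such a check so that motion-induced accelerations do not corrupt the gravity-based tilt estimate.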
15. A pointer positioning method of a handheld pointer device, comprising:
capturing a first frame containing a reference point when the handheld pointer device updates a first tilt angle presently used to a second tilt angle;
computing an angle difference between the first and the second tilt angles;
computing a first pointing coordinate according to the image position of the reference point formed in the first frame and the first tilt angle when the angle difference is larger than a preset angle;
computing a second pointing coordinate according to the image position of the reference point formed in the first frame and the second tilt angle;
causing the handheld pointer device to compute a cursor position of a cursor in the subsequent movement of the handheld pointer device on the basis of a first displacement vector between the first and the second pointing coordinates along with the pointing coordinate generated responsive to the movement of the handheld pointer device; wherein the step of computing the cursor position comprises:
a) setting a number of calibrations as N, a compensation vector as C and a calibration coordinate, wherein the calibration coordinate is a third pointing coordinate computed according to the image position of the reference point formed in a second frame and the second tilt angle;
b) determining whether the first displacement vector is larger than a predetermined threshold;
c) setting N equal to the first displacement vector divided by C when it is determined that the first displacement vector is greater than the predetermined threshold, wherein C is a predetermined compensation value; and setting C equal to the first displacement vector divided by N when it is determined that the first displacement vector is smaller than the predetermined threshold, wherein N is a predetermined number of calibrations;
d) computing the sum of the calibration coordinate and C to generate a compensated pointing coordinate;
e) generating the cursor parameter to correspondingly control the display position of the cursor on the display apparatus according to the compensated pointing coordinate;
f) decrementing N by one and determining whether N is equal to zero; and setting a fourth pointing coordinate to be the calibration coordinate and returning to step d) upon determining that N is not equal to zero; wherein the fourth pointing coordinate is computed according to the image position of the reference point formed in a third frame and the second tilt angle; and
correspondingly generating a cursor parameter for controlling a display position of the cursor on a display apparatus.
16. The pointer positioning method according to claim 15, further comprising:
computing the cursor position according to the image position of the reference point formed in the second frame and the second tilt angle, and generating the cursor parameter correspondingly controlling the display position of the cursor on the display apparatus when it is determined that the angle difference between the first and the second tilt angles is smaller than the preset angle, wherein the second frame is captured and generated at a later time than the first frame.
17. The pointer positioning method according to claim 15, wherein the step of computing the cursor position further comprises:
a) when it is determined that N is equal to zero, causing the handheld pointer device to compute the cursor position in the subsequent movement of the handheld pointer device according to the image position of the reference point formed in a succeeding frame and the second tilt angle.
18. The pointer positioning method according to claim 15, wherein the preset angle is set according to the type of a software application executed on the display apparatus.
19. The pointer positioning method according to claim 15, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the image position of the reference point formed in the consecutive frames has not substantially moved.
20. The pointer positioning method according to claim 15, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the pointing coordinates, being computed based on the image position of the reference point formed in multiple consecutive frames using the first tilt angle, have not substantially moved.
21. A pointer positioning method of a handheld pointer device, comprising:
causing the handheld pointer device to update a first tilt angle presently used to a second tilt angle at a first time interval;
causing the handheld pointer device to compute a first pointing coordinate and a second pointing coordinate according to the image position of the reference point formed in a first frame using the first tilt angle and the second tilt angle in the first time interval, respectively;
causing the handheld pointer device to compute a third pointing coordinate according to the image position of the reference point formed in a second frame and the second tilt angle at a second time interval, wherein the second time interval occurs after the first time interval; and
computing the cursor position of a cursor according to the first pointing coordinate, the second pointing coordinate, and the third pointing coordinate to correspondingly generate a cursor parameter for controlling a display position of the cursor on a display apparatus; and computing the cursor position for controlling the display position of the cursor on the display apparatus according to the third pointing coordinate at the second time interval when a first displacement vector between the first and the second pointing coordinates is computed to be less than a first predetermined threshold; wherein the step of computing the cursor position comprises:
a) setting a number of calibrations as N, a compensation vector as C and a calibration coordinate, wherein the calibration coordinate is set as the third pointing coordinate;
b) determining whether the first displacement vector is greater than a second predetermined threshold;
c) setting N equal to the first displacement vector divided by C when it is determined that the first displacement vector is greater than the second predetermined threshold, wherein C is a predetermined compensation value; and setting C equal to the first displacement vector divided by N when it is determined that the first displacement vector is less than the second predetermined threshold, wherein N is a predetermined number of calibrations;
d) computing the sum of the calibration coordinate and C to generate a compensated pointing coordinate;
e) generating the cursor parameter to correspondingly control the display position of the cursor on the display apparatus according to the compensated pointing coordinate;
f) decrementing N by one and determining whether N is equal to zero; and
g) setting a fourth pointing coordinate to be the calibration coordinate and returning to step d) upon determining that N is not equal to zero.
22. The pointer positioning method according to claim 21, further comprising:
computing the first displacement vector between the first and the second pointing coordinates at the first time interval;
generating the compensation vector per unit displacement according to the first displacement vector; and
computing the third pointing coordinate using the compensation vector per unit displacement and the second tilt angle at the second time interval.
23. The pointer positioning method according to claim 21, wherein the first predetermined threshold is set according to the type of a software application executed on the display apparatus.
24. The pointer positioning method according to claim 21, further comprising:
computing the cursor position at the first time interval according to the first pointing coordinate; and
generating the cursor parameter according to the cursor position computed for controlling the display position of the cursor at the first time interval.
25. The pointer positioning method according to claim 21, further comprising:
computing the fourth pointing coordinate at a third time interval according to the image position of the reference point formed in a third frame captured and the second tilt angle; and
generating the cursor parameter according to the fourth pointing coordinate for controlling the display position of the cursor on the display apparatus at the third time interval.
26. The pointer positioning method according to claim 21, wherein the step of computing the cursor position comprises:
determining whether an angle difference computed between the first and the second tilt angles is smaller than a preset angle; and
computing the cursor position according to the third pointing coordinate and generating the cursor parameter for correspondingly controlling the display position of the cursor on the display apparatus at the second time interval upon determining that the angle difference is smaller than the preset angle.
27. The pointer positioning method according to claim 26, wherein the preset angle is set according to the type of a software application executed on the display apparatus.
28. A handheld pointer device, comprising:
an image capturing unit, configured to operatively capture a plurality of images corresponding to the position of a reference point and sequentially generate a plurality of frames;
an accelerometer unit, configured to detect a plurality of accelerations of the handheld pointer device over multiple axes and generate an acceleration vector; and
a processing unit coupled to the image capturing unit and the accelerometer unit, the processing unit configured to operatively compute a cursor position of a cursor according to the image positions of the reference points in the frames and a first tilt angle;
wherein when the processing unit updates the first tilt angle presently used in cursor position computation to a second tilt angle according to the plurality of accelerations detected, the processing unit operatively drives the image capturing unit to capture a first frame containing the reference point to respectively compute a first pointing coordinate and a second pointing coordinate using the first and the second tilt angles in coordination with the first frame, drives the image capturing unit to capture a second frame containing the reference point thereafter, computes the cursor position according to the image position of the reference point in the second frame, the first pointing coordinate, and the second pointing coordinate, and generates a cursor parameter for correspondingly controlling a display position of the cursor on a display apparatus;
wherein the processing unit computes the cursor position according to the image position of the reference point formed in one of the frames and the second tilt angle when the processing unit determines that a first displacement vector between the first and the second pointing coordinates is computed to be less than a first predetermined threshold; and the processing unit generates the first displacement vector according to the first and the second pointing coordinates and generates the cursor parameter for correspondingly controlling the display position of the cursor on the display apparatus according to the image position of the reference point formed in the second frame, the first displacement vector, and the second tilt angle;
wherein the processing unit computes the cursor position by executing the following steps:
a) setting a number of calibrations as N, a compensation vector as C and a calibration coordinate, wherein the calibration coordinate is a third pointing coordinate computed according to the image position of the reference point in the second frame and the second tilt angle;
b) determining whether the first displacement vector is greater than a second predetermined threshold;
c) setting N equal to the first displacement vector divided by C when it is determined that the first displacement vector is greater than the second predetermined threshold, wherein C is a predetermined compensation value; and setting C equal to the first displacement vector divided by N when it is determined that the first displacement vector is less than the second predetermined threshold, wherein N is a predetermined number of calibrations;
d) computing the sum of the calibration coordinate and C to generate a compensated pointing coordinate;
e) generating the cursor parameter to correspondingly control the display position of the cursor on the display apparatus according to the compensated pointing coordinate;
f) decrementing N by one and determining whether N is equal to zero; and
g) setting a fourth pointing coordinate to be the calibration coordinate and returning to step d) upon determining that N is not equal to zero; wherein the fourth pointing coordinate is computed according to the image position of the reference point formed in a third frame and the second tilt angle.
29. The handheld pointer device according to claim 28, wherein the processing unit computes the cursor position according to the image position of the reference point formed in one of the frames and the second tilt angle when the processing unit determines that an angle difference between the first and the second tilt angles is smaller than a preset angle.
30. The handheld pointer device according to claim 28, wherein the step of computing the cursor position further comprises:
h) when it is determined that N is equal to zero, causing the handheld pointer device to compute the cursor position in the subsequent movement of the handheld pointer device according to the image position of the reference point formed in a succeeding frame and the second tilt angle.
31. The handheld pointer device according to claim 30, wherein the processing unit executes the following steps before executing step d):
i) determining whether to cause the handheld pointer device to update the second tilt angle to a third tilt angle during the computation of the compensated pointing coordinate;
j) computing a second displacement vector generated as the handheld pointer device rotates when it is determined that the handheld pointer device has updated the second tilt angle presently used to the third tilt angle; and
k) computing the sum of the calibration coordinate, the second displacement vector and C to generate the compensated pointing coordinate.
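Steps i) through k) extend the calibration loop for the case where the tilt angle updates again (to a third angle) while compensation is still in progress: the displacement caused by that rotation is added on top of the per-step compensation C. A minimal sketch, with all names illustrative:

```python
def compensated_coordinate(calib_coord, c_step, rotation_disp=(0.0, 0.0)):
    """Illustrative sketch of steps i)-k): one calibration step that adds both
    the compensation vector C and any second displacement vector caused by a
    mid-calibration tilt-angle update (zero when no rotation occurred).
    """
    x, y = calib_coord
    cx, cy = c_step             # compensation vector C for this step
    dx, dy = rotation_disp      # second displacement vector from the rotation
    return (x + cx + dx, y + cy + dy)
```

When no further rotation occurs, `rotation_disp` stays at zero and the step reduces to step d) of the base loop.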
32. The handheld pointer device according to claim 28, further comprising:
an input unit coupled to the processing unit, configured to allow a user of the handheld pointer device to set N or C based on a frame capturing rate associated with the reference point.
33. The handheld pointer device according to claim 28, further comprising:
an input unit coupled to the processing unit, configured to allow a user of the handheld pointer device to configure a frame capturing rate for the reference point according to a preset calibration time and to set N based on the frame capturing rate.
34. The handheld pointer device according to claim 28, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the reference point has not substantially moved.
35. The handheld pointer device according to claim 34, wherein the processing unit operatively updates the first tilt angle presently used to the second tilt angle upon determining that the position displacement computed between the image position of the reference point formed in any two consecutive frames captured is less than a predefined displacement threshold.
36. The handheld pointer device according to claim 34, wherein the processing unit operatively updates the first tilt angle presently used to the second tilt angle upon determining that the velocity computed between the image position of the reference point formed in any two consecutive frames captured is less than a predefined velocity threshold.
37. The handheld pointer device according to claim 28, wherein the processing unit operatively updates the first tilt angle presently used to the second tilt angle upon determining that the magnitude of the acceleration vector of the handheld pointer device is equal to a gravitational acceleration of the handheld pointer device.
38. The handheld pointer device according to claim 28, wherein the handheld pointer device operatively updates the first tilt angle presently used to the second tilt angle upon determining that the pointing coordinates computed based on the image position of the reference point formed in multiple consecutive frames using the first tilt angle have not substantially moved.
39. The handheld pointer device according to claim 28, further comprising:
a communication unit, configured to operatively transmit the cursor parameter of the cursor to the display apparatus wirelessly.
40. The handheld pointer device according to claim 28, wherein the accelerometer unit is an accelerometer or a gravitational sensor.
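Throughout the claims, a pointing coordinate is computed "according to the image position of the reference point and the tilt angle". One plausible reading is a roll compensation: the offset of the reference point from the frame centre is rotated by the tilt angle so cursor motion stays level when the device is rolled. The sketch below illustrates that reading; the rotation convention and all names are assumptions, not taken from the patent.

```python
import math

def pointing_coordinate(img_pos, center, tilt_angle_deg):
    """Illustrative tilt compensation: rotate the reference point's offset
    from the frame centre by the negative tilt (roll) angle, yielding a
    pointing coordinate in level display-aligned axes.
    """
    t = math.radians(-tilt_angle_deg)
    dx, dy = img_pos[0] - center[0], img_pos[1] - center[1]
    return (dx * math.cos(t) - dy * math.sin(t),
            dx * math.sin(t) + dy * math.cos(t))
```

Under this reading, a stale tilt angle rotates the offset by the wrong amount, which is why the claims walk the cursor between the first and second pointing coordinates instead of jumping when the angle is refreshed.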
US14/536,769 2013-02-19 2014-11-10 Handheld pointer device and pointer positioning method thereof Active 2033-10-25 US9804689B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/536,769 US9804689B2 (en) 2013-02-19 2014-11-10 Handheld pointer device and pointer positioning method thereof
US15/687,525 US10379627B2 (en) 2013-02-19 2017-08-27 Handheld device and positioning method thereof

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US13/771,072 US20130328772A1 (en) 2012-06-07 2013-02-19 Handheld Pointing Device
TW102144801A 2013-12-06
TW102144801A TWI522848B (en) 2013-12-06 2013-12-06 Pointer device and pointer positioning method thereof
TW102144801 2013-12-06
US14/273,523 US10067576B2 (en) 2013-02-19 2014-05-08 Handheld pointer device and tilt angle adjustment method thereof
US14/536,769 US9804689B2 (en) 2013-02-19 2014-11-10 Handheld pointer device and pointer positioning method thereof

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/771,072 Continuation-In-Part US20130328772A1 (en) 2012-06-07 2013-02-19 Handheld Pointing Device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/687,525 Continuation US10379627B2 (en) 2013-02-19 2017-08-27 Handheld device and positioning method thereof

Publications (2)

Publication Number Publication Date
US20150054745A1 US20150054745A1 (en) 2015-02-26
US9804689B2 true US9804689B2 (en) 2017-10-31

Family

ID=52479898

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/536,769 Active 2033-10-25 US9804689B2 (en) 2013-02-19 2014-11-10 Handheld pointer device and pointer positioning method thereof
US15/687,525 Active 2033-07-04 US10379627B2 (en) 2013-02-19 2017-08-27 Handheld device and positioning method thereof

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/687,525 Active 2033-07-04 US10379627B2 (en) 2013-02-19 2017-08-27 Handheld device and positioning method thereof

Country Status (1)

Country Link
US (2) US9804689B2 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014225108A (en) * 2013-05-16 2014-12-04 ソニー株式会社 Image processing apparatus, image processing method, and program
JP2016110436A (en) * 2014-12-08 2016-06-20 株式会社リコー Image projection device and interactive input/output system
CN106406570A (en) * 2015-07-29 2017-02-15 中兴通讯股份有限公司 Projection cursor control method and device and remote controller
CN107248182A (en) * 2017-06-07 2017-10-13 王征 A kind of system and method for the polygon rendering based on magnet suction device
CN113282222B (en) * 2021-04-23 2024-03-29 海南视联通信技术有限公司 Cursor control method and device
TWI798037B (en) * 2021-08-31 2023-04-01 宏達國際電子股份有限公司 Virtual image display system and calibration method for pointing direction of controller thereof

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1677178A1 (en) 2004-12-29 2006-07-05 STMicroelectronics S.r.l. Pointing device for a computer system with automatic detection of lifting, and relative control method
US7942745B2 (en) 2005-08-22 2011-05-17 Nintendo Co., Ltd. Game operating device
US7852315B2 (en) 2006-04-07 2010-12-14 Microsoft Corporation Camera and acceleration based interface for presentations
JP4918376B2 (en) 2007-02-23 2012-04-18 任天堂株式会社 Information processing program and information processing apparatus
US8237656B2 (en) 2007-07-06 2012-08-07 Microsoft Corporation Multi-axis motion-based remote control
TWI338241B (en) 2007-08-23 2011-03-01 Pixart Imaging Inc Interactive image system, interactive device and operative method thereof
KR101348346B1 (en) 2007-09-06 2014-01-08 삼성전자주식회사 Pointing apparatus, pointer controlling apparatus, pointing method and pointer controlling method
JPWO2009072471A1 (en) 2007-12-07 2011-04-21 ソニー株式会社 Input device, control device, control system, control method, and handheld device
TW200928897A (en) 2007-12-20 2009-07-01 Elan Microelectronics Corp Hand gesture identification method applied to a touch panel
US8010313B2 (en) 2008-06-27 2011-08-30 Movea Sa Hand held pointing device with roll compensation
JP4737296B2 (en) 2009-01-19 2011-07-27 ソニー株式会社 INPUT DEVICE AND METHOD, INFORMATION PROCESSING DEVICE AND METHOD, INFORMATION PROCESSING SYSTEM, AND PROGRAM
EP2392991A1 (en) 2010-06-02 2011-12-07 Fraunhofer-Gesellschaft zur Förderung der Angewandten Forschung e.V. Hand-held pointing device, software cursor control system and method for controlling a movement of a software cursor
TWI552026B (en) 2012-06-07 2016-10-01 原相科技股份有限公司 Hand-held pointing device

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5424756A (en) 1993-05-14 1995-06-13 Ho; Yung-Lung Track pad cursor positioning device and method
US6411278B1 (en) * 1999-03-19 2002-06-25 Mitsubishi Denki Kabushiki Kaisha Coordinated position control system, coordinate position control method, and computer-readable storage medium containing a computer program for coordinate position controlling recorded thereon
US7256772B2 (en) 2003-04-08 2007-08-14 Smart Technologies, Inc. Auto-aligning touch system and method
US20070211027A1 (en) * 2006-03-09 2007-09-13 Nintendo Co., Ltd. Image processing apparatus and storage medium storing image processing program
US20080180396A1 (en) * 2007-01-31 2008-07-31 Pixart Imaging Inc. Control apparatus and method for controlling an image display
CN101388138A (en) 2007-09-12 2009-03-18 原相科技股份有限公司 Interaction image system, interaction apparatus and operation method thereof
CN101398721A (en) 2007-09-26 2009-04-01 昆盈企业股份有限公司 Control method for moving speed of cursor of air mouse
WO2009080653A1 (en) 2007-12-20 2009-07-02 Purple Labs Method and system for moving a cursor and selecting objects on a touchscreen using a finger pointer
EP2225628A1 (en) 2007-12-20 2010-09-08 Myriad France Method and system for moving a cursor and selecting objects on a touchscreen using a finger pointer
TW201122992A (en) 2009-12-31 2011-07-01 Askey Computer Corp Cursor touch-control handheld electronic device
US20130021246A1 (en) * 2011-07-22 2013-01-24 Samsung Electronics Co., Ltd. Input apparatus of display apparatus, display system and control method thereof
TW201305854A (en) 2011-07-26 2013-02-01 Chip Goal Electronics Corp Remote controllable image display system, controller, and processing method therefor
US20130093675A1 (en) * 2011-07-26 2013-04-18 Chip Goal Electronics Corporation, R.O.C. Remote controllable image display system, controller, and processing method therefor

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180356906A1 (en) * 2017-06-13 2018-12-13 Qisda Corporation Cursor Calibration Method by Detecting an Elevation Angle and Cursor Calibration System
US10684702B2 (en) * 2017-06-13 2020-06-16 Qisda Corporation Cursor calibration method by detecting an elevation angle and cursor calibration system

Also Published As

Publication number Publication date
US20150054745A1 (en) 2015-02-26
US20170371425A1 (en) 2017-12-28
US10379627B2 (en) 2019-08-13

Similar Documents

Publication Publication Date Title
US10379627B2 (en) Handheld device and positioning method thereof
EP3343320B1 (en) Information processing apparatus, information processing system, and information processing method
JP5049228B2 (en) Dialogue image system, dialogue apparatus and operation control method thereof
JP6110573B2 (en) Virtual reality system calibration
JP5463790B2 (en) Operation input system, control device, handheld device, and operation input method
US9304607B2 (en) Pointer positioning method of handheld pointer device
KR20210010437A (en) Power management for optical positioning devices
US10007359B2 (en) Navigation trace calibrating method and related optical navigation device
US20160253791A1 (en) Optical distortion compensation
US20110187638A1 (en) Interactive module applied in 3D interactive system and method
US10067576B2 (en) Handheld pointer device and tilt angle adjustment method thereof
US9778763B2 (en) Image projection apparatus, and system employing interactive input-output capability
TWI522848B (en) Pointer device and pointer positioning method thereof
US9626012B2 (en) Method for generating pointer movement value and pointing device using the same
US9354706B2 (en) Storage medium having stored therein information processing program, information processing apparatus, information processing system, and method of calculating designated position
US11061469B2 (en) Head mounted display system and rotation center correcting method thereof
JP6533946B1 (en) Program, information processing apparatus, information processing system, information processing method and head mounted display
WO2021256310A1 (en) Information processing device, terminal device, information processing system, information processing method, and program
US20130194186A1 (en) Pointing device having directional sensor and non-directional sensor, and pointing data input method using it
JP2020047236A (en) Tracking method and tracking system employing the same
TW201445359A (en) Handheld pointer device and tilt angle adjustment method thereof
CN115731286A (en) Virtual image display system and method for correcting pointing direction of controller thereof
WO2023274540A1 (en) Calibrating a transparent wearable display
US20170199587A1 (en) Method for correcting motion sensor-related errors while interacting with mobile or wearable devices
CN117472198A (en) Tracking system and tracking method

Legal Events

Date Code Title Description
AS Assignment

Owner name: PIXART IMAGING INC., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHENG, HAN-PING;HUANG, CHAO-CHIEN;REEL/FRAME:034132/0986

Effective date: 20141105

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.)

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4