CN110968195B - Cursor control method, terminal and storage medium - Google Patents

Cursor control method, terminal and storage medium

Info

Publication number
CN110968195B
CN110968195B
Authority
CN
China
Prior art keywords
user
action
infrared
cursor
infrared signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911199051.0A
Other languages
Chinese (zh)
Other versions
CN110968195A (en)
Inventor
朱日新
欧阳正良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Longtech Smart Control Co ltd
Original Assignee
Shenzhen Longtech Smart Control Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Longtech Smart Control Co ltd filed Critical Shenzhen Longtech Smart Control Co ltd
Priority to CN201911199051.0A priority Critical patent/CN110968195B/en
Publication of CN110968195A publication Critical patent/CN110968195A/en
Application granted granted Critical
Publication of CN110968195B publication Critical patent/CN110968195B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a cursor control method, a terminal and a storage medium. The cursor control method comprises the following steps: detecting the position of a user, and sending a first infrared signal to the user according to the position; receiving a fed-back second infrared signal, and analyzing the user's action information according to the second infrared signal; and obtaining movement coordinates according to the action information, and controlling a cursor on a display screen to move according to the movement coordinates. In the invention, a body sensor acquires the user's limb actions and recognizes and converts them, so as to control cursor movement on the display screen and realize somatosensory positioning and control of the cursor.

Description

Cursor control method, terminal and storage medium
Technical Field
The present invention relates to the field of terminal applications, and in particular, to a cursor control method, a terminal, and a storage medium.
Background
With the development of terminal technology, terminals are applied in a wide range of scenarios to meet the demands of people's daily work. In current terminals, positioning is generally performed with a mouse, which acquires input displacement using an infrared light-emitting semiconductor and a photosensor; the acquired displacement is then processed by a program and reflected on a display to control the movement of the cursor arrow on the display.
Controlling cursor positioning on a display by means of a mouse requires the user to operate the mouse by hand. In some working environments, for example when it is inconvenient for the user to operate a mouse by hand or the user's hands are dirty, this is not conducive to the user's operation.
Accordingly, the prior art still needs improvement and development.
Disclosure of Invention
In view of the above defects in the prior art, the invention provides a cursor control method, a terminal and a storage medium, in which a body sensor acquires a user's limb actions and recognizes and converts them, so as to control cursor movement on a display screen and realize somatosensory positioning and control of the cursor.
The technical solution adopted to solve the above technical problem is as follows:
in a first aspect, the present invention provides a cursor control method, including the following steps:
detecting the position of a user, and sending a first infrared signal to the user according to the position;
receiving a fed-back second infrared signal, and analyzing the user's action information according to the second infrared signal;
and obtaining movement coordinates according to the action information, and controlling a cursor on a display screen to move according to the movement coordinates.
Further, before detecting the position of the user and sending the first infrared signal to the user according to the position, the method further comprises the following step:
presetting an action database for analyzing the somatosensory actions input by the user, wherein the action database is generated according to the somatosensory actions input by the user.
Further, the action database includes facial action data and limb action data.
Further, detecting the position of the user and sending a first infrared signal to the user according to the position specifically includes the following steps:
detecting the position and distance of the user relative to the display screen in real time through a photoelectric sensor;
when the position and distance of the user are detected, simultaneously transmitting a plurality of first infrared signals toward the user's position through an infrared projector, wherein the first infrared signals are pulse signals.
Further, receiving the fed-back second infrared signal and analyzing the user's action information according to the second infrared signal specifically includes the following steps:
receiving the second infrared signal through an infrared sensor;
judging, according to the second infrared signal, whether the action input by the user is complete;
and when the action input by the user is complete, analyzing the user's action information according to the second infrared signal.
Further, when the action input by the user is complete, analyzing the user's action information according to the second infrared signal specifically includes the following steps:
when the action input by the user is complete, acquiring the corresponding action data from the action database;
and calculating a movement proportion according to the distance, and calculating the movement coordinates according to the action data and the movement proportion.
Further, obtaining movement coordinates according to the action information and controlling a cursor on the display screen to move according to the movement coordinates specifically includes the following steps:
acquiring the first coordinate point and the last coordinate point in the movement coordinates according to the action information;
converting the first coordinate point and the last coordinate point into corresponding pulse signals respectively;
and controlling the cursor on the display screen to move according to the pulse signals of the first coordinate point and the last coordinate point.
Further, after the movement coordinates are converted into pulse signals and the cursor on the display screen is controlled to move to the position of the movement coordinates according to the pulse signals, the method further comprises the following step:
controlling the cursor to complete the corresponding operation according to the action information.
In a second aspect, the present invention provides a terminal, comprising a body sensor, a processor and a memory, wherein the body sensor and the memory are respectively connected with the processor;
the body sensor comprises a photoelectric sensor, an infrared projector and an infrared sensor, each of which is electrically connected with the processor; the photoelectric sensor is used for detecting the position and distance of the user relative to the display screen in real time; the infrared projector is used for simultaneously sending a plurality of first infrared signals toward the position where the user is located; and the infrared sensor is used for receiving the fed-back second infrared signal;
the memory stores a somatosensory control program which, when executed by the processor, implements the operations of the cursor control method described in the first aspect.
In a third aspect, the present invention provides a storage medium storing a somatosensory control program which, when executed by a processor, implements the operations of the cursor control method according to the first aspect.
The technical solution adopted by the invention has the following effects:
the invention provides a cursor control method, a terminal and a storage medium, in which a body sensor acquires a user's limb actions and recognizes and converts them, so as to control cursor movement on a display screen and realize somatosensory positioning and control of the cursor.
Drawings
FIG. 1 is a flowchart of a cursor control method according to a preferred embodiment of the present invention.
FIG. 2 is a functional schematic diagram of the terminal of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Example 1
As shown in FIG. 1, FIG. 1 is a flowchart of a cursor control method according to a preferred embodiment of the present invention.
The cursor control method comprises the following steps:
step S100, detecting the position of a user, and sending a first infrared signal to the user according to the position.
In this embodiment, the cursor control method is applied to a terminal that can sense the user's limb motions, control the cursor on its display screen to move, control the cursor to perform corresponding operations, and so on. To this end, the terminal is provided with a body sensor comprising a photoelectric sensor, an infrared projector and an infrared sensor: the photoelectric sensor detects the position and distance of the user relative to the display screen in real time; the infrared projector simultaneously sends a plurality of infrared signals toward the position where the user is located; and the infrared sensor receives the fed-back infrared signals.
In this embodiment, an action database and custom somatosensory actions need to be set up in the terminal in advance. The action database is generated according to the somatosensory actions input by the user: when a user operates the terminal, the somatosensory actions used for somatosensory control operations, such as facial actions and limb actions, can be customized in the terminal; when the customization is completed, the user-defined somatosensory action data, such as the facial action data and the limb action data, are stored in the action database so that the actions input by the user can be analyzed in subsequent use.
Of course, in this embodiment, when a limb action is customized, the correspondence between the limb action and a control operation also needs to be defined. For example, limb actions can be defined to control cursor movement, and facial actions can be defined to control click operations, where the click operations include single-click, double-click, left-click and right-click operations. When the correspondences between limb actions and control operations are defined, the terminal generates and stores a correspondence table between limb actions and control operations, so as to facilitate subsequent analysis of the somatosensory actions input by the user.
That is, before step S100, the method further comprises the following step:
Step 001, presetting an action database for analyzing the somatosensory actions input by the user, wherein the action database is generated according to the somatosensory actions input by the user.
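The embodiment does not specify a data format for the action database or for the correspondence table between actions and control operations. Purely as an illustration, the following Python sketch shows one plausible way to store user-defined somatosensory actions together with the operations they map to; names such as ActionRecord and ACTION_DATABASE are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical record for one user-defined somatosensory action.
@dataclass
class ActionRecord:
    name: str                              # e.g. "swipe_right"
    kind: str                              # "limb" or "facial"
    trajectory: List[Tuple[float, float]]  # normalized sample points of the action
    operation: str                         # control operation this action maps to

# Hypothetical action database: limb actions drive cursor movement,
# facial actions drive click operations, as described in this embodiment.
ACTION_DATABASE = [
    ActionRecord("swipe_right", "limb",
                 [(0.0, 0.5), (0.5, 0.5), (1.0, 0.5)], "move_cursor"),
    ActionRecord("face_left", "facial",
                 [(0.5, 0.5), (0.3, 0.5)], "left_click"),
    ActionRecord("face_right", "facial",
                 [(0.5, 0.5), (0.7, 0.5)], "right_click"),
]

def lookup_operation(action_name: str) -> str:
    """Return the control operation associated with a stored action."""
    for record in ACTION_DATABASE:
        if record.name == action_name:
            return record.operation
    raise KeyError(f"no preset somatosensory action named {action_name!r}")
```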
In this embodiment, once the custom actions have been configured, the terminal can turn on the body sensor and detect in real time, through the photoelectric sensor in the body sensor, the position and distance of the user relative to the display screen. When a user enters the sensing range of the photoelectric sensor, the photoelectric sensor sends an electrical signal to the terminal, from which the terminal can determine the user's position. Meanwhile, the terminal can also calculate, through the photoelectric sensor, the distance of the user's position (the distance between the user and the photoelectric sensor), so as to determine the proportion between the distance moved by the user's somatosensory action and the distance the cursor moves on the display screen.
When the position and distance of the user relative to the display screen are detected, a plurality of first infrared signals are simultaneously sent toward the user's position through the infrared projector, where the first infrared signals are infrared pulse signals. In this embodiment, the infrared projector is disposed opposite the infrared sensor; the infrared projector can emit a plurality of infrared rays, which can be sensed by the infrared sensor. When the user inputs a somatosensory action, part of the infrared rays are blocked and are therefore not received by the infrared sensor.
That is, step S100 specifically includes the following steps:
Step 110, detecting the position and distance of the user relative to the display screen in real time through a photoelectric sensor;
Step 120, when the position and distance of the user are detected, simultaneously transmitting a plurality of first infrared signals toward the user's position through an infrared projector, wherein the first infrared signals are pulse signals.
The invention detects the position and distance of the user relative to the display screen through the photoelectric sensor, so as to determine the proportion between the distance moved by the user's somatosensory action and the distance the cursor moves on the display screen; and when the user's position is detected, a plurality of first infrared signals are sent toward the user through the infrared projector to capture the somatosensory action input by the user, so that the action can be analyzed and the movement coordinates of the cursor determined.
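The embodiment states only that a movement proportion is derived from the user's distance to the display screen; no formula is given. The sketch below assumes, purely for illustration, a linear relation between that distance and the scale applied to the body movement, so that a gesture made further from the screen still produces a usable cursor displacement.

```python
from typing import Tuple

def movement_scale(user_distance_m: float,
                   reference_distance_m: float = 1.0,
                   base_scale_px_per_m: float = 400.0) -> float:
    """Hypothetical mapping from user distance to a pixels-per-metre scale.

    The patent does not define this relation; here we simply assume the
    scale grows linearly with distance.
    """
    if user_distance_m <= 0:
        raise ValueError("distance must be positive")
    return base_scale_px_per_m * (user_distance_m / reference_distance_m)

def body_to_screen_displacement(body_dx_m: float, body_dy_m: float,
                                user_distance_m: float) -> Tuple[float, float]:
    """Convert a body-movement displacement (in metres) into screen pixels."""
    scale = movement_scale(user_distance_m)
    return (body_dx_m * scale, body_dy_m * scale)

# A 0.10 m hand movement by a user standing 2 m away maps to an 80 px cursor move.
print(body_to_screen_displacement(0.10, 0.0, 2.0))  # -> (80.0, 0.0)
```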
Step S200, receiving a fed-back second infrared signal, and analyzing the user's action information according to the second infrared signal.
In this embodiment, when the user inputs a somatosensory action, the fed-back second infrared signals are received through the infrared sensor. Because part of the first infrared signals emitted by the infrared projector are blocked by the user while the somatosensory action is being input, the number of second infrared signals is smaller than the number of first infrared signals. When the user has finished the input and left the input area, the number of second infrared signals equals the number of first infrared signals, and at this point it can be determined that the somatosensory action input by the user is complete. Once the action input is complete, the terminal can analyze the user's action information according to the second infrared signals and convert it into control instructions, so as to control cursor movement on the display screen and carry out the corresponding operation.
That is, step S200 specifically includes the following steps:
Step 210, receiving the second infrared signal through an infrared sensor;
Step 220, judging, according to the second infrared signal, whether the action input by the user is complete;
Step 230, when the action input by the user is complete, analyzing the user's action information according to the second infrared signal.
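Step 220 infers completion from the second infrared signals: while the user's body blocks part of the emitted beams, fewer signals return than were sent, and once every beam returns the action is considered finished. A minimal sketch of that comparison follows; the beam identifiers and their count are assumptions for illustration only.

```python
from typing import Set

# Hypothetical identifiers of the first infrared signals emitted by the projector.
EMITTED_BEAMS: Set[int] = set(range(64))

def action_in_progress(received_beams: Set[int]) -> bool:
    """The action is still being input while some emitted beams are blocked."""
    return len(received_beams) < len(EMITTED_BEAMS)

def action_completed(received_beams: Set[int]) -> bool:
    """All emitted beams are received again once the user leaves the input area."""
    return received_beams == EMITTED_BEAMS
```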
Specifically, when the somatosensory action input by the user is analyzed, the trajectory of the action can be obtained from the second infrared signals. For example, by acquiring the positions of the blocked first infrared rays and the corresponding times, the motion trajectory of the somatosensory action can be reconstructed by simulation. After the trajectory is obtained, the terminal searches the action database and judges whether the input action is a preset somatosensory action; if so, the corresponding action data is obtained from the action database, the movement proportion is calculated according to the user's distance, and the coordinates after the somatosensory movement are calculated from the action data and the movement proportion.
That is, step 230 specifically includes the following steps:
Step 231, when the action input by the user is complete, acquiring the corresponding action data from the action database;
Step 232, calculating a movement proportion according to the distance, and calculating the movement coordinates according to the action data and the movement proportion.
The invention receives the second infrared signals and analyzes the somatosensory action input by the user according to them, thereby obtaining the direction and distance in which the user wants the cursor to move; the movement proportion is calculated from the user's distance, and the distance the cursor should move on the display screen is obtained according to this proportion, so that the terminal can control the cursor to move to the corresponding coordinate position, thereby positioning the cursor's target location.
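The following sketch shows one way the blocked-beam samples could be turned into a trajectory and compared against a preset action. The beam-grid geometry, the sample format and the matching tolerance are all assumptions, since the embodiment only says that the trajectory is obtained by simulation and checked against the action database.

```python
from typing import List, Tuple

# Each sample records the timestamp and grid position of a blocked first infrared ray.
BlockedSample = Tuple[float, int, int]  # (timestamp_s, beam_row, beam_col)
Point = Tuple[float, float]

def reconstruct_trajectory(samples: List[BlockedSample],
                           grid_pitch_m: float = 0.02) -> List[Point]:
    """Turn blocked-beam samples into a time-ordered (x, y) trajectory in metres.

    Assumes the projector emits beams on a regular grid with spacing
    grid_pitch_m; the patent does not specify the projector geometry.
    """
    ordered = sorted(samples)  # tuples sort by timestamp first
    return [(col * grid_pitch_m, row * grid_pitch_m) for _, row, col in ordered]

def matches_preset(trajectory: List[Point], preset: List[Point],
                   tolerance_m: float = 0.05) -> bool:
    """Crude point-by-point comparison against a stored preset trajectory."""
    if len(trajectory) != len(preset):
        return False
    return all(abs(x - px) <= tolerance_m and abs(y - py) <= tolerance_m
               for (x, y), (px, py) in zip(trajectory, preset))
```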
Step S300, obtaining movement coordinates according to the action information, and controlling a cursor on the display screen to move according to the movement coordinates.
In this embodiment, after the movement coordinates corresponding to the somatosensory action are calculated, the coordinates of every point of the movement are available; the first coordinate point and the last coordinate point in the movement coordinates can then be taken, and, with the first coordinate point as the origin, the cursor on the display screen is controlled to move to the position of the last coordinate point, thereby positioning the cursor on the display screen.
Specifically, when the cursor is controlled to move, the first coordinate point and the last coordinate point first need to be converted into corresponding pulse signals. The current position of the cursor on the display screen is then locked and used as the origin; according to the pulse signal of the last coordinate point, the cursor on the display screen is moved to the position of the last coordinate point, and that position is locked until the user inputs the next instruction.
That is, step S300 specifically includes the following steps:
Step 310, acquiring the first coordinate point and the last coordinate point in the movement coordinates according to the action information;
Step 320, converting the first coordinate point and the last coordinate point into corresponding pulse signals respectively;
Step 330, controlling the cursor on the display screen to move according to the pulse signals of the first coordinate point and the last coordinate point.
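The conversion of the first and last coordinate points into pulse signals is described only at this level of abstraction. The sketch below assumes a simple step-driven model in which each pulse moves the cursor by one pixel along one axis; the pulse representation and the one-pixel-per-pulse ratio are illustrative assumptions rather than the patent's concrete encoding.

```python
from typing import List, Tuple

def coordinates_to_pulses(first: Tuple[int, int],
                          last: Tuple[int, int]) -> List[Tuple[int, int]]:
    """Convert the displacement between the first and last coordinate points
    into a list of unit pulses, one per pixel step on each axis."""
    dx = last[0] - first[0]
    dy = last[1] - first[1]
    step_x = 1 if dx >= 0 else -1
    step_y = 1 if dy >= 0 else -1
    pulses = [(step_x, 0) for _ in range(abs(dx))]
    pulses += [(0, step_y) for _ in range(abs(dy))]
    return pulses

def move_cursor(origin: Tuple[int, int],
                pulses: List[Tuple[int, int]]) -> Tuple[int, int]:
    """Apply the pulses from the locked origin (the first coordinate point)
    and return the final cursor position (the last coordinate point)."""
    x, y = origin
    for px, py in pulses:
        x += px
        y += py
    return (x, y)

# Moving from the first point (0, 0) to the last point (120, 40) leaves the
# cursor locked at (120, 40) until the next instruction is input.
assert move_cursor((0, 0), coordinates_to_pulses((0, 0), (120, 40))) == (120, 40)
```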
In this embodiment, after the moved cursor is locked, further somatosensory actions can be input to continue controlling the cursor. For example, the movement of the cursor is controlled by limb actions, with the movement direction and position of the cursor following the direction and position of the limb action; furthermore, the cursor can be controlled to perform click operations through facial actions. For example, moving the face toward one direction may control the cursor to perform a left-click operation, moving the face toward the other direction may control the cursor to perform a right-click operation, and so on.
That is, after step 330, the method further comprises the following step:
Step 340, controlling the cursor to complete the corresponding operation according to the action information.
The invention acquires the first coordinate point and the last coordinate point in the motion trajectory of the action and converts these two coordinate points into corresponding pulse signals, thereby controlling the cursor on the display screen to move; after the cursor has moved, it can be further controlled through the input action information to perform the corresponding click operation. Controlling the cursor on the display screen through somatosensory actions replaces mouse-based control, positions the cursor quickly, and facilitates the user's operation.
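Finally, a sketch of the facial-action-to-click mapping mentioned above. The direction labels and the dispatch function are illustrative only, since the embodiment leaves the concrete mapping to user customization.

```python
# Hypothetical mapping from a recognized facial action to a click operation.
FACIAL_ACTION_TO_CLICK = {
    "face_left": "left_click",
    "face_right": "right_click",
    "face_nod_twice": "double_click",
}

def dispatch_click(facial_action: str) -> str:
    """Return the click operation the cursor should perform for a facial action."""
    operation = FACIAL_ACTION_TO_CLICK.get(facial_action)
    if operation is None:
        raise ValueError(f"unrecognized facial action: {facial_action!r}")
    return operation

print(dispatch_click("face_left"))  # -> left_click
```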
Example 2
As shown in FIG. 2, FIG. 2 is a functional schematic diagram of the terminal of the present invention.
This embodiment provides a terminal comprising a body sensor 10, a processor 20 and a memory 30, wherein the body sensor 10 and the memory 30 are respectively connected with the processor 20;
the body sensor 10 comprises a photoelectric sensor 11, an infrared projector 12 and an infrared sensor 13, each of which is electrically connected with the processor 20; the photoelectric sensor 11 is used for detecting the position and distance of the user relative to the display screen in real time; the infrared projector 12 is configured to simultaneously send a plurality of first infrared signals toward the position where the user is located; and the infrared sensor 13 is configured to receive the fed-back second infrared signal;
the memory 30 stores a somatosensory control program which, when executed by the processor, implements the operations of the cursor control method according to Example 1 above, as described in detail above.
Example 3
This embodiment provides a storage medium storing a somatosensory control program which, when executed by a processor, implements the operations of the cursor control method according to Example 1 above, as described in detail above.
In summary, the invention provides a cursor control method, a terminal and a storage medium, which acquire a user's limb actions through a body sensor and recognize and convert them, so as to control the movement of a cursor on a display screen and realize somatosensory positioning and control of the cursor.
Of course, those skilled in the art will appreciate that all or part of the above methods may be implemented by a computer program instructing relevant hardware (such as a processor or controller). The program may be stored in a computer-readable storage medium, and when executed may include the steps of the above method embodiments. The storage medium may be a memory, a magnetic disk, an optical disk, or the like.
It should be understood that the invention is not limited to the examples described above; those skilled in the art can make modifications and variations in light of the above teachings, and all such modifications and variations are intended to fall within the scope of the appended claims.

Claims (6)

1. A cursor control method, comprising the steps of:
detecting the position of a user, and sending a first infrared signal to the user according to the position;
receiving a fed-back second infrared signal, and analyzing action information of the user according to the second infrared signal;
obtaining movement coordinates according to the action information, and controlling a cursor in a display screen to move according to the movement coordinates;
the method comprises the following steps before detecting the position of the user and sending a first infrared signal to the user according to the position:
presetting an action database for analyzing somatosensory actions input by a user, wherein the action database is a database generated according to the somatosensory actions input by the user;
the action database comprises facial action data and limb action data;
the detecting the azimuth of the user, and sending a first infrared signal to the user according to the azimuth specifically comprises the following steps:
detecting the position and the distance of the user relative to the display screen in real time through a photoelectric sensor;
when the position and the distance of the user are detected, simultaneously transmitting a plurality of first infrared signals to the position of the user through an infrared projector; wherein the first infrared signal is a pulse signal;
the step of receiving the fed-back second infrared signal and analyzing the action information of the user according to the second infrared signal specifically comprises the following steps:
receiving the second infrared signal through an infrared sensor;
judging, according to the second infrared signal, whether the action input by the user is complete;
when the action input by the user is completed, analyzing the action information of the user according to the second infrared signal;
acquiring the positions of the blocked first infrared rays and the corresponding times, and obtaining the motion trajectory of the somatosensory action input by the user through simulation; searching the action database, and judging whether the somatosensory action input by the user is a preset somatosensory action; if so, obtaining the corresponding action data from the action database, calculating a movement proportion according to the distance of the user, and calculating, according to the action data and the movement proportion, the coordinates after the corresponding somatosensory action has moved.
2. The cursor control method according to claim 1, wherein, when the action input by the user is complete, analyzing the action information of the user according to the second infrared signal specifically includes the following steps:
when the action input by the user is completed, acquiring corresponding action data in the action database;
and calculating a movement proportion according to the distance, and calculating movement coordinates according to the action data and the movement proportion.
3. The cursor control method according to claim 2, wherein obtaining movement coordinates according to the action information and controlling the cursor in the display screen to move according to the movement coordinates specifically includes the following steps:
acquiring a first coordinate point and a last coordinate point in the moving coordinates according to the action information; converting the first coordinate point and the last coordinate point into corresponding pulse signals respectively;
and controlling a cursor in the display screen to move according to the pulse signals of the first coordinate point and the last coordinate point.
4. The cursor control method according to claim 3, wherein, after the movement coordinates are converted into pulse signals and the cursor in the display screen is controlled to move to the position of the movement coordinates according to the pulse signals, the method further comprises the following step:
and controlling the cursor to finish corresponding operation according to the action information.
5. A terminal, comprising a body sensor, a processor and a memory, wherein the body sensor and the memory are respectively connected with the processor;
the body sensor comprises a photoelectric sensor, an infrared projector and an infrared sensor, each of which is electrically connected with the processor; the photoelectric sensor is used for detecting the position and distance of the user relative to the display screen in real time; the infrared projector is used for simultaneously sending a plurality of first infrared signals toward the position where the user is located; and the infrared sensor is used for receiving the fed-back second infrared signal;
the memory stores a somatosensory control program which, when executed by the processor, is adapted to carry out the cursor control method of any one of claims 1 to 4.
6. A storage medium storing a somatosensory control program which, when executed by a processor, is adapted to carry out the cursor control method according to any one of claims 1-4.
CN201911199051.0A 2019-11-29 2019-11-29 Cursor control method, terminal and storage medium Active CN110968195B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911199051.0A CN110968195B (en) 2019-11-29 2019-11-29 Cursor control method, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911199051.0A CN110968195B (en) 2019-11-29 2019-11-29 Cursor control method, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110968195A CN110968195A (en) 2020-04-07
CN110968195B 2023-07-25

Family

ID=70032066

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911199051.0A Active CN110968195B (en) 2019-11-29 2019-11-29 Cursor control method, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110968195B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1528460A1 (en) * 2003-10-30 2005-05-04 GIRAUD, Pierre Method and system for generating cursor control data in relation to a users input action
CN102981641A (en) * 2011-09-02 2013-03-20 联想(北京)有限公司 Input device and electronic device and method of controlling cursor movement
CN107831920A (en) * 2017-10-20 2018-03-23 广州视睿电子科技有限公司 Cursor movement display methods, device, mobile terminal and storage medium
CN109597544A (en) * 2018-11-23 2019-04-09 青岛海信电器股份有限公司 Input exchange method, device, equipment and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200512652A (en) * 2003-09-26 2005-04-01 Jia-Zhang Hu Cursor simulator using limbs to control cursor and method for simulating the same
US20060022942A1 (en) * 2004-07-30 2006-02-02 Po-Chi Lin Control method for operating a computer cursor instinctively and the apparatus thereof
CN101618277A (en) * 2008-07-04 2010-01-06 英属维京群岛速位互动股份有限公司 Electronic game manipulating device capable of sensing body figure of user and method
DE102012208748B4 (en) * 2011-06-21 2023-07-13 International Business Machines Corporation Method and system for remote control of functions of a mouse pointer of a computer unit
CN103150020A (en) * 2013-03-14 2013-06-12 上海电机学院 Three-dimensional finger control operation method and system
CN104598154A (en) * 2015-02-02 2015-05-06 上海翰临电子科技有限公司 Intelligent wearing equipment interface switching control method based on infrared induction
CN105511607B (en) * 2015-11-30 2018-10-02 四川长虹电器股份有限公司 Three-dimensional human-computer interaction device, method and system
CN205540571U (en) * 2016-02-26 2016-08-31 北京金锐德路科技有限公司 Head -mounted electronic equipment
CN108491152B (en) * 2018-02-11 2020-11-10 李帆 Touch screen terminal control method, terminal and medium based on virtual cursor
CN108415562A (en) * 2018-02-12 2018-08-17 四川斐讯信息技术有限公司 A kind of cursor control method and cursor control system


Also Published As

Publication number Publication date
CN110968195A (en) 2020-04-07

Similar Documents

Publication Publication Date Title
US9477324B2 (en) Gesture processing
US9996160B2 (en) Method and apparatus for gesture detection and display control
US20040085522A1 (en) Display system with interpretable pattern detection
CN110796259A (en) Method and system for determining peripheral device displacement
CN103440033A (en) Method and device for achieving man-machine interaction based on bare hand and monocular camera
WO2014146516A1 (en) Interactive device and method for left and right hands
US8749488B2 (en) Apparatus and method for providing contactless graphic user interface
WO2021056428A1 (en) Intelligent terminal, control system, and method for interaction with mobile robot
CN110968195B (en) Cursor control method, terminal and storage medium
US8354996B2 (en) Electronic reading apparatus and input method thereof
CN102541417B (en) Multi-object tracking method and system in virtual touch screen system
CN104699279A (en) Displacement detection device with no hovering function and computer system including the same
CN112181132A (en) Model evaluation method and system based on ray interaction task in virtual environment
CN109413400A (en) A kind of projection process method and device
Kim et al. Visual multi-touch air interface for barehanded users by skeleton models of hand regions
US20140111428A1 (en) Remote control system and method for computer
CN116301551A (en) Touch identification method, touch identification device, electronic equipment and medium
US20230168744A1 (en) Information processing apparatus and information processing method based on input operation by user, and computer program for executing the method
CN101414224B (en) Apparatus and method for defining writing position
CN104423560A (en) Information processing method and electronic equipment
CN113347564A (en) Control signal generation method and system based on UWB positioning system
Jaiswal et al. Creative exploration of scaled product family 3D models using gesture based conceptual computer aided design (C-CAD) tool
KR101558094B1 (en) Multi-modal system using for intuitive hand motion and control method thereof
WO2001046941A1 (en) Method and apparatus for vision-based coupling between pointer actions and projected images
KR102300290B1 (en) Smart mouse that works in conjunction with finger movement using camera and method for controlling mouse cursor using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant