CN117631855A - Multifunctional mouse based on Internet of things and control method thereof - Google Patents

Multifunctional mouse based on Internet of things and control method thereof

Info

Publication number
CN117631855A
Authority
CN
China
Prior art keywords
mouse
eye focus
determining
eye
moving average
Prior art date: 2023-11-30
Legal status
Pending
Application number
CN202311630091.2A
Other languages
Chinese (zh)
Inventor
沈欣
Current Assignee
Zhenjiang Lingyucube Intelligent Equipment Co ltd
Original Assignee
Zhenjiang Lingyucube Intelligent Equipment Co ltd
Priority date: 2023-11-30
Filing date: 2023-11-30
Publication date: 2024-03-01
Application filed by Zhenjiang Lingyucube Intelligent Equipment Co ltd
Priority to CN202311630091.2A
Publication of CN117631855A
Current legal status: Pending

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354: Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03543: Mice or pucks
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention provides a multifunctional mouse based on the Internet of things. The multifunctional mouse provided by the embodiments of the application comprises: a mouse body, wherein the mouse body comprises a WIFI communication module, a wireless charging module and a battery, and the wireless charging module is electrically connected with the battery; a WIFI relay device; and an eye tracker, wherein the mouse body is in communication connection with the eye tracker through the WIFI relay device. The application also provides a control method for the multifunctional mouse based on the Internet of things: when the eye tracking device is used to control the mouse cursor, the tracking of the eye focus can be compensated and corrected, so that inaccurate positioning of the mouse cursor caused by overly fast eye movement is avoided.

Description

Multifunctional mouse based on Internet of things and control method thereof
Technical Field
The application relates to the technical field of computers, in particular to a multifunctional mouse based on the Internet of things and a control method thereof.
Background
With the advancement of science and technology, electronic devices such as computers have become indispensable tools in daily life. People interact with electronic devices in many ways; for example, a cursor on an electronic device can be controlled through a mouse to interact with the system. A conventional mouse controls the movement of the mouse cursor optically, that is, through hand movement, which can be inconvenient for users with limited hand mobility.
Eye tracking devices have been proposed in the prior art to control the position of the mouse cursor. However, eye movement consists mainly of saccades (jumps) and fixations (gazing), so the tracking of the eye focus requires compensation and correction.
Disclosure of Invention
The embodiments of the application provide a multifunctional mouse based on the Internet of things, so as to address the above technical problems.
In a first aspect, an embodiment of the present application provides a multifunctional mouse, comprising: a mouse body, wherein the mouse body comprises a WIFI communication module, a wireless charging module and a battery, and the wireless charging module is electrically connected with the battery; a WIFI relay device; and an eye tracker, wherein the mouse body is in communication connection with the eye tracker through the WIFI relay device.
In some embodiments, the mouse body further comprises:
the displacement state acquisition module is used for acquiring an eye focus displacement state through the eye tracker; the time acquisition module is used for acquiring the movement time t1 of the eye focus; the average speed confirming module is used for determining the moving average speed of the eye focus according to the running time of the eye focus and the displacement state; and the cursor confirmation module is used for determining the position of the mouse cursor according to the moving average speed of the eye focus.
It is understood that the displacement state may include a displacement distance. The displacement distance may be the movement distance of the mouse cursor displayed on the display. It will be appreciated that the eye tracker can acquire, within a unit of time, information about the eye focus such as its displacement speed and displacement distance.
In a second aspect, the present application proposes a method for controlling a multifunctional mouse, which is applicable to the multifunctional mouse as described in any one of the above, and the method includes: acquiring an eye focus displacement state through the eye tracker; acquiring the movement time t1 of the eye focus; determining a moving average speed of the eye focus according to the running time of the eye focus and the displacement state; and determining the position of the mouse cursor according to the moving average speed of the eye focus.
It will be appreciated that t1 may be a unit time in this embodiment. During operation of the multifunctional mouse, a plurality of unit times t1 can be linked in sequence, that is, after one unit time t1 ends, the next unit time t1 begins, so that the mouse cursor can be controlled over a longer period.
In some embodiments, determining the position of the mouse cursor from the moving average speed of the eye focus comprises: acquiring a first position of the eye focus; acquiring a second position of the eye focus, wherein the second position is the position of the eye focus after the movement time t1; determining a target position according to the first position and the second position, wherein the target position is the midpoint between the first position and the second position; and if the moving average speed of the eye focus is greater than a preset threshold v1, determining the position of the mouse cursor as the target position.
In some embodiments, the method further includes determining the position of the mouse cursor as the second position if the moving average speed of the eye focus is less than or equal to the preset threshold v1.
It will be appreciated that the moving average speed of the eye focus is determined by the movement time of the eye focus and the displacement state. The average speed may be the total displacement distance divided by the length of time of the displacement. In some embodiments, the method further comprises: determining a displacement distance from the displacement state; and determining the moving average speed according to the displacement distance and the movement time t1.
In some embodiments, determining the position of the mouse cursor from the moving average speed of the eye focus comprises: acquiring a first position of the eye focus; acquiring a second position of the eye focus, wherein the second position is the position of the eye focus after the movement time t2; acquiring a third position of the eye focus, wherein the third position is the position of the eye focus after the movement time t3, and the movement time t1 is equal to the sum of the movement time t2 and the movement time t3; determining a target position according to the first position, the second position and the third position, wherein the target position is the geometric center of the figure formed by connecting the first position, the second position and the third position; and if the moving average speed of the eye focus is greater than a preset threshold v2, determining the position of the mouse cursor as the target position.
The method further comprises: if the moving average speed of the eye focus is less than or equal to the preset threshold v2, determining the position of the mouse cursor as the third position.
In a third aspect, the present application proposes a computer readable storage medium storing a computer program, wherein the computer program, when loaded and executed by a processor, implements a method according to any of the preceding claims.
In a fourth aspect, the present application proposes an electronic device, including a processor and a memory, wherein the memory is configured to store a computer program; the processor is configured to load and execute the computer program to implement the method according to any one of the preceding claims.
The multifunctional mouse based on the Internet of things and its control method provided by the embodiments of the application comprise a mouse body, wherein the mouse body comprises a WIFI communication module, a wireless charging module and a battery, and the wireless charging module is electrically connected with the battery; a WIFI relay device; and an eye tracker, wherein the mouse body is in communication connection with the eye tracker through the WIFI relay device. The eye focus displacement state is acquired through the eye tracker; the movement time t1 of the eye focus is acquired; the moving average speed of the eye focus is determined according to the movement time of the eye focus and the displacement state; and the position of the mouse cursor is determined according to the moving average speed of the eye focus. According to this mouse control method based on the Internet of things, when the eye tracking device is used to control the mouse cursor, the tracking of the eye focus can be compensated and corrected, avoiding inaccurate positioning of the mouse cursor caused by overly fast eye movement.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that a person skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of a mouse control method according to an embodiment of the present application;
Fig. 2 is a block diagram of a multifunctional mouse according to an embodiment of the present application;
Fig. 3 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the present application, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without inventive effort shall fall within the protection scope of the present application.
In this application, the terms "mounted," "connected," "secured," and the like are to be construed broadly unless otherwise specifically indicated or defined. For example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection, an indirect connection via an intermediate medium, an internal communication between two elements, or surface contact only. The specific meaning of these terms in this application will be understood by those of ordinary skill in the art according to the specific circumstances.
In the embodiments of the present application, words such as "exemplary" and "for example" are used to indicate an example, an illustration, or a description. Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of such words is intended to present concepts in a concrete fashion. Furthermore, in the embodiments of the present application, "and/or" may mean both of the related items or either one of them.
Computers have become an indispensable tool in life today. People interact with electronic devices in many ways. In the related art, for example, an eye tracking device is used to control the position of the mouse cursor; however, eye movement consists mainly of saccades and fixations, so compensation and correction are needed in the tracking of the eye focus.
To this end, the present application provides a solution in which the eye focus displacement state is acquired through the eye tracker; the movement time t1 of the eye focus is acquired; the moving average speed of the eye focus is determined according to the movement time of the eye focus and the displacement state; and the position of the mouse cursor is determined according to the moving average speed of the eye focus.
The embodiment of the application provides a multifunctional mouse, comprising: a mouse body, wherein the mouse body comprises a WIFI communication module, a wireless charging module and a battery, and the wireless charging module is electrically connected with the battery; a WIFI relay device; and an eye tracker, wherein the mouse body is in communication connection with the eye tracker through the WIFI relay device.
It can be appreciated that the mouse body may be a wireless mouse, and a WIFI module may be electrically connected inside the mouse body. The WIFI relay device may be arranged inside the mouse body or may be a structure independent of the mouse body, which is not limited herein. The WIFI relay device can extend and bridge the WIFI coverage; its specific structure and mode of operation are common in the related art and are not limited herein. The WIFI signal can be relayed through the WIFI relay device, thereby extending the range of the WIFI router and enabling the multifunctional mouse.
The following further describes the aspects of the present application with reference to the accompanying drawings.
Referring to fig. 1, an embodiment of the present application provides a mouse control method applied to a multifunctional mouse, where the multifunctional mouse includes an eye tracker, and the method includes steps S101 to S104.
S101, acquiring an eye focus displacement state through the eye tracker.
Illustratively, in this embodiment, the displacement state may include a displacement distance. The displacement distance may be the movement distance of the mouse cursor displayed on the display. It will be appreciated that the eye tracker can acquire, within a unit of time, information about the eye focus such as its displacement speed and displacement distance.
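A minimal sketch of how such a per-sample displacement state might be represented is given below; the field names, the units (display pixels and seconds) and the Python representation are assumptions made for illustration, not part of the disclosure.
```python
from dataclasses import dataclass


@dataclass
class GazeSample:
    """One eye-focus reading reported by the eye tracker (assumed fields).

    x, y        -- eye-focus position mapped to display coordinates, in pixels
    timestamp_s -- time at which the sample was taken, in seconds
    """
    x: float
    y: float
    timestamp_s: float


def displacement_distance(start: GazeSample, end: GazeSample) -> float:
    """Straight-line displacement distance between two samples, in pixels."""
    return ((end.x - start.x) ** 2 + (end.y - start.y) ** 2) ** 0.5
```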
S102, acquiring the movement time t1 of the eye focus.
It will be appreciated that t1 may be a unit time in this embodiment. During operation of the multifunctional mouse, a plurality of unit times t1 can be linked in sequence, that is, after one unit time t1 ends, the next unit time t1 begins, so that the mouse cursor can be controlled over a longer period.
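As a sketch of how consecutive unit times t1 might be chained, the gaze stream can be read in back-to-back windows of length t1; the window length, the callable interfaces and the use of Python here are illustrative assumptions only.
```python
import time

T1_SECONDS = 0.1  # assumed length of one unit time t1


def run_unit_time_windows(read_gaze_sample, handle_window, duration_s: float = 5.0) -> None:
    """Process the gaze stream in consecutive unit-time windows of length t1.

    read_gaze_sample -- callable returning the current eye-focus sample (assumed interface)
    handle_window    -- callable taking (start_sample, end_sample) for one window
    """
    deadline = time.monotonic() + duration_s
    start = read_gaze_sample()          # position at the start of the first window
    while time.monotonic() < deadline:
        time.sleep(T1_SECONDS)          # wait one unit time t1
        end = read_gaze_sample()        # position at the end of this window
        handle_window(start, end)       # compensate/correct the cursor for this window
        start = end                     # the next window begins where this one ended
```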
S103, determining the moving average speed of the eye focus according to the running time of the eye focus and the displacement state.
It will be appreciated that the moving average speed of the eye focus is determined by the movement time of the eye focus and the displacement state. The average speed may be the total displacement distance divided by the length of time of the displacement. In some embodiments, the method further comprises: determining a displacement distance from the displacement state; and determining the moving average speed according to the displacement distance and the movement time t1.
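In other words, the moving average speed over one unit time is simply the displacement distance divided by t1; a one-function sketch (units of pixels and seconds are assumed):
```python
def moving_average_speed(displacement_distance_px: float, t1_s: float) -> float:
    """Moving average speed of the eye focus: total displacement distance / movement time t1."""
    if t1_s <= 0:
        raise ValueError("movement time t1 must be positive")
    return displacement_distance_px / t1_s
```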
S104, determining the position of the mouse cursor according to the moving average speed of the eye focus.
When the eye tracking device is used to control the mouse cursor, the tracking of the eye focus can be compensated and corrected, avoiding inaccurate positioning of the mouse cursor caused by overly fast eye movement.
Specifically, in this embodiment, the determining the position of the mouse cursor according to the moving average speed of the eye focus may include the following steps S201 to S204:
S201, acquiring a first position of the eye focus.
It will be appreciated that the first position of the eye focus is the initial position at the beginning of the unit time t1.
S202, acquiring a second position of the eye focus, wherein the second position is the position of the eye focus after the movement time t1.
in this embodiment, only a straight line is taken as an example in the process of moving the locus of the eye from the first position to the second position. In this way, small tremors of the eye can be excluded. In some embodiments, determining the position of the mouse cursor according to the moving average speed of the eye focus may further include: acquiring a first position of an eye focus; acquiring a second position of an eye focus, wherein the second position is a position of the mouse icon after the movement time t 2; acquiring a third position of an eye focus, wherein the third position is a position of the mouse icon after the movement time t3, and the movement time t1 is equal to the sum of the movement time t2 and the movement time t 3; determining a target position according to the first position, the second position and the third position, wherein the target position is a geometric center formed by connecting lines of the first position, the second position and the third position; and if the moving average speed of the eye focus is greater than a preset threshold v2, determining the position of the mouse cursor as the target position.
S203, determining a target position according to the first position and the second position, wherein the target position is a midpoint between the first position and the second position;
since the first position and the second position are in a straight line connection, the midpoint between the first position and the second position may be taken as the target position in the present embodiment.
S204, if the moving average speed of the eye focus is greater than a preset threshold v1, determining the position of the mouse cursor as the target position.
In this embodiment, the preset threshold v1 is a specific speed. When the eye undergoes a tremor or a large-amplitude displacement, the moving average speed of the eye focus becomes correspondingly large and exceeds the preset threshold. In this case, the target position is used as the cursor position. In this way, when the eye moves rapidly, the cursor moves in a damped manner, avoiding inaccurate pointing caused by an excessive cursor movement speed.
In some embodiments, step S205 is also included.
S205, if the moving average speed of the eye focus is smaller than or equal to the preset threshold v1, determining the position of the mouse cursor as the second position.
When the moving average speed is less than or equal to the preset threshold, the mouse is in its normal operating state. In this state, the cursor simply follows the user's eye focus at all times: as the eye focus moves to the second position, the mouse cursor correspondingly moves to the second position.
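Putting steps S201 to S205 together, a minimal sketch of this two-position compensation rule might look as follows; coordinates in display pixels, times in seconds and the example threshold value are assumptions.
```python
from typing import Tuple

Point = Tuple[float, float]


def compensate_two_point(first: Point, second: Point, t1_s: float,
                         v1_px_per_s: float) -> Point:
    """Return the corrected mouse-cursor position for one unit time t1.

    If the moving average speed of the eye focus exceeds the threshold v1, the cursor
    is placed at the midpoint of the first and second positions (steps S203-S204);
    otherwise it follows the eye focus to the second position (step S205).
    """
    dx, dy = second[0] - first[0], second[1] - first[1]
    distance = (dx * dx + dy * dy) ** 0.5
    avg_speed = distance / t1_s
    if avg_speed > v1_px_per_s:
        return ((first[0] + second[0]) / 2.0, (first[1] + second[1]) / 2.0)
    return second


# Example: an 800 px jump in 0.1 s against an assumed threshold of 2000 px/s.
# The average speed is 8000 px/s > 2000 px/s, so the cursor lands on the midpoint.
print(compensate_two_point((100.0, 100.0), (900.0, 100.0), 0.1, 2000.0))  # (500.0, 100.0)
```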
In some embodiments, before determining the position of the mouse cursor as the target position when the moving average speed of the eye focus is greater than the preset threshold v1, the method further includes: acquiring the preset threshold v1.
It will be appreciated that the preset threshold v1 may be an adjustable threshold, adjusted according to the user's specific use. If the user needs to control the mouse cursor to move quickly, for example in a shooting game, the threshold can be reduced so that the mouse cursor tracks the movement of the user's eye focus at all times. When the user performs operations such as text reading, the threshold v1 can be correspondingly increased, so as to prevent the cursor from moving too quickly.
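As a small illustration, the threshold v1 could be looked up from a usage profile before each session; the profile names and numeric values below are purely illustrative assumptions and are not taken from the disclosure.
```python
# Illustrative per-scenario thresholds for v1, in px/s (values are assumptions).
V1_PROFILES = {
    "shooting_game": 1200.0,  # reduced threshold for fast-paced use, as suggested above
    "text_reading": 4000.0,   # increased threshold for reading, as suggested above
}


def get_threshold_v1(profile: str, default_px_per_s: float = 2000.0) -> float:
    """Return the preset threshold v1 for the given usage profile."""
    return V1_PROFILES.get(profile, default_px_per_s)
```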
In some embodiments, determining the position of the mouse cursor from the moving average speed of the eye focus comprises: acquiring a first position of the eye focus; acquiring a second position of the eye focus, wherein the second position is the position of the eye focus after the movement time t2; acquiring a third position of the eye focus, wherein the third position is the position of the eye focus after the movement time t3, and the movement time t1 is equal to the sum of the movement time t2 and the movement time t3; determining a target position according to the first position, the second position and the third position, wherein the target position is the geometric center of the figure formed by connecting the first position, the second position and the third position; and if the moving average speed of the eye focus is greater than a preset threshold v2, determining the position of the mouse cursor as the target position.
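A corresponding sketch of this three-position variant is given below. The centroid of the three positions is used as the geometric center, and the average speed is computed over the path from the first to the second to the third position; that path-based reading of the displacement, along with the units and threshold value, is an assumption made for illustration.
```python
from typing import Tuple

Point = Tuple[float, float]


def compensate_three_point(first: Point, second: Point, third: Point,
                           t1_s: float, v2_px_per_s: float) -> Point:
    """Three-position variant over one unit time t1 (t1 = t2 + t3).

    If the moving average speed of the eye focus exceeds the threshold v2, the cursor
    is placed at the geometric center (centroid) of the three positions; otherwise it
    follows the eye focus to the third position.
    """
    def dist(a: Point, b: Point) -> float:
        return ((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5

    # Displacement taken as the path length first -> second -> third (assumed reading).
    avg_speed = (dist(first, second) + dist(second, third)) / t1_s
    if avg_speed > v2_px_per_s:
        cx = (first[0] + second[0] + third[0]) / 3.0
        cy = (first[1] + second[1] + third[1]) / 3.0
        return (cx, cy)
    return third
```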
The multifunctional mouse based on the Internet of things acquires the eye focus displacement state through the eye tracker; acquires the movement time t1 of the eye focus; determines the moving average speed of the eye focus according to the movement time of the eye focus and the displacement state; and determines the position of the mouse cursor according to the moving average speed of the eye focus. With this mouse control method, when the eye tracking device is used to control the mouse cursor, the tracking of the eye focus can be compensated and corrected, avoiding inaccurate positioning of the mouse cursor caused by overly fast eye movement.
The application further provides a multifunctional mouse 100; referring to fig. 2, it comprises:
a displacement state acquisition module 200, wherein the displacement state acquisition module 200 is configured to acquire an eye focus displacement state through the eye tracker;
optionally, in some embodiments, the displacement state obtaining module 200 may further be configured to determine a displacement distance according to the displacement state; and determining a moving average speed according to the displacement distance and the moving time t1.
A time acquisition module 300, wherein the time acquisition module 300 is configured to acquire the movement time t1 of the eye focus. In this embodiment, the displacement state may include a displacement distance. The displacement distance may be the movement distance of the mouse cursor displayed on the display. It will be appreciated that the eye tracker can acquire, within a unit of time, information about the eye focus such as its displacement speed and displacement distance.
An average speed confirmation module 400, wherein the average speed confirmation module 400 is used for determining the moving average speed of the eye focus according to the running time of the eye focus and the displacement state.
It will be appreciated that t1 may be a unit time in this embodiment. During operation of the multifunctional mouse 100, a plurality of unit times t1 can be linked in sequence, that is, after one unit time t1 ends, the next unit time t1 begins, so that the mouse cursor can be controlled over a longer period.
A cursor confirmation module 500, wherein the cursor confirmation module 500 is used for determining the position of the mouse cursor according to the moving average speed of the eye focus. It will be appreciated that the moving average speed of the eye focus is determined by the movement time of the eye focus and the displacement state. The average speed may be the total displacement distance divided by the length of time of the displacement. In some embodiments, a displacement distance is further determined from the displacement state, and the moving average speed is determined according to the displacement distance and the movement time t1.
Optionally, in some embodiments, determining the position of the mouse cursor from the moving average speed of the eye focus comprises: acquiring a first position of the eye focus; acquiring a second position of the eye focus, wherein the second position is the position of the eye focus after the movement time t2; acquiring a third position of the eye focus, wherein the third position is the position of the eye focus after the movement time t3, and the movement time t1 is equal to the sum of the movement time t2 and the movement time t3; determining a target position according to the first position, the second position and the third position, wherein the target position is the geometric center of the figure formed by connecting the first position, the second position and the third position; and if the moving average speed of the eye focus is greater than a preset threshold v2, determining the position of the mouse cursor as the target position.
Optionally, the cursor confirmation module may be further configured to obtain a first position of the eye focus.
Optionally, the cursor confirmation module may be further configured to obtain a second position of the eye focus, where the second position is a position of the mouse icon after the movement time t1.
Optionally, the cursor confirmation module may be further configured to determine a target position according to the first position and the second position, where the target position is a midpoint between the first position and the second position.
The multifunctional mouse 100 provided in this embodiment acquires the eye focus displacement state through the eye tracker; acquires the movement time t1 of the eye focus; determines the moving average speed of the eye focus according to the movement time of the eye focus and the displacement state; and determines the position of the mouse cursor according to the moving average speed of the eye focus. With this mouse control method, when the eye tracking device is used to control the mouse cursor, the tracking of the eye focus can be compensated and corrected, avoiding inaccurate positioning of the mouse cursor caused by overly fast eye movement.
In a third aspect, the present application proposes a computer readable storage medium storing a computer program, wherein the computer program, when loaded and executed by a processor, implements a method according to any of the preceding claims.
In a fourth aspect, the present application proposes an electronic device, including a processor and a memory, wherein the memory is configured to store a computer program; the processor is configured to load and execute the computer program to implement the method as described above.
As shown in fig. 3, the electronic device 2000 may include a processor 2001.
Optionally, the electronic device 2000 may also include memory 2002 and/or a transceiver 2003.
The processor 2001 is coupled with the memory 2002 and the transceiver 2003, for example, by a communication bus.
The following describes the various constituent elements of the electronic device 2000 in detail with reference to fig. 3:
the processor 2001 is the control center of the electronic device 2000, and may be one processor or a plurality of processing elements. For example, the processor 2001 may be one or more central processing units (CPUs), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application, such as one or more digital signal processors (DSPs) or one or more field programmable gate arrays (FPGAs).
Alternatively, the processor 2001 may perform various functions of the electronic device 2000 by running or executing software programs stored in the memory 2002, and invoking data stored in the memory 2002.
In a particular implementation, the processor 2001 may include one or more CPUs, such as CPU0 and CPU1 shown in FIG. 3, as an example.
In a particular implementation, as one embodiment, the electronic device 2000 may also include multiple processors, such as the processor 2001 and processor 2004 shown in FIG. 3. Each of these processors may be a single-core processor (single-CPU) or a multi-core processor (multi-CPU). A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
The memory 2002 is used for storing a software program for executing the solution of the present application, and is controlled by the processor 2001 to execute the program, and the specific implementation may refer to the above method embodiment, which is not described herein again.
Alternatively, the memory 2002 may be a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disk storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, without limitation. The memory 2002 may be integrated with the processor 2001, or may exist separately and be coupled to the processor 2001 through an interface circuit (not shown in fig. 3) of the electronic device 2000, which is not specifically limited in this embodiment.
A transceiver 2003 for communication with other electronic devices. For example, the electronic device 2000 is a mobile terminal, and the transceiver 2003 may be used to communicate with a network device or with another terminal device. As another example, the electronic device 2000 is a network device and the transceiver 2003 may be used to communicate with a terminal device or with another network device.
Alternatively, transceiver 2003 may include a receiver and a transmitter (not separately shown in fig. 3). The receiver is used for realizing the receiving function, and the transmitter is used for realizing the transmitting function.
Alternatively, transceiver 2003 may be integrated with processor 2001, or may exist separately, and be coupled to processor 2001 through interface circuitry (not shown in fig. 3) of electronic device 2000, as embodiments of the present application are not specifically limited.
It should be noted that the structure of the electronic device 2000 illustrated in fig. 3 is not limited to the electronic device, and an actual electronic device may include more or fewer components than illustrated, or may combine some components, or may be different in arrangement of components.
In addition, for the technical effects of the electronic device 2000, reference may be made to the technical effects of the mouse control method described in the above method embodiments, which are not repeated here.
It should be appreciated that the processor in the embodiments of the present application may be a central processing unit (CPU), or may be another general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor.
It should also be appreciated that the memory in the embodiments of the present application may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct rambus RAM (DR RAM).
The above embodiments may be implemented in whole or in part by software, hardware (e.g., circuitry), firmware, or any combination thereof. When implemented in software, the above embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. When the computer instructions or computer programs are loaded or executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that contains one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium. The semiconductor medium may be a solid state disk.
It should be understood that the term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects, but may also indicate an "and/or" relationship, as can be understood from the context.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "At least one of" the following items or the like means any combination of these items, including any combination of single items or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be singular or plural.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that an article or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such article or apparatus. Without further limitation, an element defined by the phrase "comprising ..." does not exclude the presence of additional identical elements in an article or apparatus that comprises the element.
As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or, alternatively, distributed across multiple sites and interconnected by a communication network.
In summary, the present embodiment provides a multifunctional mouse, in which the eye tracker acquires the eye focus displacement state; the movement time t1 of the eye focus is acquired; the moving average speed of the eye focus is determined according to the movement time of the eye focus and the displacement state; and the position of the mouse cursor is determined according to the moving average speed of the eye focus. With this mouse control method, when the eye tracking device is used to control the mouse cursor, the tracking of the eye focus can be compensated and corrected, avoiding inaccurate positioning of the mouse cursor caused by overly fast eye movement.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting thereof; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application, and are intended to be included within the scope of the present application.

Claims (10)

1. A multifunctional mouse based on the Internet of things, characterized in that it comprises:
the mouse comprises a mouse body, wherein the mouse body comprises a WIFI communication module, a wireless charging module and a battery, and the wireless charging module is electrically connected with the battery;
a WIFI relay device; and
wherein the mouse body is in communication connection with the eye tracker through the WIFI relay device.
2. The multifunctional mouse based on the internet of things of claim 1, wherein the mouse body further comprises:
the displacement state acquisition module is used for acquiring an eye focus displacement state through the eye tracker;
the time acquisition module is used for acquiring the movement time t1 of the eye focus;
the average speed confirming module is used for determining the moving average speed of the eye focus according to the running time of the eye focus and the displacement state;
and the cursor confirmation module is used for determining the position of the mouse cursor according to the moving average speed of the eye focus.
3. A multifunctional mouse control method based on the internet of things, which is applicable to the multifunctional mouse according to any one of claims 1-2, and is characterized in that the method comprises the following steps:
acquiring an eye focus displacement state through the eye tracker;
acquiring the movement time t1 of the eye focus;
determining a moving average speed of the eye focus according to the running time of the eye focus and the displacement state;
and determining the position of the mouse cursor according to the moving average speed of the eye focus.
4. The method of claim 3, wherein determining the position of the mouse cursor based on the moving average velocity of the eye focus comprises:
acquiring a first position of an eye focus;
acquiring a second position of the eye focus, wherein the second position is the position of the eye focus after the movement time t1;
determining a target position according to the first position and the second position, wherein the target position is a midpoint between the first position and the second position;
and if the moving average speed of the eye focus is greater than a preset threshold v1, determining the position of the mouse cursor as the target position.
5. The method of claim 4, further comprising determining the position of the mouse cursor as the second position if the moving average velocity of the eye focus is less than or equal to the preset threshold v1.
6. The method of claim 3, wherein determining the position of the mouse cursor based on the moving average velocity of the eye focus comprises:
acquiring a first position of an eye focus;
acquiring a second position of the eye focus, wherein the second position is the position of the eye focus after the movement time t2;
acquiring a third position of the eye focus, wherein the third position is the position of the eye focus after the movement time t3, and the movement time t1 is equal to the sum of the movement time t2 and the movement time t3;
determining a target position according to the first position, the second position and the third position, wherein the target position is a geometric center formed by connecting lines of the first position, the second position and the third position;
and if the moving average speed of the eye focus is greater than a preset threshold v2, determining the position of the mouse cursor as the target position.
7. The method of claim 6, wherein the method further comprises: and if the moving average speed of the eye focus is smaller than or equal to the preset threshold v2, determining the position of the mouse cursor as the third position.
8. The mouse control method according to claim 3, wherein the determining the moving average speed of the eye focus according to the running time of the eye focus and the displacement state includes:
determining a displacement distance according to the displacement state;
and determining a moving average speed according to the displacement distance and the moving time t1.
9. The method according to claim 3, wherein before determining the position of the mouse cursor as the target position if the moving average speed of the eye focus is greater than a preset threshold v1, further comprising:
and acquiring the preset threshold v1.
10. A computer readable storage medium storing a computer program, which when loaded and executed by a processor, implements the method according to any of claims 3-9.
CN202311630091.2A (filed 2023-11-30, priority 2023-11-30): Multifunctional mouse based on Internet of things and control method thereof. Status: Pending. Publication: CN117631855A.

Priority Applications (1)

Application Number: CN202311630091.2A; Priority Date: 2023-11-30; Filing Date: 2023-11-30; Publication: CN117631855A; Title: Multifunctional mouse based on Internet of things and control method thereof

Applications Claiming Priority (1)

Application Number: CN202311630091.2A; Priority Date: 2023-11-30; Filing Date: 2023-11-30; Publication: CN117631855A; Title: Multifunctional mouse based on Internet of things and control method thereof

Publications (1)

Publication Number: CN117631855A; Publication Date: 2024-03-01

Family

ID=90026586

Family Applications (1)

Application Number: CN202311630091.2A; Priority Date: 2023-11-30; Filing Date: 2023-11-30; Status: Pending; Publication: CN117631855A; Title: Multifunctional mouse based on Internet of things and control method thereof

Country Status (1)

Country: CN; Publication: CN117631855A

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001117714A (en) * 1999-10-20 2001-04-27 Smk Corp Coordinate input device and method for controlling movement of cursor
WO2005059736A1 (en) * 2003-12-17 2005-06-30 National University Corporation Shizuoka University Device and method for controlling pointer by detecting pupil
CN101201695A (en) * 2006-12-26 2008-06-18 谢振华 Mouse system for extracting and tracing based on ocular movement characteristic
US20090196460A1 (en) * 2008-01-17 2009-08-06 Thomas Jakobs Eye tracking system and method
KR20110111830A (en) * 2010-04-05 2011-10-12 문장일 Glasses type mouse system
US20130169532A1 (en) * 2011-12-29 2013-07-04 Grinbath, Llc System and Method of Moving a Cursor Based on Changes in Pupil Position
US20140009395A1 (en) * 2012-07-05 2014-01-09 Asustek Computer Inc. Method and system for controlling eye tracking
US20150199005A1 (en) * 2012-07-30 2015-07-16 John Haddon Cursor movement device
KR20150025041A (en) * 2013-08-28 2015-03-10 삼성전자주식회사 Method and its apparatus for controlling a mouse cursor using eye recognition
CN103455147A (en) * 2013-09-10 2013-12-18 惠州学院 Cursor control method
WO2015133889A1 (en) * 2014-03-07 2015-09-11 -Mimos Berhad Method and apparatus to combine ocular control with motion control for human computer interaction
CN104731340A (en) * 2015-03-31 2015-06-24 努比亚技术有限公司 Cursor position determining method and terminal device
CN205983477U (en) * 2016-05-18 2017-02-22 北京森博克智能科技有限公司 Wired wireless changeable multimode mouse that possesses iris discernment and USB key function
CN106775023A (en) * 2017-01-09 2017-05-31 成都信息工程大学 Electro-ocular signal acquisition method and the bluetooth mouse system based on electro-ocular signal control
CN110446999A (en) * 2017-03-23 2019-11-12 谷歌有限责任公司 Ocular signal enhancing control
US20200117286A1 (en) * 2017-04-07 2020-04-16 Hewlett-Packard Development Company, L.P. Cursor positioning adjustments
CN109324703A (en) * 2018-11-30 2019-02-12 上海与德科技有限公司 A kind of mouse
KR20210073429A (en) * 2019-12-10 2021-06-18 한국전자기술연구원 Integration Interface Method and System based on Eye tracking and Gesture recognition for Wearable Augmented Reality Device
US20220334636A1 (en) * 2021-04-19 2022-10-20 Varjo Technologies Oy Display apparatuses and methods for calibration of gaze-tracking


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination