CN115291733B - Cursor control method and device - Google Patents


Info

Publication number
CN115291733B
CN115291733B (application CN202211187655.5A)
Authority
CN
China
Prior art keywords
distance
cursor
target object
movement
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211187655.5A
Other languages
Chinese (zh)
Other versions
CN115291733A (en)
Inventor
陈若楠
胡爽
柯乐思
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Joynext Technology Corp
Original Assignee
Ningbo Joynext Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Joynext Technology Corp
Priority to CN202211187655.5A
Publication of CN115291733A (application)
Application granted
Publication of CN115291733B (grant)
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects

Abstract

The application relates to a cursor control method and device. The method comprises the following steps: in response to a target object being identified by a camera device, starting a cursor control program and displaying a cursor in a display device; tracking a positioning point of the target object with the cursor control program, and determining whether the target object moves based on the tracking result; in response to the target object moving, acquiring the focus position of the camera device, the sensitivity information of the cursor control program, and the pre-movement and post-movement positions of the positioning point; determining a first distance and a second distance according to the focus position, the pre-movement position and the post-movement position; determining a cursor movement distance according to the sensitivity information, the first distance and the second distance; and controlling cursor movement according to the cursor movement distance. The method and device allow the user to control cursor movement by moving a palm within a larger spatial range, and adaptively calculate a suitable cursor movement speed so that quick adjustments can be made for different applications or users.

Description

Cursor control method and device
Technical Field
The present application relates to the field of human-computer interaction technologies, and in particular, to a cursor control method and apparatus.
Background
The following statements merely provide background information related to the present application and do not necessarily constitute prior art.
At present, when the motion of a palm is used to control the movement of a cursor on a screen, the cursor is generally positioned using the absolute position of the palm, which requires the palm to stay within a range directly in front of the screen. However, the inventors realized that in some scenarios the screen does not directly face the user; for example, a rear-row screen or a suspended screen in an automobile does not directly face the user. In such scenarios, the user must repeatedly place the palm directly in front of the screen to be recognized before the cursor can be controlled, which is clearly unfavorable for long-term operation and leaves the user constrained by the size and boundary of the operating space. In addition, the inventors also recognized that some solutions determine the moving speed of the cursor by learning the user's operation habits. However, such solutions require the user to operate the cursor control system for a period of time before the system can learn those habits, and they cannot be adjusted quickly for different applications or different users.
Disclosure of Invention
In view of the above disadvantages, embodiments of the present application provide a cursor control method, apparatus, computer device and storage medium, which allow a user to control cursor movement by moving a palm within a wider spatial range, and which adaptively calculate a suitable cursor movement speed so that quick adjustments can be made for different applications or different users.
The present application provides, in accordance with a first aspect, a cursor control method, which in some embodiments comprises:
in response to the target object being identified by the camera device, starting a cursor control program and displaying a cursor in the display device;
tracking a positioning point of the target object by using a cursor control program, and determining whether the target object moves or not based on a tracking result;
in response to the target object moving, acquiring the focus position of the camera device, the sensitivity information of the cursor control program, and the pre-movement position and post-movement position of the positioning point;
determining a first distance and a second distance according to the focus position, the position before movement and the position after movement, wherein the first distance represents the movement distance of the positioning point, and the second distance represents the distance between the positioning point and the camera device;
and determining the cursor movement distance according to the sensitivity information, the first distance and the second distance, and controlling the cursor to move according to the cursor movement distance.
In some embodiments, determining the first distance and the second distance from the focal position, the pre-movement position, and the post-movement position comprises:
determining a first distance according to the position before the movement and the position after the movement;
the second distance is determined from the camera focus position and the pre-movement position.
In some embodiments, determining the cursor movement distance from the sensitivity information, the first distance, and the second distance comprises:
acquiring a preset standard distance, wherein the preset standard distance is a preset value representing the distance between a first preset positioning point and the camera device;
and determining the cursor movement distance according to the sensitivity information, the preset standard distance, the first distance and the second distance.
In some embodiments, determining the cursor movement distance according to the sensitivity information, the preset standard distance, the first distance and the second distance includes:
determining a cursor movement distance based on the following formula:

L = S × D1 × D2 / Ds

wherein L denotes the cursor movement distance, S denotes the sensitivity information, D1 denotes the first distance, D2 denotes the second distance, and Ds denotes the preset standard distance.
In some embodiments, displaying a cursor in a display device includes: displaying a cursor at an initial position of a display device;
the method further comprises the following steps:
and in response to recognizing that the target object has completed a preset action, switching the display position of the cursor in the display device from the current position to the initial position.
In some embodiments, controlling cursor movement according to cursor movement distance includes:
acquiring a coordinate before movement of a cursor, wherein the coordinate before movement refers to a coordinate corresponding to a display position of the cursor in a display device before movement;
determining a moving direction of the target object;
determining the coordinates of the cursor after movement according to the cursor movement distance, the coordinates of the cursor before movement and the movement direction of the target object;
and controlling the cursor to move from the display position corresponding to the coordinates before moving to the display position corresponding to the coordinates after moving.
In some embodiments, the target object is a hand; tracking the positioning point of the target object by using a cursor control program, comprising:
acquiring a color image containing a hand by an image pickup device;
detecting joint points of a hand in the color image through a hand joint point detection model;
extracting palm center joint points related to the palm center from the joint points of the hand;
and determining the positioning points of the hands according to the number of the palm center joint points.
In some embodiments, determining the location point of the hand based on the number of metacarpal joint points comprises:
in response to the number of the palm center joint points being 1, taking the extracted palm center joint point as the positioning point of the hand;
in response to the number of the palm center joint points being 2, calculating the midpoint of the line connecting the two extracted palm center joint points, and taking that midpoint as the positioning point of the hand;
and in response to the number of the palm center joint points being more than 2, calculating the center of gravity of the polygon formed by the extracted palm center joint points, and taking that center of gravity as the positioning point of the hand.
In some embodiments, in response to identifying the target object by the camera, the method further comprises:
in response to the plurality of potential objects identified by the camera device, determining a positioning point of each current potential object and a distance between the positioning point of each potential object and a second preset positioning point, and locking the potential object with the closest distance between the positioning point of the plurality of potential objects and the second preset positioning point as a target object;
or, in response to the plurality of potential objects being identified by the camera device, locking the potential object which completes the preset starting control action first in the plurality of potential objects as the target object.
In some embodiments, in tracking the location point of the target object by using the cursor control program, the method further includes:
stopping tracking the positioning point of the target object by using the cursor control program in response to receiving the ending control instruction, and re-determining a new target object by using the camera device; the ending control instruction is triggered when the target object is determined to be away from the visual field range of the camera device, or when the target object is detected to complete a preset ending control action.
In some embodiments, the method is applied to a vehicle-mounted system, the camera device is an in-vehicle camera, and the display device is a vehicle-mounted display screen.
The present application provides according to a second aspect a cursor control device, which in some embodiments comprises:
the initialization module is used for responding to the identification of the target object through the camera device, starting a cursor control program and displaying a cursor in the display device;
the tracking module is used for tracking the positioning point of the target object by using the cursor control program and determining whether the target object moves or not based on the tracking result;
the information acquisition module is used for responding to the movement of the target object, acquiring the focus position of the camera device, the sensitivity information of the cursor control program and acquiring the position before the movement and the position after the movement of the positioning point;
the distance determining module is used for determining a first distance and a second distance according to the focus position, the position before movement and the position after movement, wherein the first distance represents the movement distance of the positioning point, and the second distance represents the distance between the positioning point and the camera device;
and the cursor control module is used for determining the cursor movement distance according to the sensitivity information, the first distance and the second distance and controlling the cursor to move according to the cursor movement distance.
According to a third aspect, the present application provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the cursor control method provided in any of the above embodiments when executing the computer program.
The present application provides according to a fourth aspect a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the cursor control method provided in any of the embodiments described above.
When calculating the cursor movement distance, the embodiments of the present application combine the distance from the target object to the camera with the sensitivity information, so that a suitable cursor movement speed can be calculated adaptively. Relative positions are also used to calculate the cursor movement distance, which allows the user to control cursor movement by moving a target object, such as a palm, within a larger spatial range, and enables quick adjustment for different applications or different users.
Furthermore, the embodiments of the present application ensure the stability of cursor movement by introducing a preset standard distance, thereby improving the user's sense of control over the cursor. In addition, the embodiments also allow the user to reset the display position of the cursor in the display device by completing a preset action, so that the user can control the cursor more comfortably.
Drawings
Fig. 1 is a schematic flowchart of a cursor control method according to some embodiments of the present application;
FIG. 2 is a schematic flow chart illustrating a process for determining an anchor point of a target object according to some embodiments of the present disclosure;
FIG. 3 illustrates joint points of a hand detected by a hand joint point detection model according to some embodiments of the present application;
FIG. 4 is a block diagram of a cursor control device according to some embodiments of the present application;
fig. 5 is an internal structural diagram of a computer device according to some embodiments of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative and are not intended to limit the present application.
The cursor control method provided by the embodiment of the application comprises the steps shown in fig. 1, namely steps S110-S150. The following description will be given by taking an example in which the method is applied to a cursor control device, which may be a personal computer, a vehicle-mounted device, or a server.
S110: and in response to the target object being identified by the camera device, starting a cursor control program to display a cursor on the display device.
In some embodiments, the cursor control device detects whether the target object exists in a specific environment through the camera device. For example, the detection process may be as follows: an image of the specific environment is captured by the camera device and then analyzed. If external objects are identified in the captured image, their number is determined; if the number of identified external objects is 1, that object is locked as the target object; if the number is greater than 1, the target object is determined from the identified external objects based on a preset policy (the policy content will be described in detail later through some embodiments). If the target object exists in the specific environment, the flow of controlling cursor movement with the target object is started (i.e., steps S120 to S150 described later); if not, detection through the camera device continues. Optionally, the user may first enable the function of controlling the cursor with the target object, after which the cursor control device detects whether the target object exists in the specific environment through the camera device.
The camera device is a depth camera or an RGB-D (Red Green Blue-Depth) camera, and the captured image may be a color image containing depth information, such as a depth image or an RGB-D image. The specific environment may be the entire field of view of the camera device. However, the entire field of view may be relatively large, and if multiple objects appear within it, the determination result may be disturbed; furthermore, if the target object is within the field of view but far from the camera device, the accuracy of subsequently identifying and tracking its positioning point may be affected. Therefore, the specific environment may be only a partial field of view of the camera device. Illustratively, the specific environment may be the space within 0-1 meter of the camera device within its entire field of view; if an external object is more than 1 meter from the camera device, the target object may be determined to be absent from the specific environment even if the object can be identified in the captured image.
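The 0-1 meter gating described above reduces to a simple depth check. The following is an illustrative Python sketch only; the function name, the depth representation in meters, and the default range are assumptions, not part of the original disclosure:

```python
def in_specific_environment(object_depth_m, max_range_m=1.0):
    """Treat an identified external object as present in the specific
    environment only when it lies within the partial field of view
    (here assumed to be 0-1 meter from the camera device)."""
    return 0.0 <= object_depth_m <= max_range_m

print(in_specific_environment(0.6))  # True: within the 0-1 m range
print(in_specific_environment(1.4))  # False: identified, but too far away
```

An object beyond the range is simply ignored for target selection, even if it was recognized in the captured image.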
The target object refers to an external object for controlling the movement of the cursor, and may be, for example, a head, a hand, and/or an eye. The camera device can be a camera unit in the cursor control device and can also be an external camera device of the cursor control device. Regarding the setting position of the image capturing device, the embodiment of the present application is not particularly limited, and the target object may be captured; in some embodiments, the position of the camera device can be adjusted by the user according to the needs of the user.
S120: the method includes tracking a location point of a target object with a cursor control program, and determining whether the target object moves based on a tracking result.
In some embodiments, after the process is started, the image capturing device continuously captures (or records) an image including the target object, the cursor control device detects the image captured by the image capturing device by using a cursor control program to identify a positioning point of the target object (i.e., a point for positioning a display position of a cursor), and tracks the positioning point, and if a distance between two identified positioning points exceeds a preset threshold, it is determined that the target object moves. The cursor control device determines the corresponding display position of the cursor in the display device through the positioning point, and correspondingly controls the cursor to move to the new display position when detecting that the positioning point moves to the new position. Wherein, the existing target tracking technology can be adopted to track the positioning point of the target object. In addition, the preset threshold and the frequency of the locating point of the identified target object can be flexibly adjusted according to a specific application scenario, which is not specifically limited in the embodiment of the present application.
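The movement decision above, comparing two successively identified positioning points against a preset threshold, can be sketched as follows. The 3D point representation, units, and threshold value are illustrative assumptions:

```python
import math

def target_moved(prev_point, curr_point, threshold=0.02):
    """Report movement when the tracked positioning point has shifted
    farther than the preset threshold (units follow the camera's space)."""
    return math.dist(prev_point, curr_point) > threshold

# Jitter below the threshold is ignored; a larger shift counts as movement.
print(target_moved((0.0, 0.0, 0.5), (0.005, 0.0, 0.5)))  # small jitter
print(target_moved((0.0, 0.0, 0.5), (0.10, 0.0, 0.5)))   # clear movement
```

In practice the threshold and the sampling frequency would be tuned per application, as the paragraph above notes.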
In some embodiments, if the target object is a hand, the above step of determining the positioning point of the target object, as shown in fig. 2, may include steps S121-S124.
S121: a color image including the hand, such as an RGB (Red Green Blue) image, is acquired by the image pickup device.
The color image may be an image directly captured by the image capturing device or a video frame extracted from a video captured by the image capturing device.
S122: joint points of the hand in the color image are detected by the hand joint point detection model.
The embodiment of the present application does not limit the selection of the hand joint point detection model, and specifically may be a MediaPipe model or other AI (Artificial Intelligence) models for hand joint point detection. Illustratively, the articulation points of the palm detected by the hand articulation point detection model may be the various articulation points identified as 1-20 as shown in FIG. 3.
S123: and extracting palm joint points related to the palm from the joint points of the hand.
After detecting the joint points of the hand, the cursor control device extracts the palm center joint points from the hand. The metacarpal joint points refer to joint points around the palm of the hand, such as joint points 0, 5, 9, 13 and 17 shown in fig. 3, and the remaining joint points may be referred to as finger joint points. Of course, which joint points of the hand are taken as the palm joint points can be selected according to actual needs.
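Assuming the 21-point hand model of fig. 3 with palm center joints 0, 5, 9, 13 and 17, the extraction step might look like the sketch below; the list-of-tuples joint representation is an assumption:

```python
PALM_JOINT_IDS = (0, 5, 9, 13, 17)  # palm-edge joints in a 21-point hand model

def extract_palm_joints(hand_joints):
    """Pick the palm center joint points out of all detected hand joints."""
    return [hand_joints[i] for i in PALM_JOINT_IDS]

# 21 dummy joints labeled by their index
joints = [(i, i) for i in range(21)]
print(extract_palm_joints(joints))  # [(0, 0), (5, 5), (9, 9), (13, 13), (17, 17)]
```

As the text notes, which joints count as palm center joints can be chosen per application by changing the index set.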
S124: and determining the positioning point of the hand according to the number of the palm center joint points.
The cursor control device, after extracting the palm joint points, determines the positioning points of the hand based on the number of extracted palm joint points. In some embodiments, if the number of the palm joint points is 1, the extracted palm joint points are directly used as positioning points of the hand; if the number of the palm center joint points is 2, the connecting line midpoint of the extracted palm center joint points can be calculated, and then the connecting line midpoint is used as a positioning point of the hand; if the number of the palm center joint points is more than 2, the center of gravity of a polygon formed by the extracted palm center joint points is calculated, and then the center of gravity of the polygon is used as a positioning point of the hand.
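The three cases above can be sketched in one helper, assuming 2D joint coordinates and approximating the polygon's center of gravity by the vertex centroid (a simplification; the patent does not specify how the center of gravity is computed):

```python
def hand_anchor(palm_points):
    """Derive the hand's positioning point from 1, 2, or more palm joint points."""
    n = len(palm_points)
    if n == 1:
        return palm_points[0]                       # the joint point itself
    if n == 2:                                      # midpoint of the connecting line
        (x1, y1), (x2, y2) = palm_points
        return ((x1 + x2) / 2, (y1 + y2) / 2)
    # n > 2: vertex centroid as a stand-in for the polygon's center of gravity
    xs = [p[0] for p in palm_points]
    ys = [p[1] for p in palm_points]
    return (sum(xs) / n, sum(ys) / n)

# e.g. five palm joints forming a small polygon
print(hand_anchor([(0, 0), (2, 0), (2, 2), (1, 3), (0, 2)]))  # (1.0, 1.4)
```

Averaging several palm joints makes the positioning point less sensitive to single-joint detection noise, which matches the stability claim that follows.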
Through steps S121 to S124, the embodiment of the present application can stably determine the location point of the hand.
Of course, the above describes only some ways of determining the positioning point of the target object; other ways are possible. For example, after the joint points of the hand are detected in the color image by the hand joint point detection model, the joint point of the index fingertip (8 in fig. 3) may be determined as the positioning point of the target object. As another example, if the target object is a head, either eye may be determined as the positioning point, or the midpoint of the line connecting the two eyes may be used.
S130: in response to the target object moving, the focus position of the imaging device, the sensitivity information of the cursor control program, and the pre-movement position and the post-movement position of the positioning point are acquired.
S140: the first distance and the second distance are determined according to the focal position, the pre-movement position and the post-movement position.
S150: and determining the cursor movement distance according to the sensitivity information, the first distance and the second distance, and controlling the cursor to move according to the cursor movement distance.
The sensitivity information (hereinafter referred to as sensitivity) represents the moving speed of the cursor. It can be preset to any value according to the specific application scenario, and after being preset it can be adjusted by the user as required.
When the cursor control device detects that the target object moves, the display position of the cursor in the display device needs to be changed correspondingly. In some embodiments, the cursor control device acquires the focus position and sensitivity of the image pickup device, and acquires the pre-movement position and the post-movement position of the positioning point, and calculates the cursor movement distance based on the acquired data.
The cursor control device determines a first distance and a second distance according to the focus position, the pre-movement position and the post-movement position, wherein the first distance represents the movement distance of the positioning point in the specific environment (i.e., the distance between the position before the movement and the position after the movement), and the second distance represents the actual distance between the positioning point and the camera device. In some embodiments, the first distance may be determined from the pre-movement position and the post-movement position, and the second distance may be determined from the focus position and the pre-movement position. After the first distance and the second distance are determined, the cursor movement distance is determined according to the sensitivity, the first distance and the second distance.
In some implementations, the cursor movement distance can be determined based on the following formula:

L = S × D1 × D2

wherein L denotes the distance the cursor moves, S denotes the sensitivity, D1 denotes the first distance, and D2 denotes the second distance.
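As a hedged numeric illustration of this formula, assuming the reconstructed form L = S × D1 × D2, where a larger point-to-camera distance scales up the same recognized displacement:

```python
def cursor_distance(sensitivity, d1, d2):
    """L = S * D1 * D2: the recognized displacement (first distance) scaled
    by the point-to-camera distance (second distance) and the sensitivity."""
    return sensitivity * d1 * d2

# The same recognized displacement yields a larger cursor move when the
# positioning point is farther from the camera device.
print(cursor_distance(1.5, 10.0, 0.5))  # 7.5
print(cursor_distance(1.5, 10.0, 1.0))  # 15.0
```

Raising the sensitivity S scales the cursor speed uniformly, which is what lets different applications or users be accommodated by a single preset value.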
In some embodiments, even if the target object has not been detected to move, the cursor movement speed may be pre-calculated based on the current distance between the target object and the camera device and the preset sensitivity. When the target object is then detected to move, only the movement distance of the target object needs to be acquired, and the cursor movement distance can be determined from that distance and the pre-calculated cursor movement speed. In this way, the user's cursor control operations receive a faster response, the delay between the target object and the cursor during movement is reduced, and the user obtains a better cursor control experience.
When controlling the cursor to move according to the cursor movement distance, the pre-movement coordinates of the cursor (the coordinates corresponding to its display position in the display device before the movement) and the movement direction of the target object may be obtained first. The post-movement coordinates are then determined from the cursor movement distance, the pre-movement coordinates and the movement direction, and finally the cursor is controlled to move from the display position corresponding to the pre-movement coordinates to the display position corresponding to the post-movement coordinates.
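The coordinate update described above can be sketched as follows; the unit-direction representation and the clamping to screen bounds are illustrative assumptions:

```python
def move_cursor(pre_xy, distance, direction, screen=(1920, 1080)):
    """Compute the post-movement cursor coordinate from the pre-movement
    coordinate, the cursor movement distance, and a unit direction vector,
    clamped to the display bounds."""
    x = pre_xy[0] + distance * direction[0]
    y = pre_xy[1] + distance * direction[1]
    x = min(max(x, 0), screen[0] - 1)
    y = min(max(y, 0), screen[1] - 1)
    return (x, y)

print(move_cursor((100, 100), 50, (1, 0)))   # (150, 100)
print(move_cursor((1900, 100), 50, (1, 0)))  # clamped at the right edge
```

Clamping keeps the cursor on the display even when the computed movement would overshoot the screen boundary.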
When calculating the cursor movement distance, the embodiments of the present application combine the distance from the target object to the camera device with the cursor sensitivity, so a suitable cursor movement speed can be calculated adaptively. Relative positions are also used to calculate the cursor movement distance, which allows the user to control cursor movement by moving a target object, such as a palm, within a larger spatial range, and allows the cursor movement speed to be adjusted quickly for different applications or different users.
Based on the above embodiment, this embodiment further describes and optimizes the technical solution. As a preferred implementation, the step of determining the cursor movement distance according to the sensitivity, the first distance and the second distance includes: acquiring a preset standard distance, where the preset standard distance is a preset value representing the distance between a first preset positioning point and the camera device; and determining the cursor movement distance according to the sensitivity, the standard distance, the first distance and the second distance. The preset standard distance is set in advance by a developer.
When a target object is used to control the movement of a cursor on the display device, the moving distance of the target object in the real world (denoted Sr) must first be recognized, and Sr is then mapped, by a specific mapping scheme, to the distance the cursor travels on the display device (denoted Sv). However, the inventors found that when the actual moving distance of the target object is held constant, changing the distance between the target object and the imaging device changes the recognized Sr, and therefore the mapped Sv as well. In other words, for the same actual movement, the farther the target object is from the imaging device, the smaller the obtained Sr, and the closer the target object is to the imaging device, the larger the obtained Sr. That is, when a user moves the target object the same distance in the real world, the distance the cursor moves on the display device varies with the distance between the target object and the imaging device. Yet in practice a user easily changes that distance without noticing (and understandably, the more sensitive the cursor control device is to this distance, the more easily an ordinary action changes it), which makes it hard for the user to estimate how far the cursor will move for a given movement; as a result, the user's sense of control over the cursor is weak and the cursor is difficult to operate.
In some embodiments, a standard value, namely the preset standard distance, is introduced in the embodiments of the present application to convert the recognized first distance into a normalized distance, so as to reduce the influence of changes in the target-object-to-camera distance on the cursor movement distance and strengthen the user's sense of control over the cursor. Specifically, when determining the cursor movement distance according to the sensitivity, the first distance and the second distance, the cursor control device first obtains the preset standard distance, and then determines the cursor movement distance according to the sensitivity, the preset standard distance, the first distance and the second distance. The standard distance represents the distance from a preset positioning point to the camera device. It should be noted that the preset standard distance is a fixed value set in advance by a developer; the user cannot adjust it.
Further, in some embodiments, the cursor movement distance may be determined based on the following formula:

L = S × D1 × D2 / Ds

where L indicates the distance the cursor moves, S represents the sensitivity, D1 represents the first distance, D2 represents the second distance, and Ds indicates the standard distance.
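As a minimal sketch, the distance computation can be coded directly; the product-and-ratio form L = S × D1 × D2 / Ds is an assumption reconstructed from the symbol definitions in this section (cursor distance from sensitivity, first distance, second distance and standard distance), and the function name is illustrative:

```python
def cursor_movement_distance(sensitivity, first_distance, second_distance,
                             standard_distance):
    """L = S * D1 * D2 / Ds: scaling by the anchor-to-camera distance D2
    compensates for far movements appearing smaller to the camera, and
    dividing by the fixed standard distance Ds normalizes the result."""
    return sensitivity * first_distance * second_distance / standard_distance
```

At the standard depth (D2 equal to Ds) the formula reduces to sensitivity times the recognized movement, which matches the stated goal of making the cursor distance insensitive to the user's depth.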
Based on the above embodiments, this embodiment further describes and optimizes the technical solution. As a preferred implementation, displaying a cursor in a display device includes: displaying the cursor at an initial position of the display device. That is, when the target object is first recognized, the cursor is displayed at the initial position of the display device.
The cursor control method further comprises the following steps: and switching the display position of the cursor in the display device from the current position to the initial position in response to recognizing that the target object completes the preset action.
In some embodiments, when the cursor control device recognizes the target object through the image capturing device and starts the process, a cursor control program may be launched to perform the initialization of cursor control, that is, to acquire various parameters (which may include the sensitivity, the preset standard distance, the focus position of the image capturing device, and other data from the above embodiments) and to display the cursor at a predetermined initial position. The initial position (the initial display position of the cursor on the display device) can be set according to the actual situation; optionally, it may be set at the center of the display device; optionally, it may be adjusted by the user at any time as required, and if the user does not adjust it, it may default to the center of the display device. After the process starts, the cursor control device continuously tracks the positioning point of the target object and, when the positioning point moves, synchronously displays the cursor's movement track on the display device.
Consider the case where, when the target object is first recognized and the process starts, the target object happens to be at the edge of the camera device's field of view (the upper, lower, left or right edge). When the target object moves near this edge, it easily leaves the field of view, interrupting the cursor control process and degrading the user's operation experience. Moreover, in this situation, to control the cursor more comfortably, the user usually moves the target object from the edge of the field of view toward its center. Moving the target object, however, also moves the cursor: if the cursor's initial position is the center of the display, then as the user moves the target object from the edge of the field of view to its center, the cursor's display position moves from the center of the display to its edge, and the user still cannot control the cursor comfortably.
To address this deficiency, the embodiments of the present application provide a scheme for resetting the cursor display position: if the cursor control device recognizes during the process that the target object has completed the preset action, the cursor's display position on the display device is switched from its current position back to the initial position. For example, after moving the target object from the edge of the camera device's field of view to its center, the user may perform the preset action to move the cursor from the edge of the screen back to the center of the screen, and can then manipulate the cursor comfortably. In addition, by resetting the cursor display position through the preset action, any spatial position within the camera device's field of view (or within a specific range) can be mapped to the center of the display device, so the movable range of the target object is no longer limited by the position and size of the display device, improving the comfort of cursor control.
In some specific scenarios, for example when a user inside a vehicle controls the cursor of a vehicle-mounted display screen through a target object (such as a hand), the space in the vehicle is limited and the user's body and hand movements are constrained. Forcing the user to adapt their movements to the camera device's field of view would therefore make operation difficult and prevent free control of the cursor; the cursor-reset scheme above lets the user remap a comfortable spatial region to the screen instead.
It should be noted that, the preset action is not specifically limited in the embodiments of the present application, and the preset action may be set according to a specific application scenario, for example, in some embodiments, if the target object is a palm, the preset action may be to make a quick fist, make a fist for a preset duration, and the like; for another example, assuming that the target object is an eye, the preset action may be blinking n times in succession, closing the eye for a preset duration, and the like.
In some embodiments, in response to identifying the target object by the camera, the method further comprises:
in response to a plurality of potential objects being identified by the camera device, determining the positioning point of each current potential object and the distance between each potential object's positioning point and a second preset positioning point, and locking, among the plurality of potential objects, the potential object whose positioning point is closest to the second preset positioning point as the target object;
or, in response to a plurality of potential objects being identified by the camera device, locking, among the plurality of potential objects, the potential object that first completes a preset start-control action as the target object.
In view of the fact that in some scenarios, the number of external objects recognized by the imaging device is multiple (i.e., greater than 1), the present embodiment provides two embodiments for determining a target object from the multiple external objects. The present embodiment refers to the identified plurality of external objects as potential objects (potential target objects).
In one embodiment, the target object is determined by comparing the distance between each potential object's positioning point and a preset positioning point. To distinguish this preset positioning point from the one used to determine the preset standard distance in the foregoing embodiment, the former is referred to as the second preset positioning point and the latter as the first preset positioning point. Specifically, this embodiment takes the potential object whose positioning point is closest to the second preset positioning point as the target object. Optionally, the potential object whose positioning point is farthest from the second preset positioning point may instead be used as the target object. For the manner of determining a potential object's positioning point, reference may be made to the related embodiments above, which are not repeated here.
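The nearest-anchor locking rule of this embodiment can be sketched as follows; the point representation and function name are assumptions for illustration:

```python
import math

def lock_target(anchor_points, second_preset_point):
    """Return the anchor point closest to the second preset positioning
    point; the corresponding potential object is locked as the target."""
    return min(anchor_points,
               key=lambda p: math.dist(p, second_preset_point))
```

The farthest-point variant mentioned above would simply swap `min` for `max`.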
In another embodiment, the target object is determined by comparing the time for each potential object to complete a predetermined initiating control action. Specifically, the present embodiment takes a potential object, which is detected as the earliest to complete the preset start control action, as a target object. The preset starting control action can be flexibly set according to actual needs, and the embodiment is not limited.
In the above embodiment, after the target object is locked from the plurality of potential objects, the movement of the cursor is controlled according to the movement of the target object, and even if other potential objects are still in the field of view of the imaging device, the control of the cursor by the user corresponding to the target object is not affected.
In some embodiments, in tracking the location point of the target object by using the cursor control program, the method further includes:
stopping tracking the positioning point of the target object by using the cursor control program in response to receiving the ending control instruction, and re-determining a new target object by using the camera device; the ending control instruction is triggered when the target object is determined to be out of the visual field range of the camera device, or when the target object is detected to complete a preset ending control action.
In this embodiment, when the target object is tracked, if it is determined that the current target object leaves the visual field range of the image capturing device, or it is detected that the current target object completes the preset ending control action, the positioning point of the current target object is no longer tracked, the cursor is no longer controlled to move according to the current target object, and then the process of determining the target object is performed again through the image capturing device.
In some scenarios, the user corresponding to the current target object no longer wants to control the cursor, or needs to hand control of the cursor to another user. The target object may then be moved out of the camera device's field of view, or a preset action (the preset end-control action) may be performed deliberately, to instruct the cursor control device to stop the cursor control process and determine a new target object.
Further, after the current target object is detected to perform the preset ending control action, in order to avoid determining the current target object as a new target object again, after all potential objects are detected, the current target object may be removed from the detected potential objects, and then the determination operation of the target object may be performed.
In some embodiments, the cursor control device may be applied to a vehicle-mounted system, the camera may be an in-vehicle camera, and the display device may be a vehicle-mounted display screen.
Further, if the cursor control device serves the driver, then to ensure driving safety, the cursor control program may be closed automatically, or the cursor control function may even be disabled directly, when the current vehicle speed is detected to be non-zero. If the cursor control device serves a non-driver, whether the cursor control function is activated need not be limited by the current vehicle speed.
Further, in some scenarios the vehicle may shake while driving, causing the target object to shake as it moves. For example, when a user in the vehicle controls the cursor with a hand and the vehicle passes over a bumpy road section, the user's hand may tremble and the cursor becomes hard to control. To address this, when the cursor control program tracks the positioning point of the target object, the jitter in the target object's movement track may be removed by a filtering algorithm, making the cursor movement smoother and the cursor easier to control.
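The patent does not name a particular filtering algorithm; a simple exponential smoother is one common choice for damping jitter in a tracked trajectory, and is sketched below as an assumption:

```python
class ExponentialSmoother:
    """One-pole low-pass filter for damping hand jitter in the tracked
    anchor-point trajectory; smaller alpha means smoother but laggier."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self._state = None

    def update(self, point):
        if self._state is None:
            self._state = tuple(point)  # first sample passes through
        else:
            self._state = tuple(
                self.alpha * new + (1 - self.alpha) * old
                for new, old in zip(point, self._state))
        return self._state
```

Feeding each tracked positioning point through `update` before computing the first distance would suppress high-frequency shake from a bumpy road while preserving deliberate movements.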
It should be noted that, with respect to the steps included in the cursor control method provided in any of the above embodiments, unless explicitly stated herein, the execution of the steps is not strictly limited in order, and the steps may be executed in other orders. Moreover, at least some of the steps may include multiple sub-steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of performing the sub-steps or stages is not necessarily sequential, but may be performed alternately or alternately with other steps or at least some of the sub-steps or stages of other steps.
Based on the same inventive concept, the application also provides a cursor control device. In some embodiments, as shown in FIG. 4, the cursor control device includes the following modules:
an initialization module 110, configured to start a cursor control program in response to the target object being identified by the image capture device, and display a cursor on the display device;
a tracking module 120 for tracking a positioning point of the target object by using a cursor control program and determining whether the target object moves based on the tracking result;
an information acquisition module 130, configured to acquire a focal position of the imaging device, sensitivity information of the cursor control program, and a pre-movement position and a post-movement position of the positioning point in response to movement of the target object;
a distance determining module 140, configured to determine a first distance and a second distance according to the focus position, the before-movement position, and the after-movement position, where the first distance represents a movement distance of the positioning point, and the second distance represents a distance between the positioning point and the camera device;
and the cursor control module 150 is configured to determine a cursor movement distance according to the sensitivity information, the first distance and the second distance, and control cursor movement according to the cursor movement distance.
In some embodiments, the distance determination module 140, when determining the first distance and the second distance from the camera focus position, the pre-movement position, and the post-movement position, is configured to:
determining a first distance according to the position before the movement and the position after the movement;
the second distance is determined from the camera focus position and the pre-movement position.
In some embodiments, the cursor control module 150, when determining the cursor movement distance based on the sensitivity, the first distance, and the second distance, is configured to:
acquiring a preset standard distance, wherein the preset standard distance is a preset value representing the distance between a first preset positioning point and the camera device;
and determining the cursor movement distance according to the sensitivity information, the preset standard distance, the first distance and the second distance.
In some embodiments, when the cursor control module 150 determines the cursor moving distance according to the sensitivity information, the preset standard distance, the first distance and the second distance, it is configured to:
determining the cursor movement distance based on the following formula:

L = S × D1 × D2 / Ds

where L indicates the distance the cursor moves, S represents the sensitivity information, D1 represents the first distance, D2 represents the second distance, and Ds represents the preset standard distance.
In some embodiments, the initialization module, when displaying the cursor in the display device, is to: displaying a cursor at an initial position of a display device;
the above-mentioned device still includes:
and a resetting module (not shown in the figure) for switching the display position of the cursor in the display device from the current position to the initial position in response to the recognition that the target object completes the preset action.
In some embodiments, the cursor control module 150 is configured to, when controlling the cursor movement according to the cursor movement distance:
acquiring a coordinate before movement of a cursor, wherein the coordinate before movement refers to a coordinate corresponding to a display position of the cursor in a display device before movement;
determining a moving direction of the target object;
determining the coordinates of the cursor after movement according to the cursor movement distance, the coordinates of the cursor before movement and the movement direction of the target object;
and controlling the cursor to move from the display position corresponding to the coordinates before moving to the display position corresponding to the coordinates after moving.
In some embodiments, the tracking module 120 is also used to determine the location point of the target object using a cursor control program.
In some embodiments, the target object is a hand; accordingly, the tracking module 120, when determining the location point of the target object using the cursor control program, is configured to:
acquiring a color image containing a hand by an image pickup device;
detecting joint points of a hand in the color image through a hand joint point detection model;
extracting palm joint points related to the palm of the hand from the joint points of the hand;
and determining the positioning points of the hands according to the number of the palm center joint points.
Further, in some embodiments, the tracking module 120, when determining the location point of the hand according to the number of the palm joint points, is configured to:
in response to the fact that the number of the palm center joint points is 1, the extracted palm center joint points are used as positioning points of the hand;
in response to that the number of the palm center joint points is 2, calculating the connecting line midpoints of the extracted palm center joint points, and taking the connecting line midpoints as positioning points of the hands;
and in response to the number of the palm joint points being more than 2, calculating the gravity center of the polygon formed by the extracted palm joint points, and taking the gravity center of the polygon as the positioning point of the hand.
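The three cases above (one, two, or more than two palm-centre joints) can be sketched as below; for more than two joints the vertex mean is used as a simple stand-in for the polygon's centre of gravity, an assumption since the patent does not specify the centroid computation:

```python
def hand_anchor_point(palm_joints):
    """Map the extracted palm-centre joints to a single positioning point:
    one joint -> the joint itself; two -> the midpoint of their segment;
    more -> the vertex mean of the polygon they form."""
    n = len(palm_joints)
    if n == 1:
        return palm_joints[0]
    return (sum(p[0] for p in palm_joints) / n,
            sum(p[1] for p in palm_joints) / n)
```

For two joints the vertex mean is exactly the midpoint of their connecting line, so one expression covers both remaining cases.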
In some embodiments, the apparatus further comprises a target object locking module (not shown). Before responding to the identification of the target object by the camera device, a target object locking module for:
in response to a plurality of potential objects being identified by the camera device, determining the positioning point of each current potential object and the distance between each potential object's positioning point and a second preset positioning point, and locking, among the plurality of potential objects, the potential object whose positioning point is closest to the second preset positioning point as the target object;
or, in response to a plurality of potential objects being identified by the camera device, locking, among the plurality of potential objects, the potential object that first completes a preset start-control action as the target object.
In some embodiments, in the process of tracking the location point of the target object by using the cursor control program, the tracking module 120 stops tracking the location point of the target object by using the cursor control program in response to receiving the end control instruction; the target object locking module determines a new target object again through the camera device; the ending control instruction is triggered when the target object is determined to leave the visual field range of the camera device, or triggered when the target object is detected to complete a preset ending control action.
In some embodiments, the device is applied to a vehicle-mounted system, the camera device is a vehicle-mounted camera, and the display device is a vehicle-mounted display screen.
For the specific definition of the cursor control device, reference may be made to the definition of the cursor control method above, and details are not described here. All or part of the modules in the cursor control device can be realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent of a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In some embodiments, a computer device is provided, the internal structure of which may be as shown in fig. 5.
The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing preset parameters such as a focal position, sensitivity, a standard distance and the like of the camera, and also can store data such as images shot by the camera, and the specific stored data can be referred to the definition in the above method embodiment. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a cursor control method.
It will be appreciated by those skilled in the art that the configuration shown in fig. 5 is a block diagram of only a portion of the configuration associated with the present application, and is not intended to limit the computing device to which the present application may be applied, and that a particular computing device may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
The present embodiment also provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the steps of the cursor control method provided in any of the above embodiments can be implemented.
In some embodiments, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, realizes the steps of the cursor control method provided in any of the above embodiments.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
It will be understood by those skilled in the art that all or part of the processes of the method embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the method embodiments described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these are all within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (12)

1. A cursor control method, the method comprising:
in response to the target object being identified by the camera device, starting a cursor control program and displaying a cursor in the display device;
tracking the positioning point of the target object by using the cursor control program, and determining whether the target object moves or not based on the tracking result;
responding to the movement of the target object, acquiring the focus position of the camera device, the sensitivity information of the cursor control program and acquiring the pre-movement position and the post-movement position of the positioning point;
determining a first distance and a second distance according to the focus position, the position before movement and the position after movement, wherein the first distance represents the movement distance of the positioning point, and the second distance represents the distance between the positioning point and the camera device;
determining a cursor movement distance according to the sensitivity information, the first distance and the second distance, and controlling the cursor to move according to the cursor movement distance;
determining a cursor movement distance according to the sensitivity information, the first distance and the second distance, including:
determining the cursor movement distance based on the following formula:

L = S × D1 × D2

wherein, in the above formula, L indicates the distance the cursor has moved, S represents the sensitivity information, D1 represents the first distance, and D2 represents the second distance;

or, determining the cursor movement distance based on the following formula:

L = S × D1 × D2 / Ds

wherein, in the above formula, L indicates the distance the cursor has moved, S represents the sensitivity information, D1 represents the first distance, D2 represents the second distance, and Ds indicates the standard distance.
2. The method of claim 1, wherein said determining a first distance and a second distance from said focal position, said pre-movement position, and said post-movement position comprises:
determining the first distance according to the pre-movement position and the post-movement position;
determining the second distance from the focal position and the pre-movement position.
3. The method of claim 1 or 2, wherein said determining a cursor movement distance from said sensitivity information, said first distance and said second distance comprises:
acquiring a preset standard distance, wherein the preset standard distance is a preset value representing the distance between a first preset positioning point and the camera device;
and determining the cursor movement distance according to the sensitivity information, the preset standard distance, the first distance and the second distance.
4. The method of claim 1, wherein said displaying a cursor in a display device comprises:
displaying the cursor at an initial position of the display device;
the method further comprises the following steps:
and switching the display position of the cursor in the display device from the current position to the initial position in response to recognizing that the target object completes a preset action.
5. The method of claim 1, wherein the target object is a hand; the tracking the positioning point of the target object by using the cursor control program comprises:
acquiring a color image including the hand by the image pickup device;
detecting joint points of the hand in the color image by a hand joint point detection model;
extracting palm joint points related to the palm of the hand from the joint points of the hand;
and determining the positioning point of the hand according to the number of the palm joint points.
6. The method of claim 5, wherein said determining the positioning point of the hand according to the number of the palm joint points comprises:
in response to the number of the palm joint points being 1, taking the extracted palm joint point as the positioning point of the hand;
in response to the number of the palm joint points being 2, calculating the midpoint of the line connecting the two extracted palm joint points, and taking the midpoint as the positioning point of the hand;
and in response to the number of the palm joint points being greater than 2, calculating the center of gravity of the polygon formed by the extracted palm joint points, and taking the center of gravity of the polygon as the positioning point of the hand.
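The three cases of claim 6 can be sketched as follows. This is an assumed reading: the points are 2-D (x, y) tuples, and "center of gravity of the polygon" is taken here as the vertex average, which is the simplest interpretation (the patent may intend the area centroid instead):

```python
def hand_locating_point(palm_points):
    """Claim 6 sketch: pick the hand's positioning point from the
    extracted palm joint points, given as (x, y) tuples."""
    n = len(palm_points)
    if n == 1:
        return palm_points[0]              # single point: use it directly
    if n == 2:
        (x1, y1), (x2, y2) = palm_points   # two points: midpoint of the line
        return ((x1 + x2) / 2, (y1 + y2) / 2)
    # more than two points: vertex average of the polygon they form
    # (assumed reading of "center of gravity of the polygon")
    xs = [p[0] for p in palm_points]
    ys = [p[1] for p in palm_points]
    return (sum(xs) / n, sum(ys) / n)


print(hand_locating_point([(0, 0), (3, 0), (0, 3)]))
```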
7. The method of claim 1, wherein, prior to identifying the target object through the camera device, the method further comprises:
in response to a plurality of potential objects being identified through the camera device, determining a positioning point of each potential object and the distance between each positioning point and a second preset positioning point, and locking the potential object whose positioning point is closest to the second preset positioning point as the target object;
or, in response to a plurality of potential objects being identified through the camera device, locking the potential object that is the first to complete a preset start-control action as the target object.
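The first alternative of claim 7 (locking the object nearest a preset point) reduces to a nearest-neighbor selection; a minimal sketch, with hypothetical names and 2-D tuples for the positioning points:

```python
import math


def lock_target(locating_points, preset_point):
    """Claim 7 sketch (first alternative): among the positioning
    points of the potential objects, lock the one closest to the
    second preset positioning point."""
    return min(locating_points, key=lambda p: math.dist(p, preset_point))


print(lock_target([(10, 0), (1, 1), (5, 5)], (0, 0)))
```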
8. The method of claim 1 or 7, wherein, while tracking the positioning point of the target object with the cursor control program, the method further comprises:
in response to receiving an end-control instruction, stopping tracking the positioning point of the target object with the cursor control program, and re-determining a new target object through the camera device; the end-control instruction is triggered when the target object is determined to have left the field of view of the camera device, or when the target object is detected to have completed a preset end-control action.
9. The method of claim 1, wherein the method is applied to a vehicle-mounted system, the camera device is an in-vehicle camera, and the display device is a vehicle-mounted display screen.
10. A cursor control device, characterized in that the device comprises:
the initialization module is used for responding to the identification of the target object through the camera device, starting a cursor control program and displaying a cursor in the display device;
the tracking module is used for tracking a positioning point of the target object by utilizing the cursor control program and determining whether the target object moves or not based on a tracking result;
the information acquisition module is used for, in response to the target object moving, acquiring the focus position of the camera device and the sensitivity information of the cursor control program, and acquiring the pre-movement position and the post-movement position of the positioning point;
a distance determining module, configured to determine a first distance and a second distance according to the focal point position, the before-movement position, and the after-movement position, where the first distance represents a movement distance of the positioning point, and the second distance represents a distance between the positioning point and the image capturing apparatus;
the cursor control module is used for determining a cursor movement distance according to the sensitivity information, the first distance and the second distance and controlling the cursor to move according to the cursor movement distance;
when determining the cursor movement distance according to the sensitivity information, the first distance and the second distance, the cursor control module is configured to:
determining the cursor movement distance based on the following formula:

S = a × d1 × d2

wherein, in the above formula, S represents the cursor movement distance, a represents the sensitivity information, d1 represents the first distance, and d2 represents the second distance;

or, determining the cursor movement distance based on the following formula:

S = a × d1 × d2 / d0

wherein, in the above formula, S represents the cursor movement distance, a represents the sensitivity information, d1 represents the first distance, d2 represents the second distance, and d0 represents the standard distance.
11. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1 to 9 are implemented when the computer program is executed by the processor.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 9.
CN202211187655.5A 2022-09-28 2022-09-28 Cursor control method and device Active CN115291733B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211187655.5A CN115291733B (en) 2022-09-28 2022-09-28 Cursor control method and device


Publications (2)

Publication Number Publication Date
CN115291733A CN115291733A (en) 2022-11-04
CN115291733B true CN115291733B (en) 2022-12-27

Family

ID=83833319

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211187655.5A Active CN115291733B (en) 2022-09-28 2022-09-28 Cursor control method and device

Country Status (1)

Country Link
CN (1) CN115291733B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106547339A (en) * 2015-09-22 2017-03-29 百度在线网络技术(北京)有限公司 The control method and device of computer equipment
CN109710071A (en) * 2018-12-26 2019-05-03 青岛小鸟看看科技有限公司 A kind of screen control method and device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6501515B1 (en) * 1998-10-13 2002-12-31 Sony Corporation Remote control system
JP2011028366A (en) * 2009-07-22 2011-02-10 Sony Corp Operation control device and operation control method
KR20130072638A (en) * 2011-12-22 2013-07-02 엘지전자 주식회사 Method for operating an image display apparatus
KR20140052640A (en) * 2012-10-25 2014-05-07 삼성전자주식회사 Method for displaying a cursor on a display and system performing the same
US9063578B2 (en) * 2013-07-31 2015-06-23 Microsoft Technology Licensing, Llc Ergonomic physical interaction zone cursor mapping
CN105404384A (en) * 2015-11-02 2016-03-16 深圳奥比中光科技有限公司 Gesture operation method, method for positioning screen cursor by gesture, and gesture system
US10209785B2 (en) * 2016-02-02 2019-02-19 Microsoft Technology Licensing, Llc Volatility based cursor tethering
CN110928432B (en) * 2019-10-24 2023-06-23 中国人民解放军军事科学院国防科技创新研究院 Finger ring mouse, mouse control device and mouse control system
US11119570B1 (en) * 2020-10-29 2021-09-14 XRSpace CO., LTD. Method and system of modifying position of cursor
CN114860060A (en) * 2021-01-18 2022-08-05 华为技术有限公司 Method for hand mapping mouse pointer, electronic device and readable medium thereof




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant