KR20110033318A - Virtual mouse system using image recognition - Google Patents

Virtual mouse system using image recognition

Info

Publication number
KR20110033318A
Authority
KR
South Korea
Prior art keywords
mouse
fingertip
information
click
operator
Prior art date
Application number
KR1020090090743A
Other languages
Korean (ko)
Inventor
하영균
Original Assignee
주식회사 제노프릭스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 제노프릭스
Priority to KR1020090090743A
Publication of KR20110033318A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/002 Specific input/output arrangements not covered by G06F 3/01 - G06F 3/16
    • G06F 3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Abstract

The present invention relates to a virtual mouse system in which a camera forms a fingertip detection area in front of the operator, detects the operator's fingertip moving within that area, converts it into two-dimensional position information, and transmits it to a computer to control the position of the computer cursor. The system also analyzes the trajectory of the fingertip position to reset the mouse area within the fingertip detection area or to extract and transmit the operator's click information. Instead of moving a mouse placed on a table to move the cursor and pressing a mouse button to click, the operator simply moves a hand freely within the fingertip detection area to perform the same operations as with a mouse, so that input operations can be performed conveniently and intuitively without holding a mouse.

To this end, the present invention comprises: detection area forming means for forming a predetermined detection area in which the operator's moving fingertip is sensed; image acquisition means for acquiring the image sensed by the detection area forming means; image preprocessing means for preprocessing the image data acquired by the image acquisition means; fingertip position information extracting means for extracting fingertip position information from the data processed by the image preprocessing means; means for correcting and analyzing the fingertip trajectory information composed of the position information; mouse area resetting means for resetting the mouse area when the analysis result of the fingertip trajectory information is a mouse area setting trajectory; click command determining means for generating a click command when the analysis result is a click trajectory; cursor position determining means for generating a general cursor movement command when the analysis result is a general cursor movement; and final cursor position/click command transfer means for transmitting the final cursor position and click command to the computer.

Description

Virtual Mouse System Using Image Recognition

The present invention relates to a virtual mouse system using image recognition. More particularly, a camera forms a fingertip detection area in front of the operator and detects the operator's fingertip moving within that area; the fingertip position is converted into two-dimensional position information and transmitted to a computer to control the position of the computer cursor, and the trajectory of the fingertip position is analyzed to reset the mouse area within the fingertip detection area or to extract and transmit the operator's click information. Instead of moving a mouse on a table to move the cursor and pressing a mouse button to click, the operator freely moves a hand within the fingertip detection area to perform the same tasks as with a mouse, so that input can be performed conveniently and intuitively without holding a mouse. The present invention can replace the mouse and control any product that adopts a display and cursor, such as a computer or a TV. It can be used as an effective input device in applications such as games, information retrieval, kiosk control, and presentations.

In general, a person uses a so-called input device when using or manipulating a machine, and the present invention is a kind of man-machine interface. For example, when using a computer, a person uses an input device such as a keyboard or a mouse; when using a game machine, a joystick or an operating input means such as a handle or a button is used. A virtual reality system, which allows the user to feel as if present in a real world, uses an input device that detects the movement of the body or the hand.

Existing input devices such as keyboard direction keys, mice, joysticks, and touch screens are inconvenient in that the human body must contact them, are unintuitive because the spatial information to be input depends on the situation and on how the device is operated, and greatly limit the inputter's input behavior because of the mechanical constraints of the input apparatus.

In addition, body-attached devices used as input devices in virtual reality systems, such as data gloves, mice and joysticks modified to be attached to the body, various body-attached sensors for motion capture (e.g., optical markers, magnetic or magnetic-field sensors), and spatial position detection systems, must all be attached to the body. Attachment is very inconvenient depending on the device type, reattachment is troublesome whenever the operator changes, and sharing the same device among several people causes discomfort; motion capture equipment is also very expensive. Furthermore, in computer game input devices in which movements of the human body are connected to the game contents through electronic contacts (e.g., buttons, keypads, pedals), joysticks, or position-detecting optical sensors, the operator must press predefined electronic contacts that have nothing to do with the motion the game character is performing; the operation is unintuitive and unnatural because it must follow only the predefined manner regardless of the actual motion contents; electronic contacts and joysticks require bodily contact; and position-detecting optical sensors, although contactless, allow only specific predefined actions that do not correspond to the actual game contents. In many cases these input devices are also over-designed and expensive for their purpose, even though most operations are possible with simple mouse movement and click functions.

The present invention was created to solve the problems of the existing mouse and to provide a more convenient contactless virtual mouse. Its object is to provide a virtual mouse system in which a camera forms a fingertip detection area in front of the operator, the operator's fingertip moving within that area is detected and converted into two-dimensional position information, and the information is transmitted to the computer to control the position of the computer cursor, while the trajectory of the fingertip position is analyzed to reset the mouse area within the fingertip detection area or to extract and transmit the operator's click information.

The present invention has the effect that, instead of moving a mouse placed on a table to move the cursor and pressing a mouse button to click, the operator can perform the same operations as with a mouse by freely moving a hand within the fingertip detection area, so that input operations can be performed conveniently and intuitively without holding a mouse.

Still another object of the present invention is to provide a virtual mouse system using image recognition having the following effects.

- The inputter can input information in a context that is more realistic, intimate, and intuitive with respect to the information being input.

- The user can input spatial information directly connected to the user's will into the virtual reality system through natural and intuitive behavior, without any attached device.

- The operator can perform input actions in a form that is directly related to the operator's will and to the operation of the computer system.

- Naturalness, directness, ease of use, and ease of learning of manipulation are achieved.

To achieve the above object, a virtual mouse system using image recognition according to an embodiment of the present invention comprises:

detection area forming means for forming a predetermined detection area in which the operator's moving fingertip is sensed; image acquisition means for acquiring the image sensed by the detection area forming means; image preprocessing means for preprocessing the image data acquired by the image acquisition means; fingertip position information extracting means for extracting fingertip position information from the data processed by the image preprocessing means; means for correcting and analyzing the fingertip trajectory information composed of the position information; mouse area resetting means for resetting the mouse area when the analysis result of the fingertip trajectory information is a mouse area setting trajectory; click command determining means for generating a click command when the analysis result is a click trajectory; cursor position determining means for generating a general cursor movement command when the analysis result is a general cursor movement; and final cursor position/click command transfer means for transmitting the final cursor position and click command to the computer.
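The means enumerated above map naturally onto a software pipeline. The following is a minimal sketch, assuming a Python implementation (the patent does not prescribe any language); the class, method, and field names such as VirtualMousePipeline and FingertipSample are hypothetical illustrations of the claimed means, not names from the patent.

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np


@dataclass
class FingertipSample:
    """One fingertip observation: 2-D position in detection-area coordinates plus a timestamp."""
    x: float
    y: float
    t: float


class VirtualMousePipeline:
    """Hypothetical decomposition of the claimed means; names are illustrative only."""

    def acquire_image(self) -> np.ndarray:
        """Image acquisition means: grab one frame from the detection-area camera."""
        raise NotImplementedError

    def preprocess(self, frame: np.ndarray) -> np.ndarray:
        """Image preprocessing means: e.g. denoise and segment the hand region."""
        raise NotImplementedError

    def extract_fingertip(self, mask: np.ndarray) -> Optional[FingertipSample]:
        """Fingertip position information extracting means."""
        raise NotImplementedError

    def analyze_trajectory(self, sample: FingertipSample) -> str:
        """Trajectory correction/analysis means: classify the recent trajectory as
        'area_reset', 'click', or 'move'."""
        raise NotImplementedError

    def dispatch(self, kind: str, sample: FingertipSample) -> None:
        """Mouse area resetting / click command / cursor positioning / final transfer means."""
        raise NotImplementedError
```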

In a preferred embodiment of the present invention, the fingertip detection area forming means includes a camera that photographs the X-Y coordinate plane region forming the detection area.
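A minimal sketch of this acquisition step using OpenCV, assuming a webcam-style capture; the embodiment below merely prefers a digital-output camera, so the device index 0 and the 640x480 resolution are assumptions, not values from the patent.

```python
import cv2

# Hypothetical settings: device index 0 and a 640x480 detection plane are assumptions.
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

ok, frame = cap.read()   # one frame of the X-Y coordinate plane facing the operator
if not ok:
    raise RuntimeError("camera frame could not be acquired")
cap.release()
```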

Hereinafter, a preferred embodiment of a virtual mouse system using image recognition according to the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram of an embodiment of a virtual mouse system using image recognition according to the present invention, and FIG. 2 is a conceptual diagram of an application of the virtual mouse system and of the fingertip detection area formed by the camera. FIGS. 3, 4, and 5 show one embodiment of the shapes and determination conditions of the fingertip position and velocity trajectories, and FIG. 6 is a flowchart of the algorithm for performing the virtual mouse operation using image recognition.

First, referring to FIG. 1, a system according to an embodiment of the present invention includes a camera 10 that forms the fingertip detection area 1 and senses the fingertip 4 of the operator 3 within that area. The camera 10 may be anything capable of sensing the fingertip 4, but a CCD camera with a digital output is preferred. The camera 10 senses an X-Y coordinate plane area facing the operator 3; any photographing position and angle are acceptable as long as the movement range of the fingertip 4 of the operator 3 is captured without significant loss, because the extracted image information can be mapped onto the display screen by a mathematical transformation.

The system further includes an image acquisition unit 110 for acquiring the image sensed by the camera 10; an image preprocessor 120 for preprocessing the image data acquired by the image acquisition unit 110; a fingertip position information extraction unit 130 for extracting the fingertip position information from the data processed by the image preprocessor 120; a correction/analysis unit 140 for the fingertip trajectory information composed of the position information; a mouse area reset unit 150 for resetting the mouse area when the analysis result of the fingertip trajectory information is a mouse area setting trajectory; a click command determiner 160 for generating a click command when the analysis result is a click trajectory; a cursor position determiner 170 for generating a general cursor movement command when the analysis result is a general cursor movement; and a final cursor position/click command transmission unit 180 for transmitting the final cursor position and click command to the computer.

The virtual mouse task performing module 100 may be implemented as a single embedded hardware system and may include the camera 10, or it may be installed in the computer system 210 in the form of a system program or driver, in which case a camera connected to the computer may be used as the virtual mouse camera. The units 110 to 180 are preferably integrated into one independent virtual mouse task performing module 100. The final cursor position/click command is transmitted to the computer system 210 operated by the operator 3, and the position of the cursor 310 displayed on the display 220 connected to the computer system 210 is changed, or a click command is executed, accordingly.
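The patent does not specify how the image preprocessor 120 and the fingertip position information extraction unit 130 are realized. The sketch below shows one common approach (skin-color thresholding in HSV space followed by taking the topmost point of the largest contour) purely as an illustrative assumption; the function name, the color range, and the area threshold are all hypothetical.

```python
import cv2
import numpy as np


def extract_fingertip(frame_bgr: np.ndarray):
    """Return an (x, y) fingertip estimate in image coordinates, or None.

    Assumed approach: segment skin-like pixels, take the largest contour as the
    hand, and treat its topmost point as the fingertip. This is an illustrative
    choice, not the method claimed in the patent.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-tone range in HSV; would need tuning for lighting and skin tone.
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 1000:        # ignore small blobs (threshold is arbitrary)
        return None
    x, y = hand[hand[:, :, 1].argmin()][0]  # topmost contour point
    return int(x), int(y)
```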

The fingertip detection area 1 is treated as a rectangular area and is directly mapped onto the screen area of the display 220. However, when the trajectory drawn by the operator's finger movement is a predefined mouse area reset request command, the mouse area 2 is reset accordingly by the mouse area reset unit 150, and the mouse area 2 is then directly mapped onto the screen area of the display 220.
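The direct mapping from the (possibly reset) mouse area rectangle to the display screen is a simple linear rescaling. A sketch under that assumption follows; the function and variable names are illustrative.

```python
def map_to_screen(fx, fy, mouse_area, screen_size):
    """Linearly map a fingertip position inside the mouse area to screen pixels.

    mouse_area: (left, top, width, height) in detection-area coordinates.
    screen_size: (screen_width, screen_height) of the display.
    """
    left, top, width, height = mouse_area
    screen_w, screen_h = screen_size
    # Clamp to the mouse area so the cursor never leaves the screen.
    u = min(max((fx - left) / width, 0.0), 1.0)
    v = min(max((fy - top) / height, 0.0), 1.0)
    return int(u * (screen_w - 1)), int(v * (screen_h - 1))


# Example: a fingertip at (200, 150) inside a 320x240 mouse area, 1920x1080 screen.
print(map_to_screen(200, 150, (0, 0, 320, 240), (1920, 1080)))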

Referring to FIG. 2, the fingertip detection area 1 formed by the camera 10 is located between the display 220 and the operator 3, parallel to the screen of the display 220. The operator 3 looks at the screen of the display 220 and moves the fingertip 4 within the mouse area 2 to control the cursor position. When the cursor is at the desired position and the operator 3 draws the predefined trajectory for a click command, the computer system 210 executes the click command.

Referring to FIGS. 3, 4, and 5, the trajectories drawn by the operator 3 moving the fingertip 4 fall into three categories: the mouse area setting trajectory, the general cursor movement trajectory, and the click trajectory. As an example, the determination is performed by associating the shape of the fingertip position trace, the fingertip velocity (V), the length of a position trace segment element (L), and the angle between position trace segment elements (A) with the trajectory determination parameters. V_A, V_C, and V_Z are the mouse area setting reference speed, the click reference speed, and the override reference speed, respectively. V_AB denotes the velocity of the move from point A to point B and is a two-dimensional vector whose x-axis and y-axis components are (V_AB)_X and (V_AB)_Y. The distance between point A and point B is L_AB, and the angle between segment BA and segment AD is A_BAD. A click operation is performed by the operator 3 quickly moving the fingertip down and back up; the time taken to move down, the hold time at the bottom, and the time taken to move back up are denoted T_D, T_S, and T_U, respectively. These variables are computed in real time as trajectory elements are generated, and the corresponding trajectory command is executed when the determination conditions shown in FIGS. 3, 4, and 5 are satisfied. The parameters may be initialized to default values or reset, by learning, to values specific to the operator. The parameter definitions and trajectory determination conditions shown in FIGS. 3, 4, and 5 are one embodiment and may be set in various ways depending on the processing capability and speed of the virtual mouse task performing module 100 and on the required cursor resolution and movement speed.
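The trajectory parameters named above can be computed directly from consecutive fingertip samples. The sketch below shows the segment velocity V_AB, the segment length L_AB, the angle A_BAD, and a click test based on the down/hold/up durations T_D, T_S, T_U; all threshold values are placeholders, since the patent leaves them to be tuned per operator and per system, and the function names are hypothetical.

```python
import math


def segment_velocity(a, b):
    """V_AB: 2-D velocity vector of the move from sample a to sample b.
    Each sample is (x, y, t) with t in seconds."""
    dt = b[2] - a[2]
    return ((b[0] - a[0]) / dt, (b[1] - a[1]) / dt)


def segment_length(a, b):
    """L_AB: Euclidean distance between samples a and b."""
    return math.hypot(b[0] - a[0], b[1] - a[1])


def angle_bad(b, a, d):
    """A_BAD: angle at vertex A between segments AB and AD, in degrees."""
    v1 = (b[0] - a[0], b[1] - a[1])
    v2 = (d[0] - a[0], d[1] - a[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))


def is_click(t_down, t_hold, t_up, v_down_y, v_up_y, V_C=300.0):
    """Illustrative click test: a fast downward stroke followed by a fast upward
    stroke, each completed within short T_D / T_S / T_U windows. The thresholds
    (seconds, pixels per second) are assumed values, not those of the patent."""
    fast_enough = v_down_y > V_C and -v_up_y > V_C   # image y grows downward
    quick_enough = t_down < 0.25 and t_hold < 0.3 and t_up < 0.25
    return fast_enough and quick_enough
```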

Referring to FIG. 6, an image containing the fingertip 4 that has entered the fingertip detection area 1 is captured by the camera 10 and converted into data by the image acquisition unit 110; this data is preprocessed by the image preprocessor 120, and the preprocessed data is processed and analyzed by the fingertip position information extraction unit 130 to extract the fingertip position information. The extracted position information is passed to the fingertip trajectory correction/analysis unit 140, where the trajectory data is corrected and analyzed against the trajectory determination database. If the current trajectory is a mouse area setting trajectory, the mouse area reset unit 150 resets the mouse area; if it is a click command trajectory, the click command determiner 160 generates a click command; and if it is a general cursor movement trajectory, the cursor position determiner 170 determines the cursor position. Finally, the final cursor position/click command transmission unit 180 transmits the final cursor position and click command to the computer system 210, and the process repeats.
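Tying the steps of FIG. 6 together as a loop might look like the sketch below, which assumes the helper functions from the previous snippets; the classification callable and the print statements standing in for the final transmission unit 180 are hypothetical placeholders.

```python
import time

import cv2


def run_virtual_mouse(classify_trajectory, extract_fingertip, map_to_screen,
                      mouse_area=(0, 0, 640, 480), screen_size=(1920, 1080)):
    """Main loop following FIG. 6. `classify_trajectory` is a hypothetical callable
    returning ('area_reset' | 'click' | 'move', payload) for the recent samples."""
    cap = cv2.VideoCapture(0)
    history = []                      # recent (x, y, t) fingertip samples
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            tip = extract_fingertip(frame)                 # units 110-130
            if tip is None:
                continue
            history.append((tip[0], tip[1], time.time()))
            history = history[-64:]                        # keep a short trajectory window

            kind, payload = classify_trajectory(history)   # unit 140
            if kind == "area_reset":                       # unit 150
                mouse_area = payload
            elif kind == "click":                          # unit 160
                print("click command")                     # stands in for unit 180
            else:                                          # unit 170: ordinary movement
                x, y = map_to_screen(tip[0], tip[1], mouse_area, screen_size)
                print("cursor ->", x, y)                   # stands in for unit 180
    finally:
        cap.release()
```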

Although preferred embodiments of the present invention have been described in detail above, those of ordinary skill in the art will appreciate that various modifications and changes can be made without departing from the spirit and scope of the invention as defined in the appended claims. Such modifications of the embodiments therefore do not depart from the scope of the present invention.

As described above, according to the present invention, a camera forms a fingertip detection area in front of the operator, the operator's fingertip moving within that area is detected and converted into two-dimensional position information, and the information is transmitted to the computer to control the cursor position; the trajectory of the fingertip position is analyzed to reset the mouse area within the fingertip detection area or to extract and transmit the operator's click information. Instead of holding and moving a mouse on a table to move the cursor and pressing a mouse button to click, the operator freely moves a hand within the fingertip detection area to perform the same tasks as with a mouse, so that input can be performed conveniently and intuitively.

Meanwhile, the present invention can replace an existing mouse and be used as an input device not only for computers but also for home appliances with a display, such as a TV. It offers the advantage of intuitive and convenient use in various applications such as device setting and control, games, information search, kiosk control, and presentations, and has many other effects, including the following:

1. The inputter can enter information in a more realistic, intimate, and intuitive context.

2. The user can input spatial information directly connected to the inputter's will into the virtual reality system through natural and intuitive behavior, without attaching or contacting any device.

3. An operator can perform input actions in a form that is directly related to the operator's will and the operation of the computer system.

4. Naturalness, directness, convenience, and ease of learning of manipulation are achieved.

FIG. 1 is a block diagram of an embodiment of a virtual mouse system using image recognition according to the present invention.

FIG. 2 is a conceptual diagram of an application of the virtual mouse system using image recognition and of the fingertip detection area formed by the camera according to the present invention.

FIGS. 3, 4, and 5 show an embodiment of the shapes and determination conditions of the fingertip position and velocity trajectories according to the present invention.

FIG. 6 is a flowchart of the algorithm for performing the virtual mouse operation using image recognition according to the present invention.

<Description of the symbols for the main parts of the drawings>

1: fingertip detection area

2: mouse area

3: operator

4: fingertips

10: camera

100: virtual mouse task performing module

110: image acquisition unit

120: image preprocessing unit

130: fingertip location information extraction unit

140: fingertip trajectory correction/analysis unit

150: mouse area reset unit

160: click command determination unit

170: cursor positioning unit

180: final cursor position / click command transmission unit

210: computer system

220: display

310: Cursor of computer system

Claims (3)

1. A virtual mouse system using image recognition, comprising: detection area forming means for forming a predetermined detection area in which the operator's moving fingertip is sensed; image acquisition means for acquiring the image sensed by the detection area forming means; image preprocessing means for preprocessing the image data acquired by the image acquisition means; fingertip position information extracting means for extracting fingertip position information from the data processed by the image preprocessing means; means for correcting and analyzing the fingertip trajectory information composed of the position information; click command determining means for generating a click command when the analysis result of the fingertip trajectory information is a click trajectory; cursor position determining means for generating a general cursor movement command when the analysis result is a general cursor movement; and final cursor position/click command transfer means for transferring the generated cursor position and click command to a computer, wherein the final cursor position/click command is transmitted to a computer system to perform cursor position control and click commands of the computer system.

2. The virtual mouse system according to claim 1, further comprising mouse area resetting means for resetting the mouse area when the analysis result of the fingertip trajectory information is a mouse area setting trajectory, wherein the size and position of the mouse area within the fingertip detection area are reset by performing the mouse area reset operation.

3. A virtual mouse system using image recognition in which each means other than the detection area forming means is integrated into the computer system in the form of a mouse driver program, and the fingertip detection area is formed using a camera connected to the computer system.
KR1020090090743A 2009-09-25 2009-09-25 Virtual mouse system using image recognition KR20110033318A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090090743A KR20110033318A (en) 2009-09-25 2009-09-25 Virtual mouse system using image recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020090090743A KR20110033318A (en) 2009-09-25 2009-09-25 Virtual mouse system using image recognition

Publications (1)

Publication Number Publication Date
KR20110033318A (en) 2011-03-31

Family

ID=43937728

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090090743A KR20110033318A (en) 2009-09-25 2009-09-25 Virtual mouse system using image recognition

Country Status (1)

Country Link
KR (1) KR20110033318A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013129730A1 (en) * 2012-02-28 2013-09-06 주식회사 케이쓰리아이 Server-based system for extracting original object by using user fingertip recognition on augmented screen
KR101330810B1 (en) * 2012-02-24 2013-11-18 주식회사 팬택 User device for recognizing gesture and method thereof
US9529443B2 (en) 2012-05-02 2016-12-27 Macron Co., Ltd Remote controller for motion recognition

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101330810B1 (en) * 2012-02-24 2013-11-18 주식회사 팬택 User device for recognizing gesture and method thereof
WO2013129730A1 (en) * 2012-02-28 2013-09-06 주식회사 케이쓰리아이 Server-based system for extracting original object by using user fingertip recognition on augmented screen
US9529443B2 (en) 2012-05-02 2016-12-27 Macron Co., Ltd Remote controller for motion recognition

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application