KR20140139726A - Apparatus for recognizing gesture in Human Machine Interface - Google Patents

Apparatus for recognizing gesture in Human Machine Interface

Info

Publication number
KR20140139726A
Authority
KR
South Korea
Prior art keywords
arm
worker
cursor
screen
display
Prior art date
Application number
KR20130060100A
Other languages
Korean (ko)
Inventor
김경래
Original Assignee
엘에스산전 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘에스산전 주식회사
Priority to KR20130060100A
Publication of KR20140139726A publication Critical patent/KR20140139726A/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention recognizes a worker's gestures so that commands can be executed in an environment where the worker cannot directly touch and control equipment. It relates to an apparatus and method for recognizing gestures in a human machine interface (HMI), in which a central processing unit (CPU), in response to signals output from a gesture recognition module, detects the positions of the worker's elbow and wrist joints and matches a cursor to the hand of one of the worker's arms, moves the cursor according to the movement of that hand, and executes the function corresponding to the area where the cursor is located when the worker's other arm is determined to move within a set angle range.

Description

[0001] Apparatus for recognizing gesture in Human Machine Interface

The present invention relates to an apparatus and method for recognizing an operator's gestures in a human machine interface (HMI) and controlling equipment according to the recognized gestures.

In general, various kinds of equipment in industrial settings, such as automation machines, are provided with a control panel, and the control panel carries push-type or rotary switches that the operator can manipulate.

To set the operation of such equipment, the operator must move to where the control panel is installed and operate the push-type or rotary switches provided on it.

When the control panel is installed in a place with a chemical hazard, or in an area isolated by high or low temperature, the operator must put on separate protective clothing before moving to the panel and operating it.

In addition, if the operator's hands are contaminated with oil or other chemicals, the operator must first clean his or her hands and then move to where the control panel is installed.

It therefore takes a long time for the operator to reach the control panel, and consequently a long time before equipment such as an automation machine can perform the operation the operator requires.

An object of the present invention is to provide an apparatus and method for recognizing gestures in an HMI that recognize an operator's motion with a motion recognition module, move a cursor according to the recognized motion, execute the function of the region where the cursor is located, increase or decrease a set value, enlarge or reduce the screen, and switch between screens.

It is to be understood that both the foregoing general description and the following detailed description of the present invention are exemplary and explanatory, and are intended to provide further explanation of the invention as claimed.

According to the apparatus and method for recognizing gestures in an HMI of the present invention, a motion recognition module identifies the positions of the worker's shoulder, elbow, and wrist joints and matches a cursor to the hand of one of the worker's arms.

An operation command is then determined from the movement of the shoulder, elbow, and wrist joints: the cursor is moved according to the determined command, the function of the area where the cursor is positioned is executed, a set value is increased or decreased, the screen is enlarged or reduced, and the screen is switched.

To this end, the apparatus includes a motion recognition module for recognizing the operator's movements; a display unit for displaying the current operation state; and a control unit that identifies the positions of the shoulder, elbow, and wrist joints, matches the cursor to the hand of one of the operator's arms, moves the cursor according to the movement of that hand, and executes the function of the area where the cursor is positioned when the operator's other arm is determined to move within a set angle range.
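The control-unit behavior just described can be sketched as a small event loop. This is a hypothetical illustration, not the patent's implementation: the joint names (`r_hand`, `l_elbow_angle`), the 90-degree "execute" angle, and the 20-degree tolerance are all assumptions.

```python
class GestureController:
    """Minimal sketch of the described control loop: the cursor follows the
    hand of one arm, and bending the other arm within a set angle range
    executes the function under the cursor."""

    EXECUTE_ANGLE = 90.0   # target elbow angle of the raised arm (assumed)
    TOLERANCE = 20.0       # accepted deviation in degrees (assumed)

    def __init__(self, screen):
        self.screen = screen        # object exposing move_cursor()/execute_at()
        self.cursor = (0, 0)

    def on_frame(self, joints):
        """Process one frame of tracked joints.

        joints: dict with 'r_hand' (cursor-hand position in pixels) and
        'l_elbow_angle' (bend of the other arm's elbow, in degrees).
        """
        # The cursor is matched to, and follows, the hand of one arm.
        self.cursor = joints["r_hand"]
        self.screen.move_cursor(self.cursor)
        # Raising the other arm within the set angle range triggers the
        # function of the area under the cursor.
        if abs(joints["l_elbow_angle"] - self.EXECUTE_ANGLE) <= self.TOLERANCE:
            self.screen.execute_at(self.cursor)
```

In a real system the `screen` object would be backed by the display unit; here it is only a stand-in interface.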

When the cursor is positioned in an area for adjusting a predetermined set value and the movement of the other arm is an execution command, the controller displays a rotary switch and the current set value on the display unit. While the cursor is positioned on the switch, the set value is increased or decreased according to the rotation of the hand of the one arm, and the set value is applied when the other arm is again determined to move within the set angle range.

The control unit may enter a screen enlargement or reduction mode when the operator touches both palms together, and may enlarge or reduce the screen displayed on the display unit around the position of the cursor according to the distance between the two palms.
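The palm-distance zoom can be sketched as a ratio against the gap at the moment the mode was entered. The clamping bounds below are assumptions added for illustration; the patent does not specify limits.

```python
def zoom_factor(palm_gap: float, start_gap: float,
                lo: float = 0.25, hi: float = 4.0) -> float:
    """Scale factor relative to the moment the palms touched: widening the
    gap enlarges the view, narrowing it shrinks it.

    palm_gap / start_gap are distances between the two palms (any unit,
    as long as it is consistent); lo/hi are assumed zoom bounds.
    """
    if start_gap <= 0:
        raise ValueError("start_gap must be positive")
    return min(max(palm_gap / start_gap, lo), hi)
```

The display would then be scaled by this factor around the cursor position, which serves as the zoom center.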

In addition, when the operator extends one arm forward and moves it to the right or downward, or extends the other arm forward and moves it to the left or upward, past the edge of the display screen, the controller may switch the current screen displayed on the display unit to another screen.

Specifically, the controller may display the next screen when the operator extends one arm forward, moves it to the right or downward, and moves it past the edge of the display screen of the display unit.

Likewise, the control unit may display the previous screen when the operator extends the other arm forward, moves it to the left or upward, and moves it past the edge of the display screen of the display unit.
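The swipe-based screen switching can be sketched as a threshold test on the arm's displacement while extended forward. The axis convention (x to the right, y downward, matching screen coordinates) and the threshold value are illustrative assumptions.

```python
def switch_screen(dx: float, dy: float, threshold: float = 0.3):
    """Decide a screen switch from the displacement of the extended arm:
    right/down -> 'next', left/up -> 'previous', otherwise None.

    dx, dy: displacement in metres (x right, y down, screen convention);
    threshold: assumed sensitivity before a swipe is recognized.
    """
    if dx >= threshold or dy >= threshold:
        return "next"        # arm crossed the right or bottom edge
    if dx <= -threshold or dy <= -threshold:
        return "previous"    # arm crossed the left or top edge
    return None              # movement too small to count as a swipe
```

A fuller implementation would also check that the arm actually crossed the edge of the display area, as the description requires, rather than only the movement direction.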

According to the apparatus and method for recognizing gestures in an HMI of the present invention, the positions of the operator's shoulder, elbow, and wrist joints are recognized, the cursor is moved accordingly, the function of the area where the cursor is located is executed, a set value is increased or decreased, the screen is enlarged or reduced, and the screen is switched.

Therefore, regardless of whether the control panel is installed in a place with a chemical hazard or in an area isolated by high or low temperature, the operator can safely operate equipment such as an automation machine and change its settings without approaching the panel.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, in which like reference numerals refer to like elements throughout.
Fig. 1 is a block diagram showing the configuration of an embodiment of the gesture recognition apparatus for an HMI of the present invention.
Fig. 2 is a signal flow diagram illustrating the operation of an embodiment of the central processing unit according to the gesture recognition method of the HMI of the present invention.

The following detailed description is merely illustrative and presents examples of the present invention; the principles and concepts of the invention are provided so as to explain it most usefully and readily. Accordingly, no more detailed structure is described than is necessary for a basic understanding of the invention, and the drawings illustrate, for those skilled in the art, various forms in which the invention can be practiced.

Fig. 1 is a block diagram showing the configuration of an embodiment of the gesture recognition apparatus for an HMI of the present invention. Here, reference numeral 110 denotes a motion recognition module. The motion recognition module 110 detects a worker positioned in its recognition area, identifies the worker's shoulder, elbow, and wrist joints, and recognizes gestures from the movement of those joints.

Reference numeral 120 denotes a central processing unit. The central processing unit 120 determines the positions of the worker's shoulder, elbow, and wrist joints from the image data captured by the three-dimensional camera of the motion recognition module 110 and matches the cursor to the hand of one arm. Then, according to the movement recognized by the motion recognition module 110, it moves the cursor, executes the function of the area where the cursor is located, increases or decreases a set value, enlarges or reduces the screen, and switches the screen.

Reference numeral 130 denotes a display unit. The display unit 130 displays the current operating state of the HMI under the control of the central processing unit 120.

Fig. 2 is a signal flow diagram illustrating the operation of an embodiment of the central processing unit according to the gesture recognition method of the HMI of the present invention. Referring to Fig. 2, the central processing unit 120 analyzes the image data of the worker located in the recognition area of the motion recognition module 110 and recognizes the worker's shoulder, elbow, and wrist joints (S200). Based on the recognized joints, it matches the cursor to the position of the hand of one arm, for example the right hand, and displays the cursor on the display unit 130 (S202).
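The cursor matching of steps S200 and S202 can be sketched as a linear mapping from the tracked hand position to pixel coordinates. The screen resolution and the camera working ranges below are assumptions for illustration, not values from the patent.

```python
SCREEN_W, SCREEN_H = 1920, 1080   # display resolution (assumed)
# Assumed reachable range of the hand in metres, relative to the 3D camera.
X_RANGE = (-0.5, 0.5)             # leftmost .. rightmost hand position
Y_RANGE = (0.0, 1.0)              # lowest .. highest hand position

def hand_to_cursor(hand_x: float, hand_y: float) -> tuple:
    """Linearly map a hand position (metres) to pixel coordinates,
    clamping to the screen edges."""
    def norm(v, lo, hi):
        # Normalize to [0, 1], clamped so the cursor stays on-screen.
        return min(max((v - lo) / (hi - lo), 0.0), 1.0)
    px = round(norm(hand_x, *X_RANGE) * (SCREEN_W - 1))
    # Camera y grows upward while screen y grows downward, so invert.
    py = round((1.0 - norm(hand_y, *Y_RANGE)) * (SCREEN_H - 1))
    return px, py
```

A hand held at the center of the working volume then lands at the center of the screen, and positions outside the volume pin the cursor to the nearest edge.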

In this state, the central processing unit 120 reads the signal of the motion recognition module 110 and determines whether the worker moves the hand of one arm, raises the other arm, places both palms together, or moves the hand of either arm to the area outside the display screen of the display unit 130 (S204, S206, S208, S210).

When it is determined in step S204 that the worker moves the hand of one arm, the central processing unit 120 moves the cursor accordingly and controls the display unit 130 to display the cursor at the new position (S212).

When it is determined in step S206 that the worker has raised the other arm halfway about the elbow, the central processing unit 120 executes the function of the area where the cursor is currently positioned (S214).

That is, when the worker raises the other arm halfway with respect to the elbow, the central processing unit 120 executes the function of the area where the cursor is currently located.
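The "half-raised" arm can be detected as the angle at the elbow between the upper arm and the forearm. The 90-degree target and 20-degree tolerance below are assumed values; the patent only speaks of a set angle range.

```python
import math

def elbow_angle_deg(shoulder, elbow, wrist):
    """Angle at the elbow joint (degrees) between the upper arm and the
    forearm, computed from 2D or 3D joint coordinates."""
    a = [s - e for s, e in zip(shoulder, elbow)]   # elbow -> shoulder
    b = [w - e for w, e in zip(wrist, elbow)]      # elbow -> wrist
    dot = sum(x * y for x, y in zip(a, b))
    # Clamp the cosine to guard against floating-point drift.
    cos = max(-1.0, min(1.0, dot / (math.hypot(*a) * math.hypot(*b))))
    return math.degrees(math.acos(cos))

def is_half_raised(shoulder, elbow, wrist, target=90.0, tol=20.0):
    """True when the forearm is bent to roughly a right angle at the elbow,
    within a set tolerance: the assumed 'execute' gesture."""
    return abs(elbow_angle_deg(shoulder, elbow, wrist) - target) <= tol
```

A fully extended arm (angle near 180 degrees) is rejected, so ordinary cursor movement with a straight arm does not accidentally trigger execution.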

When it is determined in step S208 that the worker places both palms together, the central processing unit 120 displays the currently set operation mode on the display unit 130 (S216) and reduces or enlarges the screen around the area where the cursor is positioned according to the distance between the worker's hands (S218).

That is, as the worker widens the gap between the palms, the central processing unit 120 enlarges the screen around the area where the cursor is positioned, and as the worker narrows the gap, it reduces the screen around that area.

When it is determined in step S210 that the worker moves one arm or the other arm to the outside of the screen, the central processing unit 120 switches the screen (S220).

That is, when the worker extends the other arm forward and moves it past the edge of the display screen in the left or upward direction, the central processing unit 120 switches the display unit 130 to the previous screen. When the worker extends one arm forward and moves it past the edge of the display screen in the right or downward direction, the central processing unit 120 switches the display unit 130 to the next screen.

Meanwhile, when the worker moves the hand of one arm (S204) so that the cursor is placed in an area for adjusting a predetermined set value (S212) and then half-raises the other arm (S206), the central processing unit 120 displays a rotary switch on the screen of the display unit 130.

In this state, as the worker moves the hand of the one arm (S204), the central processing unit 120 moves the cursor accordingly (S212) and determines whether the cursor is at the display position of the rotary switch (S222).

When the cursor is at the display position of the rotary switch, the central processing unit 120 displays the current set value (S224). That is, the central processing unit 120 moves the cursor according to the movement of the hand of the one arm (S212) and displays on the display unit 130 the value currently set for the operation of the area where the cursor is located (S214).

Here, the current set value can be displayed, for example, at the central portion of the rotary switch shown on the screen.

While displaying the set value, the central processing unit 120 determines whether the worker rotates the hand of the one arm (S226). When it determines that the hand is rotated, it changes the set value on the display unit 130 according to the rotation direction of the hand (S228). That is, when the worker rotates the hand counterclockwise, the rotary switch is rotated counterclockwise and the set value is decreased while being displayed; when the worker rotates the hand clockwise, the rotary switch is rotated clockwise and the set value is increased while being displayed.
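The rotary-switch adjustment can be sketched as a signed angle change per frame. The angle source (for example the wrist-to-knuckle direction), the gain of 10 units per radian, and the sign convention are illustrative assumptions.

```python
import math

def rotation_step(prev_angle: float, cur_angle: float,
                  step_per_rad: float = 10.0) -> float:
    """Signed change to a set value from hand rotation: clockwise increases
    the value, counterclockwise decreases it.

    Angles are in radians in the usual math convention (counterclockwise
    positive); step_per_rad is an assumed gain.
    """
    delta = cur_angle - prev_angle
    # Wrap to (-pi, pi] so rotation through the +/-pi boundary stays smooth.
    delta = math.atan2(math.sin(delta), math.cos(delta))
    # Clockwise rotation decreases the math angle, so invert the sign to
    # make clockwise rotation yield a positive (increasing) step.
    return -delta * step_per_rad
```

The controller would accumulate these steps into the displayed set value until the half-raise gesture applies it.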

The central processing unit 120 then determines whether the worker half-raises the other arm while the set value is being changed according to the rotation of the hand (S230).

If, as a result of the determination, the worker half-raises the other arm, the central processing unit 120 applies the currently adjusted set value (S232), and the equipment equipped with the HMI operates according to the adjusted value.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that the invention is not limited to the disclosed embodiments and that various changes in form and detail may be made therein.

Therefore, the scope of the present invention should not be limited to the above-described embodiments, but should be determined by the appended claims and their equivalents.

110: motion recognition module 120: central processing unit
130: display unit

Claims (6)

An apparatus for recognizing gestures in a human machine interface, comprising:
a motion recognition module for recognizing a movement of an operator;
a display unit for displaying a current operation state; and
a control unit which detects, through the motion recognition module, the positions of the operator's shoulder, elbow, and wrist joints, matches a cursor to the hand of one of the operator's arms, moves the cursor according to the movement of the hand of the one arm, and executes a function of an area in which the cursor is positioned when a movement of the operator's other arm is determined to be within a range of a set angle.
The apparatus according to claim 1,
wherein the control unit displays a rotary switch and the current set value on the display unit when the movement of the other arm is an execution command while the cursor is located in an area for adjusting a predetermined set value, increases or decreases the set value according to the rotation of the hand of the one arm when the cursor is positioned on the rotary switch, and applies the set value when the movement of the other arm is again determined to be within the range of the set angle.
The apparatus according to claim 1,
wherein the control unit enters a screen enlargement or reduction mode when the operator touches both palms together, and enlarges or reduces the screen displayed on the display unit around the position of the cursor according to the distance between the two palms.
The apparatus according to claim 1,
wherein the control unit switches the current screen displayed on the display unit to another screen when the operator extends one arm forward and moves it to the right or downward, or extends the other arm forward and moves it to the left or upward, past the edge of the display screen of the display unit.
The apparatus according to claim 4,
wherein the control unit displays the next screen when the operator extends one arm forward, moves it to the right or downward, and moves it past the edge of the display screen of the display unit.
The apparatus according to claim 4,
wherein the control unit displays the previous screen when the operator extends the other arm forward, moves it to the left or upward, and moves it past the edge of the display screen of the display unit.
KR20130060100A 2013-05-28 2013-05-28 Apparatus for recognizing gesture in Human Machine Interface KR20140139726A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR20130060100A KR20140139726A (en) 2013-05-28 2013-05-28 Apparatus for recognizing gesture in Human Machine Interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20130060100A KR20140139726A (en) 2013-05-28 2013-05-28 Apparatus for recognizing gesture in Human Machine Interface

Publications (1)

Publication Number Publication Date
KR20140139726A true KR20140139726A (en) 2014-12-08

Family

ID=52457871

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130060100A KR20140139726A (en) 2013-05-28 2013-05-28 Apparatus for recognizing gesture in Human Machine Interface

Country Status (1)

Country Link
KR (1) KR20140139726A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107272878A (en) * 2017-02-24 2017-10-20 广州幻境科技有限公司 A kind of recognition methods for being applied to complicated gesture and device
WO2021244650A1 (en) * 2020-06-05 2021-12-09 北京字节跳动网络技术有限公司 Control method and device, terminal and storage medium
EP4149116A4 (en) * 2020-06-05 2023-11-08 Beijing Bytedance Network Technology Co., Ltd. Control method and device, terminal and storage medium

Similar Documents

Publication Publication Date Title
US20160229052A1 (en) Robot operation apparatus, robot system, and robot operation program
TWI539362B (en) Interface switching method and electric device using the same
US11007651B2 (en) Haptic controller with touch-sensitive control knob
JP6497021B2 (en) Robot operation device, robot system, and robot operation program
TWI502474B (en) Method for operating user interface and electronic device thereof
WO2009128064A2 (en) Vision based pointing device emulation
CN103257811A (en) Picture display system and method based on touch screen
TW201501019A (en) Electronic device and judgment method for multi-window touch control instructions
US20150363037A1 (en) Control method of touch panel
CN107450820B (en) Interface control method and mobile terminal
US9262069B2 (en) Control device having an input display for detecting two touch points
WO2015091638A1 (en) Method for providing user commands to an electronic processor and related processor program and electronic circuit.
KR20140139726A (en) Apparatus for recognizing gesture in Human Machine Interface
KR101503159B1 (en) Method of controlling touch-screen detecting eyesight
US10102310B2 (en) Precise object manipulation system and method
JP6379902B2 (en) Robot operation device, robot system, and robot operation program
TWI505137B (en) Input device
JP5812582B2 (en) Information processing apparatus and information processing method
CN108345392A (en) A kind of multi-screen intelligently switches the method and system of touch control keyboard
CN104536597B (en) A kind of laptop realizes the method and device of multi-point touch
JP2016175174A (en) Robot operation device, and robot operation program
TWI603226B (en) Gesture recongnition method for motion sensing detector
WO2022196222A1 (en) Detection processing device, detection processing method, and information processing system
JP2024048077A (en) Information processing device, information processing method, robot system, article manufacturing method using robot system, program, and recording medium
CN105278661A (en) Gesture control method with mouse tracking control and five-direction-key waving control

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination