KR101482701B1 - Designing apparatus for gesture based interaction and designing system for gesture based interaction - Google Patents

Designing apparatus for gesture based interaction and designing system for gesture based interaction Download PDF

Info

Publication number
KR101482701B1
Authority
KR
South Korea
Prior art keywords
gesture
code
markup language
point
hurdle
Prior art date
Application number
KR1020130088513A
Other languages
Korean (ko)
Inventor
남택진
김주환
Original Assignee
한국과학기술원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국과학기술원 filed Critical 한국과학기술원
Priority to KR1020130088513A priority Critical patent/KR101482701B1/en
Application granted granted Critical
Publication of KR101482701B1 publication Critical patent/KR101482701B1/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F15/00 Digital computers in general; Data processing equipment in general
    • G06F15/16 Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The gesture interaction design apparatus 100 includes a display device 110 for displaying a gesture markup language as graphic objects; interface devices 120 and 130 through which a user inputs or modifies the data constituting the gesture markup language; a memory device 150 for storing a conversion code that converts the gesture markup language into a specific gesture code; and a computer operation unit 140 that converts the gesture markup language input by the user into a gesture code using the conversion code.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a gesture interaction design apparatus and a gesture interaction design system.

The techniques described below relate to devices for designing gesture interactions and systems for designing gesture interactions.

Recently, various interface methods have been developed, centering on mobile devices. Technologies such as touch screen input, gesture input using a terminal's acceleration sensor, and command input using a camera have been commercialized.

Interaction designers studying various gesture inputs must implement and test new gestures. For example, to test a gesture interaction idea in which raising a smartphone to the ear answers an incoming call, the designer must develop and program a mobile application directly, or attach a gesture sensor such as an acceleration sensor to a physical mock-up of the phone, connect it through an interface device such as a Phidget, and program the behavior in a development environment such as Adobe Flash.

Programming-by-demonstration is the best-known method for easily authoring, testing, and implementing complex sensor patterns. Tools such as Exemplar and Gesture Coder record the sensor pattern while the designer performs the interaction and then detect similar patterns.

B. Hartmann, L. Abdulla, M. Mittal, and S. R. Klemmer, "Authoring sensor-based interactions by demonstration with direct manipulation and pattern recognition," Proc. CHI 2007, pp. 145-154.
H. Lu and Y. Li, "Gesture Coder: a tool for programming multi-touch gestures by demonstration," Proc. CHI 2012, New York, NY, USA, pp. 2875-2884.

The prior art suffers from the difficulty that interaction designers must write program code themselves. A harder problem still is that programming gesture-pattern recognition from the raw acceleration values coming from a sensor requires considerable time spent observing sensor value patterns and attempting to implement a recognition algorithm. As a result, interaction designers could not devote their time to creative ideas and could not test ideas immediately.

The demonstration-programming method also makes the internal algorithm difficult to understand and modify, because it reports only whether or not a pattern was detected.

The technique described below is intended to provide an environment in which interaction designers can intuitively design and test gestures using a gesture markup language composed of simple graphical elements.

The solutions to the technical problems described below are not limited to those mentioned above, and other solutions not mentioned can be clearly understood by those skilled in the art from the following description.

A gesture interaction design apparatus for solving the above problems includes a display device for displaying a gesture markup language as graphic objects; an interface device through which a user inputs or modifies the data constituting the gesture markup language; a memory device for storing a conversion code that converts the gesture markup language into a specific gesture code; and a computer arithmetic unit that converts the gesture markup language input by the user into a gesture code using the conversion code.

The interface device may be at least one input device among a mouse, a keyboard, a keypad, a touch screen, a light pen, a graphics tablet, a joystick, a trackball, and an image scanner, or a device that transfers data the user has previously stored on a separate storage medium.

The gesture markup language includes a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.

The start object comprises a point or a specific mark; the sequence object comprises at least one of an arrow, a straight line, or a curve; and the hurdle object comprises at least one of a distance object indicating a first point and a second point between which the specific point lies, or a direction object indicating the direction in which the gesture passes the specific point.

The coordinates represented by the start object, the sequence object, or the hurdle object include at least two of the x-axis, y-axis, and z-axis coordinate values collected using a three-axis acceleration sensor, coordinate values on a touch screen, coordinate values indicated by a pointing device, or coordinate values of a target object acquired by a camera.

The gesture code may be a script that runs in a separate commercial graphics program.

In another aspect of the present invention, a gesture interaction design system includes a user terminal comprising a display device that displays the gesture markup language, which is made of graphic objects, and an interface device for inputting or modifying the data constituting the gesture markup language; a gesture server that stores a conversion code for converting the gesture markup language into a specific gesture code and converts the gesture markup language input by the user into the gesture code using the conversion code; and a network device that carries data between the user terminal and the gesture server.

The gesture markup language includes a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.

The gesture server may convert the gesture code into a script run by a separate commercial graphics program and transmit the script to the user terminal, or it may input the script into the commercial program running on the gesture server and transmit the result output from the program to the user terminal.

The technique described below uses the gesture markup language, composed of simple graphic objects, to generate the gesture language (script) used in a prototype development tool. This allows interaction designers to test their gesture interactions easily and to prototype them in a short time.

The effects of the techniques described below are not limited to those mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the following description.

FIG. 1 is an example of a block diagram showing the configuration of a gesture interaction design apparatus.
FIG. 2 shows an example of a tool for inputting a gesture markup language and a gesture markup language displayed on a display device.
FIG. 3(a) shows an example of a hurdle object, FIG. 3(b) shows an example of a sequence object, and FIG. 3(c) shows an example of a start object.
FIG. 4 is an example illustrating the meanings expressed by sequence objects and hurdle objects.
FIG. 5 shows examples of mobile terminal gestures and the gesture markup language corresponding to each gesture.
FIG. 6 is an example of a block diagram illustrating the configuration of a gesture interaction design system.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

The terms first, second, A, B, and the like may be used to describe various components, but the components are not limited by these terms; the terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items or any one of a plurality of related listed items.

As used herein, singular expressions include plural expressions unless the context clearly dictates otherwise. The terms "comprises" and "comprising" specify the presence of stated features, integers, steps, operations, components, parts, or combinations thereof, and do not preclude the presence or addition of one or more other features, integers, steps, operations, components, parts, or combinations thereof.

Before describing the drawings in detail, it should be clarified that the division of components in this specification is merely a division by the main function of each component. That is, two or more of the components described below may be combined into a single component, or a single component may be divided into two or more components with more subdivided functions. Each of the components described below may additionally perform some or all of the functions of other components besides its own main functions, and some of the main functions of a component may instead be carried out entirely by another component. Accordingly, the components of the gesture interaction design apparatus 100 and the gesture interaction design system 500, and their structure, may differ from the corresponding drawings within the scope of achieving the object of the invention.

Also, in performing a method or an operating method, the processes constituting the method may take place in an order different from the stated order unless a specific order is clearly described in the context. That is, the processes may be performed in the order described, substantially concurrently, or in the reverse order.

The present invention facilitates the development of gesture interfaces for the variety of devices on which interaction designers use gestures as interactions.

The various devices include mobile terminals such as smartphones that use the motion of the device itself as an interface; devices that use touch input as an interface (e.g., computer devices with a touch screen); devices that use the user's motion, captured by a camera, as an interface (e.g., console game machines); and devices that measure motion with a three-axis acceleration sensor (e.g., smartphones, robots). In short, the various devices are those that can use the motion of the device itself, the user's motion, or touch movement on a touch screen as an interface. For convenience of explanation, a mobile terminal such as a smartphone is assumed below.

Mobile terminals on the market today use various gestures as interfaces. For example, turning the mobile terminal face down mutes the sound output, and a call can be answered by performing a touch drag in a certain direction. Interaction designers who design the many interactions that make mobile terminals convenient and useful want to test their ideas in a very short time.

Even if the interaction designer does not understand how a gesture maps to the output values produced by the various sensors built into the mobile terminal, the designer can define the gesture interaction using the gesture markup language, which consists of intuitive graphic objects, and the resulting gesture code can be used to test the gesture interaction on a computer device.

Before describing the present invention in full, some terms are defined. A gesture refers to a specific motion or manipulation of a target device input by a user, and a gesture interaction refers to a gesture that can be used as an interface on a mobile terminal or the like. A gesture code is code that can be input to particular software or a computer device to simulate a specific gesture interaction. The gesture markup language is the language used to generate gesture code; it consists not of intricate code but of intuitive graphical elements.
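To make these definitions concrete, the sketch below models the three kinds of markup objects as data. It is an illustration added to this description, written in TypeScript for its closeness to the ActionScript gesture-code target discussed later; the patent defines the objects only as graphic elements (see FIG. 3), so every type and field name here is hypothetical.

    // A minimal sketch of the three markup objects as data. All type and
    // field names are hypothetical; the patent defines the objects only
    // as graphic elements.
    type Point = { x: number; y: number; z?: number };

    // Start object: the start point of a gesture (a dot or a specific mark).
    interface StartObject {
      kind: "start";
      at: Point;
    }

    // Sequence object: the movement path of the gesture (arrow, line, or curve).
    interface SequenceObject {
      kind: "sequence";
      shape: "arrow" | "line" | "curve";
      to: Point;
    }

    // Hurdle object: a segment the gesture must cross, delimited by two
    // distance points and optionally constrained by a crossing direction.
    interface HurdleObject {
      kind: "hurdle";
      p1: Point;
      p2: Point;
      direction?: Point; // unit vector, present when a direction object is drawn
    }

    type MarkupObject = StartObject | SequenceObject | HurdleObject;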

Hereinafter, the gesture interaction design apparatus 100 and the gesture interaction design system 500 will be described in detail with reference to the drawings.

FIG. 1 is an example of a block diagram showing the configuration of the gesture interaction design apparatus 100. The gesture interaction design apparatus 100 includes a display device 110 for displaying the gesture markup language as graphic objects; interface devices 120 and 130 through which a user inputs or modifies the data constituting the gesture markup language; a memory device 150 for storing a conversion code that converts the gesture markup language into a specific gesture code; and a computer operation unit 140 that converts the gesture markup language input by the user into a gesture code using the conversion code.

The interface devices 120 and 130 may include at least one of a mouse, a keyboard, a keypad, a touch screen, a light pen, a graphics tablet, a joystick, a trackball, and an image scanner. In FIG. 1, this is shown as the first interface device 120. The first interface device 120 covers the various input devices capable of entering data into a computer device or a mobile terminal.

Alternatively, the gesture markup language may be transferred to the gesture interaction design apparatus 100 without the user entering it in real time. For example, a user may store the gesture markup language in advance on a data storage medium 50 such as a USB drive and transfer it as a file to the computer operation unit or the memory device. This form is shown in FIG. 1 as the second interface device 130. In this case, the second interface device 130 is a device that transfers data from the physically connected storage medium 50; it may be a device that reads and transmits data from media such as a CD or DVD, or a data bus device that transfers data from a device such as a USB drive.

Further, the second interface device 130 may be a communication device (data receiver) that receives a gesture markup language stored on a separate storage medium 50 over a local wireless network (Bluetooth, Wi-Fi, etc.) or over the Internet.

The display device 110 is a device that outputs the gesture markup language input through the interface devices 120 and 130. It will be appreciated that various devices capable of outputting graphic objects may serve as the display device 110.

The memory device 150 stores a conversion code for converting the gesture markup language into a gesture code. The gesture code means code capable of simulating the gesture of the mobile terminal in a separate commercial program or the like. The conversion code must therefore generate different gesture codes depending on the type of simulation tool. The separate commercial program may be, for example, Adobe Flash, one of the platforms designers often use. In Adobe Flash, the gesture code is ActionScript programming code.
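As an illustration of what such a conversion code could do, the sketch below (reusing the hypothetical MarkupObject type from the earlier sketch) walks the markup objects and emits gesture code as ActionScript-flavored text. The emitted GestureRecognizer API is invented for illustration; the patent states only that the output for Adobe Flash is ActionScript code.

    // Sketch of a conversion code: walk the markup objects and emit
    // gesture code as text. The GestureRecognizer API is hypothetical.
    function toGestureCode(objects: MarkupObject[]): string {
      const lines: string[] = ["var g:GestureRecognizer = new GestureRecognizer();"];
      for (const o of objects) {
        if (o.kind === "start") {
          lines.push(`g.setStart(${o.at.x}, ${o.at.y});`);
        } else if (o.kind === "sequence") {
          lines.push(`g.addPathPoint(${o.to.x}, ${o.to.y});`);
        } else {
          // Hurdle: the generated code must later check that the live
          // sensor trace crosses the segment p1-p2.
          lines.push(`g.addCrossing(${o.p1.x}, ${o.p1.y}, ${o.p2.x}, ${o.p2.y});`);
        }
      }
      return lines.join("\n");
    }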

The computer operation unit 140 functions as an emulator that uses the conversion code to turn the intuitive gesture markup language into code that can be simulated on a real computer. The computer operation unit 140 refers to a processor, such as a central processing unit, used in a computing device.

The computer operation unit 140 may also execute the gesture code directly and output the gesture interaction represented by the gesture code to the display device 110. For example, if the gesture code targets Adobe Flash, the gesture code can be executed immediately, provided the Flash program is installed in the memory device 150. The gesture interaction design apparatus 100 may thus be an apparatus with embedded software capable of executing gesture code.

FIG. 2 shows an example of a tool for inputting the gesture markup language and the gesture markup language displayed on the display device. FIG. 2 is an example of a graphical tool for drawing the gesture markup language; on the screen shown in FIG. 2, various icons for creating (drawing) the gesture markup language and icons for file input/output are displayed on the upper bar.

The gesture markup language includes a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.

Referring to the gesture markup language shown in FIG. 2, there is a start object indicating the start point of the gesture, and sequence object 1 indicating the direction of the gesture's movement from the start object. Sequence object 1 passes through hurdle object 1, which represents a specific point the gesture must pass; the gesture then moves along sequence object 2 and passes through hurdle object 2; finally, it moves along sequence object 3 and ends by passing through hurdle object 3. FIG. 2 thus shows a gesture markup language whose graphic objects let a designer intuitively understand how the gesture is structured.
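Using the hypothetical schema sketched earlier, the FIG. 2 gesture might be encoded as the following data; the coordinates are illustrative only, since the drawing does not specify them.

    // The FIG. 2 gesture as data: a start object, then three sequence
    // segments, each closed off by the hurdle it must cross.
    const fig2Gesture: MarkupObject[] = [
      { kind: "start", at: { x: 0, y: 0 } },
      { kind: "sequence", shape: "arrow", to: { x: 12, y: 0 } },
      { kind: "hurdle", p1: { x: 10, y: -5 }, p2: { x: 10, y: 5 } },  // hurdle 1
      { kind: "sequence", shape: "arrow", to: { x: 12, y: 12 } },
      { kind: "hurdle", p1: { x: 5, y: 10 }, p2: { x: 15, y: 10 } },  // hurdle 2
      { kind: "sequence", shape: "curve", to: { x: -2, y: 12 } },
      { kind: "hurdle", p1: { x: 0, y: 7 }, p2: { x: 0, y: 17 } },    // hurdle 3
    ];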

Each object constituting the gesture markup language will now be described in detail. FIG. 3(a) shows an example of a hurdle object, FIG. 3(b) shows an example of a sequence object, and FIG. 3(c) shows an example of a start object.

Referring to FIG. 3(a), the hurdle object includes at least one of a distance object, which indicates a first point and a second point between which the specific point lies, or a direction object, which indicates the direction in which the gesture passes the specific point. Distance object 1 and distance object 2 delimit the region (between the first point and the second point) through which the gesture must pass. The direction object indicates the direction of the gesture within the region through which it passes.
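The patent does not disclose the recognition algorithm behind a hurdle object, but its geometric meaning suggests a standard two-dimensional segment-crossing test. The self-contained sketch below is one way such a check could be computed; the direction test via a dot product is likewise an assumption.

    // One way a hurdle check could be computed: did the path segment from
    // a to b cross the hurdle segment p1-p2, and, if a direction object is
    // given, in the required direction? (Collinear touches are ignored.)
    type Pt = { x: number; y: number };

    // Cross product of (p - o) and (q - o); its sign tells which side of
    // the line through o and p the point q lies on.
    function cross(o: Pt, p: Pt, q: Pt): number {
      return (p.x - o.x) * (q.y - o.y) - (p.y - o.y) * (q.x - o.x);
    }

    function crossesHurdle(a: Pt, b: Pt, p1: Pt, p2: Pt, dir?: Pt): boolean {
      // The segments intersect iff each segment's endpoints straddle the other.
      const straddles =
        cross(a, b, p1) * cross(a, b, p2) < 0 &&
        cross(p1, p2, a) * cross(p1, p2, b) < 0;
      if (!straddles) return false;
      if (!dir) return true;
      // Direction object: the movement a -> b must agree with dir.
      return (b.x - a.x) * dir.x + (b.y - a.y) * dir.y > 0;
    }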

Referring to FIG. 3(b), the sequence object shown is composed of an arrow and a straight line. Since a gesture does not proceed only in a straight line, however, a sequence object may be composed of an arrow, a straight line, or a curve.

FIG. 3(c) shows several examples of start objects, such as a circle, a rectangle, or a star. Since the start object merely marks the start point of the gesture, various kinds of graphic elements may be used.

FIG. 3 illustrates representative hurdle, sequence, and start objects. However, once the function of each object is understood, it is obvious that the objects may be defined with graphic elements other than those shown in FIG. 3.

FIG. 4 is an example illustrating the meanings expressed by sequence objects and hurdle objects. FIG. 4(a) shows a gesture that starts at the start object and arrives at hurdle object (a). FIG. 4(b) shows a gesture that starts at the start object and arrives at hurdle object (b) by way of hurdle object (a). FIG. 4(c) shows a gesture that starts at the start object and arrives at hurdle object (a) or hurdle object (b). FIG. 4(d) shows a gesture that either starts at the start object and arrives at hurdle object (b) via hurdle object (a), or starts at the start object and arrives at hurdle object (c). FIG. 4(e) shows a gesture that starts at the start object, passes through hurdle object (a), and then arrives at hurdle object (b) or hurdle object (c). FIG. 4(f) shows a gesture that starts at the start object and rotates through hurdle object (a), hurdle object (b), hurdle object (c), and hurdle object (d).
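One way to read FIG. 4 is as stages passed in order, where each stage may offer alternative hurdles. The sketch below evaluates that reading against a recorded list of hurdle crossings; the staging model itself is an assumption, since the patent conveys these semantics only through the drawings.

    // A gesture definition as stages passed in order; each stage lists
    // the hurdles that satisfy it (alternatives).
    type Stage = string[]; // names of hurdles, any one of which satisfies the stage

    function matches(crossed: string[], stages: Stage[]): boolean {
      let next = 0; // index of the next stage to satisfy
      for (const h of crossed) {
        if (next < stages.length && stages[next].includes(h)) next++;
      }
      return next === stages.length; // all stages passed, in order
    }

    // FIG. 4(e): pass hurdle (a), then arrive at hurdle (b) or hurdle (c).
    console.log(matches(["a", "b"], [["a"], ["b", "c"]])); // true
    console.log(matches(["b"], [["a"], ["b", "c"]]));      // false: (a) skipped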

Meanwhile, the coordinates represented by the start object, the sequence object, or the hurdle object may be at least two of the x-axis, y-axis, and z-axis coordinate values collected using a three-axis acceleration sensor, coordinate values on a touch screen, coordinate values indicated by a pointing device, or coordinate values of a target object acquired by a camera.

The start object corresponds to the coordinates of the point where the gesture starts, and the gesture moves in two- or three-dimensional space from that point. As described above, the present invention can be applied to various gesture interactions, such as a touch path on a touch screen, a path drawn with a device such as a mouse, or the path of an object captured by a camera.

FIG. 5 shows examples of mobile terminal gestures and the gesture markup language corresponding to each gesture. The right side of FIG. 5(a) shows a flip-over gesture that turns over a mobile terminal lying flat on a desk, and the left side of FIG. 5(a) shows an example of the corresponding gesture markup language. As the terminal rotates about its bottom edge, its position changes by 180 degrees in the x-z plane of three-dimensional space.

The right side of FIG. 5(b) shows a gesture in which the user puts the mobile terminal into a trouser pocket, and the left side of FIG. 5(b) shows an example of the corresponding gesture markup language. FIG. 5(b) shows the case of arriving at hurdle object 1 or hurdle object 2 by way of hurdle object 0 in the x-y plane.

The right side of FIG. 5(c) shows a tap gesture in which the user bounces the mobile terminal, and the left side of FIG. 5(c) shows an example of the corresponding gesture markup language. FIG. 5(c) shows the case of arriving at hurdle object 1 after passing through hurdle object 0, which is located near the start object in the x-z plane. Hurdle object 1 is closer to the start object than hurdle object 0, which captures the recoil: when the user bounces the mobile terminal against the palm, it moves back in the direction opposite to its initial movement.

As shown in FIG. 5, a gesture of the mobile terminal is a movement of specific coordinate values in three-dimensional coordinates and can therefore be defined using the gesture markup language.

FIG. 6 is an example of a block diagram illustrating the configuration of the gesture interaction design system 500. The gesture interaction design system 500 includes a user terminal 510, which comprises a display device that displays the gesture markup language as graphic objects and an interface device for inputting or modifying the data constituting the gesture markup language; a gesture server 530, which stores a conversion code for converting the gesture markup language into a specific gesture code and converts the gesture markup language input by the user into the gesture code using the conversion code; and a network device 520, which transfers data between the user terminal and the gesture server.

In the gesture interaction design system 500, the conversion from gesture markup language to gesture code is performed by the gesture server 530 connected to the network, while the user terminal 510 serves as a terminal. The gesture interaction design system 500 is thus meant to include cloud computing systems.

The network device 520 may be a general wired Internet network or a mobile communication network provided by a mobile communication service provider. It will be appreciated that the network device 520 may be implemented by a variety of devices available to those of ordinary skill in the art.

The block diagram at the bottom of FIG. 6 shows the configuration of the user terminal 510 in detail. The main components of the user terminal 510 are a display device and an input device, together with the means to input and transmit data. Accordingly, the user terminal 510 includes an interface device 511 through which a user can input the gesture markup language; a processor 512 that processes the data; a display device 513 that displays an editor tool for producing the gesture markup language; a cache memory 514 that temporarily stores data while the gesture markup language is being produced; and a communication module 515 that transmits the gesture markup language input by the user to the gesture server 530 and receives the output of the gesture server 530.

The gesture markup language and related elements are the same as those described for the gesture interaction design apparatus 100 above. Accordingly, the gesture markup language includes a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.

The gesture server 530 may convert the gesture code into a script run by a separate commercial graphics program (e.g., Adobe Flash) and transmit the script to the user terminal 510. In this case, if the commercial program is installed on the user terminal 510, the user can run it to simulate the gesture interaction on the display device 513.

Alternatively, the gesture server 530 may input the script directly into a commercial program running on the gesture server 530 and transmit only the result (graphic data) output from the program to the user terminal 510.
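From the user terminal's side, these two server modes could look like the following sketch. The endpoint URLs and payload shapes are hypothetical; the patent specifies only that the markup language goes to the server and that either a script or the rendered result comes back.

    // The two server modes as seen from the user terminal. Endpoint URLs
    // and payload shapes are hypothetical.
    async function convertOnServer(markup: object): Promise<string> {
      // Mode 1: the server returns the gesture-code script, which the
      // terminal feeds to its locally installed commercial program.
      const res = await fetch("https://gesture-server.example/convert", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(markup),
      });
      return res.text(); // the generated script
    }

    async function simulateOnServer(markup: object): Promise<Blob> {
      // Mode 2: the server runs the script itself and returns only the
      // rendered graphic result.
      const res = await fetch("https://gesture-server.example/simulate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(markup),
      });
      return res.blob(); // rendered output (graphic data)
    }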

The gesture server 530 includes a communication module 531 that receives the gesture markup language (data) transmitted from the user terminal 510; a memory device 532 that stores the conversion code for converting the gesture markup language into gesture code; and a processor 533 that converts the gesture markup language into gesture code using the conversion code.

Further, the gesture server 530 may analyze the motion of a target device (e.g., a mobile terminal) performing a specific gesture and display, on the display device 513, an example of gesture markup language expressing that gesture. For example, a mobile terminal connected to the user terminal 510 may transmit the values output by its sensors during a gesture to the gesture server 530 through the user terminal 510, or the mobile terminal may be connected directly to the server and transmit the relevant sensor values to the gesture server 530. In this embodiment, the system analyzes the gesture of the mobile terminal and presents the corresponding gesture markup language as an example, so that the interaction designer can intuitively understand the markup for a given gesture. Interaction designers can then define the gestures they want to test by referring to the gesture markup language the system has exemplified.

It is to be understood that both the foregoing general description and the following detailed description of the present invention are exemplary and explanatory and are intended to provide further explanation of the invention as claimed. It will be understood that variations and specific embodiments which may occur to those skilled in the art are included within the scope of the present invention.

50: Storage medium
100: gesture interaction designing device 110: display device
120: first interface device 130: second interface device
140: Computer operation unit 150: Memory device
500: gesture interaction design system 510: user terminal
511: Interface device 512: Processor
513: Display device 514: Cache memory
515: Communication module 520: Network device
530: Gesture server

Claims (13)

A display device for displaying a gesture markup language which is a graphic object;
An interface device for allowing the user to input or modify data constituting the gesture markup language;
A memory device for storing a conversion code for converting the gesture markup language into a specific gesture code; and
A computer arithmetic unit for converting the gesture markup language input by the user into the gesture code using the conversion code,
Wherein the gesture markup language includes a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.
The apparatus according to claim 1,
Wherein the interface device is at least one of:
An input device including a mouse, a keyboard, a keypad, a touch screen, a light pen, a graphics tablet, a joystick, a trackball, and an image scanner; or
A device for transferring data previously stored on a separate storage medium by the user.
delete
The apparatus according to claim 1,
Wherein the start object comprises a point or a specific mark,
Wherein the sequence object comprises at least one of an arrow, a straight line, or a curve, and
Wherein the hurdle object includes at least one of a distance object indicating a first point and a second point between which the specific point is located, or a direction object indicating a direction in which the gesture passes the specific point.
The apparatus according to claim 1,
Wherein the coordinates represented by the start object, the sequence object, or the hurdle object are
At least two of the x-axis, y-axis, and z-axis coordinate values collected using a three-axis acceleration sensor, coordinate values on a touch screen, coordinate values indicated by a pointing device, or coordinate values of a target object acquired by a camera.
The apparatus according to claim 1,
Wherein the computer arithmetic unit executes the gesture code to output the gesture represented by the gesture code to the display device.
The apparatus according to claim 1,
Wherein the gesture code is a script run by a separate commercial graphics program.
A user terminal including a display device for displaying a gesture markup language, which is a graphic object, and an interface device for allowing a user to input or modify data constituting the gesture markup language;
A gesture server for storing a conversion code for converting the gesture markup language into a specific gesture code and for converting the gesture markup language input by the user into the gesture code using the conversion code; and
A network device serving as a path for transmitting data between the user terminal and the gesture server,
Wherein the gesture markup language comprises a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.
delete
The system according to claim 8,
Wherein the start object comprises a point or a specific mark,
Wherein the sequence object comprises at least one of an arrow, a straight line, or a curve, and
Wherein the hurdle object includes at least one of a distance object indicating a first point and a second point between which the specific point is located, or a direction object indicating a direction in which the gesture passes the specific point.
The system according to claim 8,
Wherein the coordinates represented by the start object, the sequence object, or the hurdle object are
At least two of the x-axis, y-axis, and z-axis coordinate values collected using a three-axis acceleration sensor, coordinate values on a touch screen, coordinate values indicated by a pointing device, or coordinate values of a target object acquired by a camera.
The system according to claim 8,
Wherein the gesture server executes the gesture code to output the gesture represented by the gesture code to the display device.
The system according to claim 8,
Wherein the gesture server
Converts the gesture code into a script run by a separate commercial graphics program and transmits the script to the user terminal, or
Inputs the script into the commercial program running on the gesture server and transmits the result output from the program to the user terminal.
KR1020130088513A 2013-07-26 2013-07-26 Designing apparatus for gesture based interaction and designing system for gesture based interaction KR101482701B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130088513A KR101482701B1 (en) 2013-07-26 2013-07-26 Designing apparatus for gesture based interaction and designing system for gesture based interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130088513A KR101482701B1 (en) 2013-07-26 2013-07-26 Designing apparatus for gesture based interaction and designing system for gesture based interaction

Publications (1)

Publication Number Publication Date
KR101482701B1 true KR101482701B1 (en) 2015-01-15

Family

ID=52589027

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130088513A KR101482701B1 (en) 2013-07-26 2013-07-26 Designing apparatus for gesture based interaction and designing system for gesture based interaction

Country Status (1)

Country Link
KR (1) KR101482701B1 (en)

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Design of a Gesture Interface Model for GUI Application Control (January 2013) *
Paper (February 2010) *
Paper (January 2013) *
Development of a Design Programming Toolkit for Interactive Product Prototyping (January 2010) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160100527A (en) 2015-02-16 2016-08-24 한국과학기술원 Designing method for gesture interface and designing apparatus for gesture interface
KR101680084B1 (en) * 2015-02-16 2016-11-28 한국과학기술원 Designing method for gesture interface and designing apparatus for gesture interface


Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20180102

Year of fee payment: 4

LAPS Lapse due to unpaid annual fee