KR101482701B1 - Designing apparatus for gesture based interaction and designing system for gesture based interaction - Google Patents
- Publication number
- KR101482701B1 · KR1020130088513A · KR20130088513A
- Authority
- KR
- South Korea
- Prior art keywords
- gesture
- code
- markup language
- point
- hurdle
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F15/00—Digital computers in general; Data processing equipment in general
- G06F15/16—Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The gesture interaction design apparatus 100 includes a display device 110 that displays the gesture markup language as graphic objects; interface devices 120 and 130 through which a user inputs or modifies the data constituting the gesture markup language; a memory device 150 that stores conversion code for translating the gesture markup language into a specific gesture code; and a computer operation unit 140 that converts the gesture markup language entered by the user into gesture code using the conversion code.
Description
The techniques described below relate to devices for designing gesture interactions and systems for designing gesture interactions.
Recently, various interface methods have been developed, particularly for mobile devices. Technologies such as touch-screen input, gesture input using a terminal's acceleration sensor, and command input using a camera have been commercialized.
Interaction designers who study new gesture inputs must implement and test each gesture. For example, to test the idea of answering a call by raising the smartphone to the ear, a designer must either develop and program a mobile application directly, or attach a sensor such as an accelerometer to a design mock-up of a phone, connect it through an interface device such as a Phidget, and program the behavior in a development environment such as Adobe Flash.
Programming-by-Demonstration is the best-known method for easily authoring, testing, and implementing complex sensor patterns. Tools such as Exemplar and Gesture Coder record the sensor pattern while the designer performs the interaction, and then detect similar patterns.
In the prior art, the interaction designer must write program code directly. A harder problem is that recognizing a gesture pattern from raw acceleration values requires considerable time to observe the sensor-value patterns and implement a recognition algorithm. As a result, interaction designers could not devote their time to creative ideas and could not test ideas immediately.
The programming-by-demonstration approach only reports whether a pattern was detected or not, which makes its internal algorithm difficult to understand.
The technique described below is intended to provide an environment in which interaction designers can intuitively design and test gestures using a gesture markup language composed of simple graphical elements.
The solutions to the technical problems described below are not limited to those mentioned above, and other solutions not mentioned can be clearly understood by those skilled in the art from the following description.
A gesture interaction designing apparatus for solving the above problems includes a display device for displaying the gesture markup language as graphic objects, an interface device through which a user inputs or modifies the data constituting the gesture markup language, a memory device for storing conversion code that translates the gesture markup language into a specific gesture code, and a computer arithmetic unit for converting the gesture markup language input by the user into gesture code using the conversion code.
The interface device may be at least one of an input device such as a mouse, keyboard, keypad, touch screen, light pen, graphics tablet, joystick, trackball, or image scanner, or a device that transfers data previously stored by the user on a separate storage medium.
The gesture markup language includes a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.
The start object consists of a point or a specific mark; the sequence object consists of at least one of an arrow, a straight line, or a curve; and the hurdle object includes at least one of a separation object indicating the first and second points between which a specific point lies, or a direction object indicating the direction in which the gesture passes the specific point.
The coordinates represented by a start object, sequence object, or hurdle object include at least two of the x-, y-, and z-axis coordinate values collected by a three-axis acceleration sensor, coordinate values on a touch screen, coordinate values indicated by a pointing device, or coordinate values of a target object captured by a camera.
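The three markup primitives just described can be modeled as plain data types. The following Python sketch is illustrative only — the patent does not specify a data format, and all names and fields here are assumptions:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # 2-D coordinate; a z value could be added for 3-axis sensor data


@dataclass
class StartObject:
    """Marks the point where the gesture begins (drawn as a dot or mark)."""
    position: Point


@dataclass
class SequenceObject:
    """The movement path between objects (drawn as an arrow, line, or curve)."""
    waypoints: List[Point]


@dataclass
class HurdleObject:
    """A gate defined by two end points; the gesture must cross between them,
    optionally in a given direction."""
    first_point: Point
    second_point: Point
    direction: Optional[Point] = None  # unit vector; None means any crossing direction is accepted
```

A markup drawing then becomes one `StartObject`, a `SequenceObject` path, and zero or more `HurdleObject` gates.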
The gesture code may be a script that runs in a separate commercial graphics program.
In another aspect of the present invention, a gesture interaction design system includes a user terminal with a display device for displaying the gesture markup language as graphic objects and an interface device for inputting or modifying the data constituting the gesture markup language; a gesture server that stores conversion code for translating the gesture markup language into a specific gesture code and converts the gesture markup language input by the user into gesture code using that conversion code; and a network device that provides the data path between the user terminal and the gesture server.
The gesture markup language includes a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.
The gesture server may convert the gesture code into a script driven by a separate commercial graphics program and transmit the script to the user terminal, or input the script to the commercial program running on the gesture server and transmit the program's output to the user terminal.
The technique described below uses the gesture markup language, a set of simple graphic objects, to generate the gesture language (script) used by a prototype development tool. This allows interaction designers to easily test gesture interactions and build prototypes in a short time.
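The markup-to-script conversion described above can be sketched as a simple serializer. The line-oriented record format and the function name below are my own illustrative assumptions, not the patent's actual gesture-code syntax:

```python
def markup_to_script(markup: dict) -> str:
    """Serialize a parsed gesture-markup description into a line-oriented
    script that a prototyping tool could interpret (hypothetical format)."""
    x, y = markup["start"]                  # start object: the gesture's origin point
    lines = [f"START {x} {y}"]
    for (x1, y1), (x2, y2) in markup.get("hurdles", []):
        # each hurdle is a gate segment the gesture path must cross
        lines.append(f"HURDLE {x1} {y1} {x2} {y2}")
    return "\n".join(lines)
```

For example, `markup_to_script({"start": (0, 0), "hurdles": [((1, -1), (1, 1))]})` yields the two-line script `START 0 0` / `HURDLE 1 -1 1 1`.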
The effects of the techniques described below are not limited to those mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the following description.
FIG. 1 is an example block diagram of the configuration of a gesture interaction designing apparatus.
FIG. 2 shows an example of a tool for inputting the gesture markup language and the gesture markup language displayed on a display device.
FIG. 3(a) shows an example of a hurdle object, FIG. 3(b) an example of a sequence object, and FIG. 3(c) an example of a start object.
FIG. 4 illustrates the meanings expressed by sequence objects and hurdle objects.
FIG. 5 shows examples of mobile-terminal gestures and the gesture markup language corresponding to each.
FIG. 6 is an example block diagram of the configuration of a gesture interaction design system.
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
The terms first, second, A, B, etc. may be used to describe various components, but the components are not limited by these terms; they are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items, or any single item among them.
As used herein, singular expressions should be understood to include plural forms unless the context clearly dictates otherwise. The term "comprises" specifies the presence of stated features, integers, steps, operations, components, parts, or combinations thereof, and does not preclude the presence or addition of one or more other features, integers, steps, operations, components, parts, or combinations thereof.
Before describing the drawings in detail, it should be clarified that the division into constituent parts in this specification is merely a division by the main function of each part. That is, two or more of the constituent parts described below may be combined into one, or one constituent part may be divided into two or more according to more finely subdivided functions. In addition, each constituent unit described below may perform some or all of the functions of other units in addition to its own main functions, and some of its main functions may instead be carried out exclusively by another constituent unit.
Also, in performing a method or an operation method, each of the processes constituting the method may take place differently from the stated order unless clearly specified in the context. That is, each process may occur in the same order as described, may be performed substantially concurrently, or may be performed in the opposite order.
The present invention facilitates the development of a gesture interface for a variety of devices where interaction designers use gestures as interactions.
The various devices include mobile terminals such as smartphones that use the motion of the device itself as an interface, devices that use touch input as an interface (e.g., computer devices with a touch screen), devices that use the user's motion captured by a camera as an interface (e.g., console game machines), and devices that measure motion with a three-axis acceleration sensor (e.g., smartphones, robots). In short, the various devices are those that can use the device's own movement, the user's motion, or touch movement on a touch screen as an interface. For convenience of explanation, a mobile terminal such as a smartphone is assumed below.
Mobile terminals now on the market use various gestures as interfaces. For example, turning the terminal face down mutes its sound output, and a call can be answered with a touch drag in a certain direction. Interaction designers who design convenient, useful interactions for mobile terminals want to test their ideas in a very short time.
Even if the interaction designer does not know how a gesture maps onto the output values of the various sensors built into a mobile terminal, the gesture interaction can be defined using the gesture markup language, an intuitive set of graphic objects, and the resulting gesture code can be used to test the gesture interaction on a computer device.
Before describing the present invention in full, some terms are defined. A gesture is a specific motion of, or operation on, the target device performed by the user; a gesture interaction is a gesture that can be used as an interface on a mobile terminal or the like. A gesture code is code that can be input to particular software or a computer device to simulate a specific gesture interaction. The gesture markup language is the language used to generate gesture code; it consists not of intricate code but of intuitive graphical elements.
Hereinafter, the gesture interaction designing apparatus and the gesture interaction design system are described in detail with reference to the drawings.
FIG. 1 is an example block diagram of the configuration of the gesture interaction designing apparatus 100. The apparatus 100 comprises a display device 110 that displays the gesture markup language as graphic objects, interface devices 120 and 130 through which the user inputs or modifies the data constituting the gesture markup language, a memory device 150 that stores the conversion code for translating the gesture markup language into a specific gesture code, and a computer operation unit 140 that converts the gesture markup language input by the user into gesture code using the conversion code. Alternatively, the gesture markup language may be transmitted directly to the gesture interaction designing apparatus 100 from a separate storage medium 50 on which the user previously stored the data.
FIG. 2 shows an example of a tool for inputting the gesture markup language and the gesture markup language displayed on a display device. FIG. 2 shows an example of a graphical tool for drawing the gesture markup language. On the screen shown in FIG. 2, the upper bar displays various icons for creating (drawing) the gesture markup language and icons for file input/output.
The gesture markup language includes a start object indicating a start point of a gesture, a sequence object indicating a movement path of the gesture, and a hurdle object for determining whether the gesture passes a specific point.
Referring to the gesture markup language shown in FIG. 2, there is a start object indicating the start point of a gesture, a sequence object indicating the movement path of the gesture, and hurdle objects for determining whether the gesture passes specific points.
Each object constituting the gesture markup language is now described in detail. FIG. 3(a) shows an example of a hurdle object, FIG. 3(b) an example of a sequence object, and FIG. 3(c) an example of a start object.
Referring to FIG. 3(a), the hurdle object includes at least one of a separation object, which indicates the first and second points between which a specific point lies, or a direction object, which indicates the direction in which the gesture passes the specific point. The separation object thus defines the region (between the first point and the second point) through which the gesture must pass.
Referring to FIG. 3(b), the sequence object shown consists of an arrow and a straight line. However, since a gesture does not move only in straight lines, a sequence object may consist of an arrow, a straight line, or a curve.
FIG. 3(c) shows several examples of start objects: a circle, or a polygon such as a rectangle or a star. Since the start object merely marks the starting point of the gesture, various kinds of graphic elements may be used.
FIG. 3 illustrates representative hurdle, sequence, and start objects. However, once the function of each object is understood, it is obvious that the objects can also be defined with graphic elements other than those shown in FIG. 3.
FIG. 4 illustrates the meanings expressed by sequence objects and hurdle objects. FIG. 4(a) shows a gesture that starts from the start object and arrives at hurdle object (a). FIG. 4(b) shows a gesture that starts at the start object and arrives at hurdle object (b) by way of hurdle object (a). FIG. 4(c) shows a gesture that starts at the start object and arrives at either hurdle object (a) or hurdle object (b). FIG. 4(d) shows a gesture that either starts from the start object and arrives at hurdle object (b) via hurdle object (a), or starts from the start object and arrives at hurdle object (c). FIG. 4(e) shows a gesture that starts from the start object, passes through hurdle object (a), and then arrives at either hurdle object (b) or hurdle object (c). FIG. 4(f) shows a gesture that starts from the start object and rotates through hurdle objects (a), (b), (c), and (d).
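The pass/branch semantics of FIG. 4 reduce, in the simple sequential case, to checking whether a recorded gesture trace crosses each hurdle segment in the given order. The sketch below uses a standard segment-intersection (orientation) test; the function names and the trace representation are my own assumptions:

```python
def _ccw(a, b, c):
    """True if points a, b, c make a counter-clockwise turn."""
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])


def _segments_cross(p1, p2, q1, q2):
    """True if segment p1-p2 properly intersects segment q1-q2."""
    return (_ccw(p1, q1, q2) != _ccw(p2, q1, q2)
            and _ccw(p1, p2, q1) != _ccw(p1, p2, q2))


def passes_hurdles_in_order(trace, hurdles):
    """Return True if the polyline `trace` (a list of (x, y) points) crosses
    every hurdle segment in `hurdles`, in the listed order."""
    idx = 0  # index of the next hurdle that must be crossed
    for a, b in zip(trace, trace[1:]):
        if idx < len(hurdles) and _segments_cross(a, b, *hurdles[idx]):
            idx += 1
    return idx == len(hurdles)
```

Branching cases such as FIG. 4(c)-(e) would be handled by evaluating several such ordered lists and combining the results with OR.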
On the other hand, the coordinates represented by a start object, sequence object, or hurdle object include at least two of the x-, y-, and z-axis coordinate values collected by the three-axis acceleration sensor, coordinate values on the touch screen, coordinate values indicated by the pointing device, or coordinate values of the target object captured by the camera.
The start object corresponds to the coordinates of the point where the gesture begins, and the gesture moves through two- or three-dimensional space from that point. As described above, the technique can therefore be applied to various gesture interactions, such as a touch path on a touch screen, a path drawn with a device such as a mouse, or the path of an object captured by a camera.
FIG. 5 shows examples of mobile-terminal gestures and the gesture markup language corresponding to each. The right side of FIG. 5(a) shows a flip-over gesture that turns over a mobile terminal lying flat on a desk; the left side shows an example of the corresponding gesture markup language. As the terminal rotates about its bottom edge, its position changes by 180 degrees in the xz plane of three-dimensional space.
The right side of FIG. 5(b) shows a gesture in which the user puts the mobile terminal into a trouser pocket; the left side shows an example of the corresponding gesture markup language. FIG. 5(b) expresses the gesture as arriving at a hurdle object after starting from the start object.
The right side of FIG. 5(c) shows a tap gesture in which the user flicks the mobile terminal; the left side shows an example of the corresponding gesture markup language. FIG. 5(c) likewise expresses the gesture as arriving at a hurdle object after starting from the start object.
As shown in FIG. 5, a gesture of the mobile terminal is a movement of specific coordinate values in three-dimensional space, and it can therefore be defined using the gesture markup language.
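For the flip-over gesture of FIG. 5(a), the 180-degree rotation shows up in accelerometer data as gravity moving from the +z to the -z axis. A minimal sketch of detecting that sign change from sampled z-axis values follows; the threshold constant is an assumed tuning value, not taken from the patent:

```python
GRAVITY_THRESHOLD = 7.0  # m/s^2; assumed cutoff for "clearly face-up / face-down"


def detect_flip(z_samples):
    """Return True once the z-axis reading goes from clearly positive
    (device face up) to clearly negative (face down): a flip-over gesture."""
    was_face_up = False
    for z in z_samples:
        if z > GRAVITY_THRESHOLD:
            was_face_up = True          # gravity along +z: terminal is face up
        elif was_face_up and z < -GRAVITY_THRESHOLD:
            return True                 # gravity now along -z: terminal flipped
    return False
```

In markup terms, the positive and negative thresholds play the role of two hurdle objects that the z coordinate must pass in order.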
FIG. 6 is an example block diagram of the configuration of the gesture interaction design system 500. The gesture interaction design system 500 includes a user terminal 510, a network device 520 that provides the data path, and a gesture server 530.
The block diagram at the bottom of FIG. 6 shows the configuration of the user terminal 510, which comprises an interface device 511, a processor 512, a display device 513, a cache memory 514, and a communication module 515.
The gesture markup language and related elements are the same as those described for the gesture interaction designing apparatus 100 above.
The gesture server 530 stores the conversion code for translating the gesture markup language into a specific gesture code, and converts the gesture markup language received from the user terminal 510 into gesture code using that conversion code.
Alternatively, the gesture server 530 may convert the gesture code into a script driven by a separate commercial graphics program and transmit the script to the user terminal 510, or may input the script to the commercial program running on the server and transmit the program's output to the user terminal 510.
It is to be understood that both the foregoing general description and the following detailed description of the present invention are exemplary and explanatory and are intended to provide further explanation of the invention as claimed. It will be understood that variations and specific embodiments which may occur to those skilled in the art are included within the scope of the present invention.
50: Storage medium
100: gesture interaction designing device 110: display device
120: first interface device 130: second interface device
140: computer operation unit 150: memory device
500: gesture interaction design system 510: user terminal
511: Interface device 512: Processor
513: Display device 514: Cache memory
515: Communication module 520: Network device
530: Gesture server
Claims (13)
An interface device for allowing the user to input or modify data constituting the gesture markup language;
A memory device for storing a conversion code for converting the gesture markup language into a specific gesture code; And
And a computer arithmetic unit for converting the gesture markup language input by the user into the gesture code using the conversion code,
Wherein the gesture markup language includes a start object indicating a start point of a gesture, an order object indicating a movement path of the gesture, and a hurdle object for determining whether a gesture passes a specific point.
Wherein the interface device is at least one of an input device including a mouse, a keyboard, a keypad, a touch screen, a light pen, a graphics tablet, a joystick, a trackball, or an image scanner, or a device for transferring data previously stored by the user on a separate storage medium.
The starting object comprising a point or a specific mark,
Wherein the order object comprises at least one of an arrow, a straight line or a curve,
Wherein the hurdle object includes at least one of a separation object indicating a first point and a second point at which the specific point is located or a direction object indicating a direction in which the gesture passes the specific point.
Wherein the coordinates represented by the start object, the sequence object, or the hurdle object include at least two coordinate values among x-axis, y-axis, and z-axis coordinate values collected using a three-axis acceleration sensor, coordinate values on a touch screen, coordinate values indicated by a pointing device, or coordinate values of a target object acquired by a camera.
And the computer arithmetic unit executes the gesture code to output the gesture represented by the gesture code to the display device.
Wherein the gesture code is a script that is driven by a separate graphic commercial program.
A conversion code for converting the gesture markup language into a specific gesture code and a gesture server for converting the gesture markup language input by the user into the gesture code using the conversion code; And
And a network device that is a path for transmitting data between the user terminal and the gesture server,
Wherein the gesture markup language comprises a start object indicating a start point of a gesture, an order object indicating a movement path of the gesture, and a hurdle object for determining whether a gesture passes a specific point.
The starting object comprising a point or a specific mark,
Wherein the order object comprises at least one of an arrow, a straight line or a curve,
Wherein the hurdle object includes at least one of a separation object indicating a first point and a second point at which the specific point is located or a direction object indicating a direction in which the gesture passes the specific point.
Wherein the coordinates represented by the start object, the sequence object, or the hurdle object include at least two coordinate values among x-axis, y-axis, and z-axis coordinate values collected using a three-axis acceleration sensor, coordinate values on a touch screen, coordinate values indicated by a pointing device, or coordinate values of a target object acquired by a camera.
The gesture server
And outputs the gesture represented by the gesture code to the display device by executing the gesture code.
The gesture server
Converting the gesture code into a script driven by a separate graphic commercial program, transmitting the script to the user terminal,
A gesture interaction design system for inputting the script to the commercial program driven by the gesture server and transmitting a result output from the program to the user terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130088513A KR101482701B1 (en) | 2013-07-26 | 2013-07-26 | Designing apparatus for gesture based interaction and designing system for gesture based interaction |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130088513A KR101482701B1 (en) | 2013-07-26 | 2013-07-26 | Designing apparatus for gesture based interaction and designing system for gesture based interaction |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101482701B1 true KR101482701B1 (en) | 2015-01-15 |
Family
ID=52589027
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130088513A KR101482701B1 (en) | 2013-07-26 | 2013-07-26 | Designing apparatus for gesture based interaction and designing system for gesture based interaction |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101482701B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160100527A (en) | 2015-02-16 | 2016-08-24 | 한국과학기술원 | Designing method for gesture interface and designing apparatus for gesture interface |
- 2013-07-26: KR application KR1020130088513A filed; granted as patent KR101482701B1 (status: not active — IP right cessation)
Non-Patent Citations (4)
Title |
---|
Design of a gesture interface model for GUI application control (January 2013) *
Paper (February 2010) *
Paper (January 2013) *
Development of a design programming toolkit for interactive product prototyping (January 2010) *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160100527A (en) | 2015-02-16 | 2016-08-24 | 한국과학기술원 | Designing method for gesture interface and designing apparatus for gesture interface |
KR101680084B1 (en) * | 2015-02-16 | 2016-11-28 | 한국과학기술원 | Designing method for gesture interface and designing apparatus for gesture interface |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11369879B2 (en) | Method and system for interactive imitation learning in video games | |
Beattie et al. | Taking the LEAP with the Oculus HMD and CAD-Plucking at thin Air? | |
CN102955568A (en) | Input unit recognizing user's motion | |
CN107273037A (en) | Virtual object control method and device, storage medium, electronic equipment | |
CN108431734A (en) | Touch feedback for non-touch surface interaction | |
KR102021851B1 (en) | Method for processing interaction between object and user of virtual reality environment | |
CN110389659A (en) | The system and method for dynamic haptic playback are provided for enhancing or reality environment | |
CN105302407A (en) | Application icon display method and apparatus | |
EP3007030A1 (en) | Portable device and control method via gestures | |
Alshaal et al. | Enhancing virtual reality systems with smart wearable devices | |
US20220365660A1 (en) | Automatic translation of user interface elements from wireframe tools to production augmented reality framework | |
CN110114194A (en) | System and method for determining the grip locations of double-grip industrial object | |
CN104516649A (en) | Intelligent cell phone operating technology based on motion-sensing technology | |
US10474763B2 (en) | Computer-implemented method for defining initial conditions for dynamic simulation of an assembly of objects in a three-dimensional scene of a system of computer-aided design | |
US9665232B2 (en) | Information-processing device, storage medium, information-processing method, and information-processing system for enlarging or reducing an image displayed on a display device | |
KR101482701B1 (en) | Designing apparatus for gesture based interaction and designing system for gesture based interaction | |
Tran et al. | Easy-to-use virtual brick manipulation techniques using hand gestures | |
JP2014085816A (en) | Program, information processing device, information processing method, and information processing system | |
CN117130518A (en) | Control display method, head display device, electronic device and readable storage medium | |
CN104834410B (en) | Input unit and input method | |
CN113593314B (en) | Equipment virtual disassembly and assembly training system and training method thereof | |
KR101680084B1 (en) | Designing method for gesture interface and designing apparatus for gesture interface | |
CN112287708A (en) | Near Field Communication (NFC) analog card switching method, device and equipment | |
CN104750905B (en) | Computer-implemented method for designing a three-dimensional modeled object | |
US10553249B2 (en) | Storage medium, information processing apparatus, information processing system and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant | ||
FPAY | Annual fee payment |
Payment date: 20180102 Year of fee payment: 4 |
LAPS | Lapse due to unpaid annual fee |