KR101635628B1 - System and methods for applying haptic interaction of generic haptic devices to existing 2d gui environments and collecting data reflecting the user's movement behavior through the haptic interaction - Google Patents

System and methods for applying haptic interaction of generic haptic devices to existing 2d gui environments and collecting data reflecting the user's movement behavior through the haptic interaction Download PDF

Info

Publication number
KR101635628B1
KR101635628B1 (Application KR1020150059664A)
Authority
KR
South Korea
Prior art keywords
haptic
data
haptic device
user
interface
Prior art date
Application number
KR1020150059664A
Other languages
Korean (ko)
Inventor
박진아
송덕재
Original Assignee
한국과학기술원
Application filed by 한국과학기술원 filed Critical 한국과학기술원
Priority to KR1020150059664A priority Critical patent/KR101635628B1/en
Application granted granted Critical
Publication of KR101635628B1 publication Critical patent/KR101635628B1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/22Social work or social welfare, e.g. community support activities or counselling services


Abstract

The present invention relates to a framework for applying the haptic interaction of a general-purpose haptic device to an existing computing environment based on a 2D graphical user interface (GUI) created without haptic interaction in mind (in particular, interactive content such as games, in which active interaction with the user occurs), and for collecting and analyzing data on the operating characteristics of the haptic device's user, and to a method of constructing such a framework.
The present invention includes a technique for applying haptic rendering and input, with 3D depth added through a general-purpose haptic device, to a conventional 2D GUI environment created without haptic functionality in mind. In this way, additional cognitive and motor stimuli can be introduced into existing content and interface environments, and 3D operation data reflecting the user's behavior characteristics can be generated from them. The present invention further includes a technique for collecting data representing the user's operating characteristics through this interaction, which allows user manipulation data to be collected for general haptic interactions that do not depend on specific content. The present invention also includes a framework construction technique that helps collect and interpret such data, so that various methods of interpreting the data in different environments can be tried on a system implementing the method according to the present invention.
The present invention makes it possible to realize user condition monitoring with a haptic device in a wide variety of rehabilitation environments.

Description

TECHNICAL FIELD The present invention relates to a system and methods for collecting user behavior data by applying the interaction of a general-purpose haptic device to an existing 2D graphical user interface environment {System and Methods for Applying Haptic Interaction of Generic Haptic Devices to Existing 2D GUI Environments and Collecting Data Reflecting the User's Movement Behavior through the Haptic Interaction}

The present invention relates to an image-based haptic rendering method and to a framework for monitoring cognitive and motor health status through haptic interaction, and more particularly, to a system and method for collecting and analyzing data based on 3D haptic interaction and for understanding the behavior characteristics of the user operating the haptic device.

The following prior art documents form the background art of the present invention.

[Document 1] Omar A. Daud, Francesco Biral, Roberto Oboe and Lamberto Piron, "A general framework for a rehabilitative oriented haptic interface," IEEE International Workshop on Advanced Motion Control, pp. 685-690, March 2010.

[Document 2] Deok-Jae Song and Jinah Park, "AnyHaptics: A haptic plug-in for existing interactive 3D graphics applications," Proceedings of the 20th ACM Symposium on Virtual Reality Software and Technology, pp. 27-30, November 2014.

[Document 3] Shahzad Rasool and Alexei Sourin, "Image-Driven Haptic Rendering," Transactions on Computational Science XXIII, Springer, LNCS 8490, pp. 58-77, 2014.

[Document 4] Jialu Li, Aiguo Song and Xiaorui Zhang, "Image-based haptic texture rendering," Proceedings of the 9th ACM SIGGRAPH Conference on Virtual-Reality Continuum and its Applications in Industry, pp. 237-242, 2010.

[Document 5] Broeren, J., Bjorkdahl, A., Claesson, L., et al., "Virtual rehabilitation after stroke," Studies in Health Technology and Informatics, 136, pp. 77-82, 2008.

Previous studies on rehabilitation systems using haptics and other virtual reality techniques have proposed new devices or content developed for specific purposes. These are generally assumed to be used in special facilities such as hospitals and have been developed to depend on specific hardware or middleware platforms, making them ill-suited to a smart healthcare environment used at remote sites such as the home.

Research on haptic rendering methods based on existing 2D images, surveyed while designing a haptic interface as a solution to the problems described above, has restored 3D information for realistic haptic rendering even from limited information, for example by utilizing color characteristics.

These studies have various aims, but they all assume ordinary images of real surroundings, such as photographs or pictures, as the images to which rendering is applied. However, when haptic rendering is applied to a 2D image composed of multi-layered objects and GUI elements implemented on a computer, depth restoration based on 3D stereo imagery or the illumination environment is not suitable.

To solve the problems described above, the haptic rendering method and system for 2D GUI-based images according to the present invention aims to derive haptic rendering suitable for existing computer application screens, which differ from real images or photographs. In addition, the haptic input method for 2D GUI-based application programs applies haptic interaction using a general-purpose haptic device to the many computer applications designed for mouse use, and generates 3D motion data that reflects the user's behavior. Finally, the data collection method and the analysis system built on it are intended to enable wider use of cognitive and motor health state monitoring based on the user's behavior patterns.

According to an aspect of the present invention, there is provided a method of collecting user behavior data using the interaction of a general-purpose haptic device with an existing 2D graphical user interface environment, the method comprising: acquiring an output image of a 2D GUI (Graphic User Interface) based computing environment and an interactive application program; calculating a feedback force for haptic rendering by interpreting the acquired image based on GUI elements; simulating a manipulation of the handle of the haptic device as a mouse button click input; collecting the haptic device operation pattern as data in a formatted form; and providing an interface configured to interpret the collected data in an arbitrary manner.

The method of collecting user behavior data according to an embodiment of the present invention may further include providing a haptic device driver as a device-independent interface for supporting a plurality of kinds of general-purpose haptic devices.

The step of calculating the feedback force for haptic rendering by interpreting the acquired image based on GUI elements may include calculating the height level of an arbitrary pixel in real time for each haptic frame, by determining the possibility of movement from that pixel in the four directions toward each side of the image and measuring the number of times the movement is blocked in each of the four directions.

In the step of simulating the manipulation of the handle of the haptic device as a mouse button click input, a mouse button click signal may be generated when the height level determined for the image pixel area in which the haptic cursor is currently positioned is equal to or greater than a predetermined value and the handle is operated beyond a predetermined depth in the depth direction or at a predetermined speed or more.

When a manipulation of the handle of the haptic device is attempted, a force for a dynamic reaction, which reflects the change in height level felt on the surface of the application program's output image, and a guiding force, which restricts unintended movement in the horizontal direction, may be generated.

The step of collecting the haptic device operation pattern as formatted data may set the period in which a mouse button click input is attempted by the handle operation of the haptic device as the recording period of one data set, and may include data reflecting user characteristics including at least one of a movement distance, a speed, and a feedback force of the handle.

The step of providing an interface configured to interpret the collected data in an arbitrary manner may include implementing a code module that defines the format of the collected data and a script interface, and defining the interpretation method of the collected data in a dynamic script interlocked with that code module; the code module defining the data format and the script interface may be implemented independently of the other modules of the overall system.

Meanwhile, a user behavior data collection system according to an embodiment of the present invention is a system for collecting user behavior data using the interaction of a general-purpose haptic device with an existing 2D graphical user interface environment, the system comprising: a target environment communication unit that communicates between processes with a 2D GUI-based computing environment and an interactive application program to acquire an output image and generate a simulated mouse button click input; a haptic rendering processing unit that analyzes the acquired image based on GUI elements to calculate a feedback force for haptic rendering; an input processing unit that simulates a manipulation of the handle of the haptic device as a mouse button click input and collects the manipulation pattern of the haptic device as formatted data; and an interaction data management unit that loads reference data and manages the collected data to provide an interface configured to interpret the collected data in an arbitrary manner.

In addition, the user behavior data collection system according to an embodiment of the present invention may further include a haptic device driver as a device-independent interface for supporting a plurality of kinds of general-purpose haptic devices.

The haptic rendering processing unit may calculate the height level of an arbitrary pixel in real time for each haptic frame, determining the possibility of movement from that pixel in the four directions toward each side of the image and measuring the number of times the movement is blocked in each of the four directions to determine the height level.

A mouse button click signal may be generated when the height level determined for the image pixel area in which the haptic cursor is currently located is equal to or greater than a predetermined value and the handle of the haptic device is manipulated beyond a predetermined depth in the depth direction or at a predetermined speed or more.

When a manipulation of the handle of the haptic device is attempted, a force for a dynamic reaction, which reflects the change in height level felt on the surface of the application program's output image, and a guiding force, which restricts unintended movement in the horizontal direction, may be generated.

The input processing unit may set the period in which a mouse button click input is attempted by the handle operation of the haptic device as the recording period of one data set, and may collect data reflecting user characteristics including at least one of a movement distance, a speed, and a feedback force of the handle.

The interaction data management unit may implement a code module that defines the format of the collected data and a script interface, and may define the interpretation method of the collected data in a dynamic script interlocked with that code module; the code module defining the data format and the script interface may be implemented independently of the other modules of the overall system.

The technique disclosed in the present invention can have the following effects. However, this does not mean that a particular embodiment must include all of the following effects, or only the following effects, and the scope of the disclosed technique should not be understood as limited thereby.

The method and system for collecting user behavior data according to the present invention can be applied to real-time interactive programs based on a 2D GUI to produce haptic interaction in which 3D depth is felt while maintaining a 1000 Hz update rate, and the user's behavior pattern data collected on this basis can be interpreted in an arbitrary manner.

FIG. 1 is a schematic diagram of an overall system including a user behavior data collection system according to an embodiment of the present invention.
FIG. 2 shows an example of a system configuration method targeting a specific application program, with data acquisition and input simulation through DLL injection.
FIG. 3 shows a method for calculating a real-time height level through instantaneous multi-layer analysis of an image based on GUI elements.
FIG. 4 shows an example of the height level result extracted from a specific image and an overall haptic rendering configuration including that result.
FIG. 5 is a conceptual diagram of two input methods using a haptic device.
FIG. 6 shows an example of a common code module in which the format of the interaction result data and a script interworking interface are defined, and a data interpretation script module linked to the common code module.
FIG. 7 shows a case where the present invention is applied to the 2D GUI of existing cognitive rehabilitation content.
FIG. 8 is a flowchart illustrating a method for collecting user behavior data according to an embodiment of the present invention.

The specific structural and functional descriptions set forth herein for the embodiments of the invention are provided only for the purpose of describing those embodiments; embodiments of the invention may be practiced in various forms, and the present invention should not be construed as limited to the embodiments described herein.

The present invention is capable of various modifications and various forms, and specific embodiments are illustrated in the drawings and described in detail in the text. It is to be understood, however, that the invention is not intended to be limited to the particular forms disclosed, but on the contrary, is intended to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms may be used for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements. Other expressions that describe the relationship between components, such as "between" and "directly between" or "adjacent to" and "directly adjacent to," should be interpreted in the same manner.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, terms such as "comprise" and "have" are intended to specify the presence of stated features, integers, steps, operations, elements, components, or combinations thereof, and do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.

Unless otherwise defined, all terms used herein, including technical or scientific terms, have the same meanings as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be construed as having meanings consistent with their meanings in the context of the relevant art, and are not to be interpreted in an ideal or overly formal sense unless expressly so defined in the present application.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The same reference numerals are used for the same constituent elements in the drawings and redundant explanations for the same constituent elements are omitted.

FIG. 1 is a schematic diagram of an overall system including a user behavior data collection system according to an embodiment of the present invention, and FIG. 8 is a flowchart of a method of collecting user behavior data according to an embodiment of the present invention.

Referring to FIG. 1, the overall system is configured around an independent build unit denoted "mainframe." Key functions such as haptic rendering, user input processing, and data collection are implemented in the mainframe. Among the sub-modules constituting the mainframe, the "Interaction Data Common Code" module interoperates with the script containing the core formulas of the data interpretation, and is designed to be independent of the other sub-modules constituting the mainframe.

Referring to FIG. 8, a user behavior data collection method according to an embodiment of the present invention is a method of collecting user behavior data using the interaction of a general-purpose haptic device with an existing 2D graphical user interface environment, comprising: a step (S110) of obtaining an output image of a 2D GUI (Graphic User Interface) based computing environment and an interactive application program; a step (S120) of calculating a feedback force for haptic rendering by interpreting the acquired image based on GUI elements; a step (S130) of providing a haptic device driver as a device-independent interface for supporting a plurality of kinds of general-purpose haptic devices; a step (S140) of simulating a handle operation of the haptic device as a mouse button click input; a step (S150) of collecting the operation pattern as data in a formatted form; and a step (S160) of providing an interface configured to interpret the collected data in an arbitrary manner.

The present invention presupposes a computer application using a 2D GUI, or a system operating in a computer user interface environment. The target environment's image is captured and transmitted to the mainframe (a unit of independent build in which the functions including the haptic interface and data collection are implemented), and the haptic input signal processed through the mainframe is converted into a mouse input signal for the target environment. There are many ways to implement this; in particular, when targeting a computer application, a DLL module may be injected into the target application. The module that directly handles input/output with the haptic device can be implemented in a device-independent manner through an abstract interface that wraps the proprietary API provided for each device. FIG. 2 illustrates an example of a system configuration targeting a specific application program, showing data acquisition and input simulation through DLL injection as described above.
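
As a rough illustration of the device-abstraction layer described above, the wrapper can be sketched as an abstract interface that each vendor-specific driver implements. The class and method names here are hypothetical, not the patent's actual API:

```python
from abc import ABC, abstractmethod


class HapticDevice(ABC):
    """Device-independent driver interface wrapping each vendor's
    proprietary haptic API. Names are illustrative assumptions."""

    @abstractmethod
    def read_pose(self):
        """Return the current handle position (x, y, z) in workspace units."""

    @abstractmethod
    def set_force(self, fx, fy, fz):
        """Command the feedback force for the current haptic frame."""


class DummyDevice(HapticDevice):
    """Stand-in implementation, useful for testing without hardware."""

    def __init__(self):
        self.pose = (0.0, 0.0, 0.0)
        self.force = (0.0, 0.0, 0.0)

    def read_pose(self):
        return self.pose

    def set_force(self, fx, fy, fz):
        self.force = (fx, fy, fz)
```

The mainframe would talk only to the `HapticDevice` interface, so adding support for a new device means adding one wrapper class rather than touching the rendering or data-collection modules.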

FIG. 3 shows a method for calculating a real-time height level through instantaneous multi-layer analysis of an image based on GUI elements. Referring to FIG. 3, the separable graphic elements (buttons and other graphic elements) forming an image are assigned a height level according to how they occlude one another, for haptic rendering of the 2D GUI-based image. To determine the height level, a movability check is made from the specific pixel indicated by the current haptic cursor toward each of the four sides of the image. If the difference between the RGB value of the current pixel and that of the pixel to be moved to is greater than a certain limit, the movement is considered blocked. When movement is blocked, it is checked whether the march can detour in another direction; after incrementing the blocked count, the movement continues from the blocked point. After each march reaches the edge of the image, the minimum of the blocked counts in the four directions is selected, and this count determines the height level: the level value increases from level 1 by the number of times movement was blocked. This calculation is not intended as preprocessing of the entire image before use, but is performed for the particular pixel at every haptic frame. The height level thus obtained is used to give a sense of 3D depth in the haptic rendering, and it can be used alone or combined with other methods in the overall haptic rendering implementation. FIG. 4 shows the height level result extracted from a specific image and an example of an overall haptic rendering configuration including that result.
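
The per-pixel height-level computation can be sketched as follows, under simplifying assumptions: the march proceeds in straight lines only (the detour step described above is omitted), and a fixed RGB-difference threshold marks element boundaries. Names and the threshold value are illustrative:

```python
def height_level(img, x, y, threshold=30):
    """Estimate the height level of pixel (x, y) by marching toward each
    of the four image edges and counting colour-boundary crossings.
    Simplified sketch of the FIG. 3 method: the level is the minimum
    crossing count over the four directions.
    `img` is a row-major grid of (R, G, B) tuples."""
    h, w = len(img), len(img[0])

    def diff(a, b):
        # Maximum per-channel RGB difference between two pixels.
        return max(abs(a[i] - b[i]) for i in range(3))

    counts = []
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        cx, cy, blocked = x, y, 0
        while 0 <= cx + dx < w and 0 <= cy + dy < h:
            # A large RGB difference marks the edge of a GUI element.
            if diff(img[cy][cx], img[cy + dy][cx + dx]) > threshold:
                blocked += 1
            cx, cy = cx + dx, cy + dy
        counts.append(blocked)
    return min(counts)
```

For a single raised button on a plain background, a pixel inside the button crosses one boundary in every direction (level 1), while a background pixel finds at least one crossing-free path to an edge (level 0). Because only one pixel is examined per call, this fits the per-haptic-frame usage described above rather than whole-image preprocessing.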

FIG. 5 is a conceptual diagram of two input methods using a haptic device. Referring to FIG. 5, two modes are provided for processing mouse click input with the haptic device handle. One is the depth-based (DBBP) input mode, in which an input signal is generated when the haptic device handle is pushed beyond a predetermined depth in the image depth (Z) direction. The other is the velocity-based (VBBP) input mode, in which the input signal is generated when the handle is pushed faster than a predetermined speed in the image depth direction. Both modes operate while a feedback force conveying the sensation of touching the image is being generated, that is, while the haptic cursor is positioned inside the image height level.
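
A minimal sketch of the two click-detection predicates. The sign convention (depth and velocity measured positively into the image), the threshold values, and the minimum-level check (see the level-2 assumption below) are all placeholders, not values from the patent:

```python
def dbbp_click(z, level, ref_depth=0.5, min_level=2):
    # Depth-based press: fires when the handle has been pushed past a
    # reference depth below the image surface, and only on a GUI element
    # whose height level is high enough to be a button.
    return level >= min_level and z >= ref_depth


def vbbp_click(vz, level, ref_speed=1.0, min_level=2):
    # Velocity-based press: fires when the handle moves into the image
    # faster than a reference speed, with the same level restriction.
    return level >= min_level and vz >= ref_speed
```

In a real haptic loop these predicates would be evaluated once per haptic frame, and a positive result would be translated into a simulated mouse button click in the target environment.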

To improve realism and usability, the following additional feedback forces are applied in the two input modes. First, to simulate the feel of pressing a physical button, the height level of the image surface moves dynamically while the pressing operation is performed, and a reaction force based on a spring-damper model acts in the direction opposite to the press. Second, to minimize unintended X- and Y-direction movement while operating in the Z direction, when the haptic cursor's speed in the Z direction exceeds a specific value and its Z position (depth) lies below the image height level beyond a certain depth, a spring-damper force limiting movement in the X and Y directions is activated.
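
The two auxiliary forces can be sketched as spring-damper terms. The constants and the sign convention (penetration depth and penetration velocity measured positively into the image) are assumptions for illustration, not values from the patent:

```python
def reaction_force(depth, d_vel, k=200.0, b=5.0):
    """Spring-damper reaction opposing penetration into the image
    surface (k, b are assumed constants); zero when not touching."""
    if depth <= 0.0:
        return 0.0
    # Damped push-back along +Z (out of the image).
    return k * depth + b * d_vel


def guiding_force(x_off, y_off, vx, vy, depth, d_vel,
                  k=200.0, b=5.0, min_speed=0.8, min_depth=0.3):
    """Horizontal spring-damper suppressing unintended X/Y drift,
    active only during a fast, deep Z press."""
    if d_vel > min_speed and depth > min_depth:
        # Pull the cursor back toward the press origin (x_off, y_off
        # are displacements from where the press began).
        return (-k * x_off - b * vx, -k * y_off - b * vy)
    return (0.0, 0.0)
```

The reaction force simulates the button's resistance; the guiding force only engages once the press is clearly deliberate (fast and deep), so ordinary lateral exploration of the image surface is unaffected.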

Both input modes can be made to operate only when the height level of the image pixel region is above a certain value, by referring to the height level calculated for haptic rendering. This is so that an interface environment applying this input method appears to have been implemented with haptic functionality from the beginning; a GUI button intended to receive such input is assumed to be at least level 2, higher than the image background.

In the method of collecting user behavior data according to an embodiment of the present invention, the following data are collected for each input mode in order to gather data that exploits the characteristics of the interface without depending on specific content.

(a) depth-based (DBBP) input

A data set is generated for each input attempt beyond a certain depth. Here, this recording reference depth is smaller than the reference depth that generates the input signal. A data set consists of the following data items:

- Parameters defining the interaction: input generation reference depth, input response spring constant, feedback force scale, haptic device workspace scale

- Result values reflecting the user's operating characteristics: success of the input, maximum penetration depth from the image surface, time to maximum penetration, time spent beyond the reference depth, maximum feedback force, average feedback force, sum of movement distances in each direction, and total time taken in this data-set recording interval

(b) Velocity-based (VBBP) input

A data set is generated for each input attempt beyond a certain speed. Here, this recording reference speed is smaller than the reference speed that generates the input signal. A data set consists of the following data items:

- Parameters defining the interaction: input generation reference velocity, input response spring constant, feedback force scale, haptic device workspace scale

- Result values reflecting the user's operating characteristics: success of the input, maximum operation speed, maximum penetration depth from the image surface, maximum feedback force, average feedback force, sum of movement distances in each direction, and total time taken in this data-set recording interval
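
The DBBP data-set items listed above can be captured in a record structure such as the following. The field names are illustrative, since the patent lists the items but not their identifiers; the VBBP record would be analogous, with a reference speed and maximum operation speed in place of the depth-specific fields:

```python
from dataclasses import dataclass, asdict


@dataclass
class DBBPRecord:
    """One depth-based input attempt (field names are assumptions)."""
    # Parameters defining the interaction
    reference_depth: float
    spring_constant: float
    force_scale: float
    workspace_scale: float
    # Result values reflecting the user's operating characteristics
    succeeded: bool
    max_penetration: float
    time_to_max_penetration: float
    time_beyond_reference: float
    max_force: float
    mean_force: float
    travel_xyz: tuple        # summed movement distance per axis (x, y, z)
    total_time: float
```

Serializing each record (for example with `asdict`) gives the "formatted form" that downstream interpretation scripts can consume without knowing how the data were produced.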

The data sets described above can be interpreted and utilized in various ways. To this end, the code that combines the data items into meaningful figures is placed in a dynamically loaded and executed script separate from the mainframe implementation.

FIG. 6 shows an example of a common code module in which the format of the interaction result data and a script interworking interface are defined, and a data interpretation script module linked to it. Referring to FIG. 6, a code module that primarily defines the data format and the script interworking interface is prepared in the mainframe; this is referred to as the interaction data common code (IDC) module. As shown in FIG. 1, this module does not depend on the other subsystems constituting the mainframe, such as the haptic interface, and supports independent module implementations for various purposes. Although a Lua script is shown as an example in FIG. 6, other scripting languages may be interlocked depending on the detailed usage and conditions.
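
The IDC-module idea, a host that fixes the record format while a dynamically loaded script decides how records are combined, can be sketched as follows. The patent's example uses Lua; plain Python source is substituted here for illustration, and the entry-point name `interpret` and the record keys are assumptions:

```python
def run_interpretation_script(script_source, records):
    """Host-side sketch: the mainframe fixes the record format and
    hands the records to a dynamically loaded analyst script, which
    alone defines how they are turned into a score."""
    env = {}
    exec(script_source, env)          # load the analyst's script
    return env["interpret"](records)  # call its agreed entry point


# Example analyst script: reward presses whose Z travel dominates the
# X/Y travel, following the scoring idea described in FIG. 7's example.
SCRIPT = """
def interpret(records):
    score = 0
    for r in records:
        x, y, z = r["travel_xyz"]
        if r["succeeded"] and z > x + y:
            score += 1
    return score
"""
```

Swapping `SCRIPT` for a different source string changes the interpretation without rebuilding the mainframe, which is the point of keeping the IDC module independent of the other subsystems.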

FIG. 7 shows a case where the present invention is applied to the 2D GUI of existing cognitive rehabilitation content. Referring to FIG. 7, in this usage example it is assumed that a plurality of options are placed on the screen and the content requires a selection according to a specific rule. The rule for choosing an option may be designed specifically to exercise the user's memory, concentration, or problem-solving ability, but a more general environment without a deliberate rehabilitation function may equally be assumed. In any case, whether the content was designed with a haptic device in mind, and whether the manipulation followed the rules provided by the content, is irrelevant to the present invention. Here, the present invention can provide new cognitive stimulation by inducing more active arm motion and applying force feedback to the user. Among the data items generated in this process, the input success rate, the speed of the input operation, and the distance traveled in each direction between input operations can be used to measure how effectively the user handled the content, making it possible to monitor the state of the user's operating characteristics and cognitive ability in terms of the haptic interaction according to the present invention. As a more specific example, if at a given feedback force scale value there is more movement in the Z direction than in the X and Y directions, the selection input can be regarded as efficient and a high score can be given. [Document 5] in the list of prior art above presents a case of measuring rehabilitation effects that serves as a reference for utilizing the present invention.
For content that induces a specific pattern of input, a meaningful value can be derived by implementing, in the script framework of the present invention, a formula that selects and combines the appropriate data items for that content.

Meanwhile, a user behavior data collection system according to an embodiment of the present invention is a system for collecting user behavior data using the interaction of a general-purpose haptic device with an existing 2D graphical user interface environment, the system comprising: a target environment communication unit that communicates between processes with a 2D GUI-based computing environment and an interactive application program to acquire an output image and generate a simulated mouse button click input; a haptic rendering processing unit that analyzes the acquired image based on GUI elements to calculate a feedback force for haptic rendering; an input processing unit that simulates a manipulation of the handle of the haptic device as a mouse button click input and collects the manipulation pattern of the haptic device as formatted data; and an interaction data management unit that loads reference data and manages the collected data to provide an interface configured to interpret the collected data in an arbitrary manner.

As described above, the method and system for collecting user behavior data according to the present invention can be applied to real-time interactive programs based on a 2D GUI, can provide haptic interaction in which 3D depth is felt while maintaining a 1000 Hz update rate, and can collect the user's behavior pattern data on this basis and interpret that data in an arbitrary manner.

The foregoing description is merely illustrative of the technical idea of the present invention, and various changes and modifications may be made by those skilled in the art without departing from the essential characteristics of the present invention. Therefore, the embodiments disclosed herein are intended to illustrate rather than limit the technical idea of the present invention, and the scope of that idea is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of their equivalents should be construed as falling within the scope of the present invention.

Special devices and contents have been developed to construct rehabilitation systems using virtual reality, but despite the benefits of virtual reality and computerization, such systems have confined virtual reality rehabilitation and user monitoring to limited spaces. The haptic interaction method for existing 2D GUI environments proposed by the present invention, and the system implemented through data collection and a configurable script-based analysis method built on it, can be used in a wider healthcare environment.

Considering the haptic interaction part alone, this technique presents a new way of using interactive computer applications. In particular, when it is used in games or educational contents based on a 2D GUI, more realistic interaction becomes possible and immersion can be improved. Another use of the haptic interaction method is pre-prototyping of haptic functions targeting 2D GUI based interface environments. The haptic rendering and interaction method of the present invention may not satisfy every actual industrial requirement; even so, a preliminary evaluation of the value or feasibility of implementing such a function is an important step, and the present invention can support quick decisions in that situation.

Claims (19)

A method of collecting user behavior data using interaction of a general purpose haptic device with an existing 2D graphical user interface environment,
Acquiring an output image for a previously developed 2D GUI (Graphic User Interface) based computing environment and an interactive application program;
Calculating a feedback force for haptic rendering by interpreting the acquired image based on a GUI element and calculating a height level of an arbitrary pixel of the acquired image in real time for each haptic frame;
Simulating a manipulation of a handle of a haptic device with a mouse button click input;
Collecting the haptic device operation pattern as data in a formatted format; And
And providing an interface configured to interpret the collected data in an arbitrary manner.
The method according to claim 1,
Further comprising providing a haptic device driver as a device independent interface for supporting multiple types of general purpose haptic devices.
delete
The method according to claim 1,
Wherein the step of calculating the height level of the arbitrary pixel in real time includes determining the possibility of movement in four directions from the pixel toward the respective sides of the image and measuring the number of times the movement is blocked in each of the four directions to determine the height level.
The method according to claim 1,
Wherein the step of simulating the manipulation of the handle of the haptic device by a mouse button click input comprises:
When the height level determined for the image pixel area in which the haptic cursor is currently located is equal to or greater than a predetermined value and the handle of the haptic device is manipulated beyond a predetermined depth in the depth direction or at a predetermined speed or more, a mouse button click signal is generated.
The method according to claim 1,
Wherein, when a manipulation of the handle of the haptic device is attempted, the feedback force is calculated as a force for a dynamic reaction, which reflects a change in the height level felt on the surface of the output image of the application program, together with a guiding force for restricting unintentional movement in the horizontal direction.
The method according to claim 1,
The step of collecting the haptic device operation pattern as formatted data includes setting the interval at which the mouse button click input is attempted by the handle operation of the haptic device as the recording period of one data set, and collecting data that reflects a user characteristic including at least one of a movement distance, a speed, and a feedback force of the handle.
The method according to claim 1,
The step of providing an interface configured to interpret the collected data in an arbitrary manner includes implementing a code module that defines a format of the collected data and a script interface, and defining a method of interpreting the collected data in a dynamic script linked with the code module.
9. The method of claim 8,
Wherein the code module defining the format and the script interface of the collected data is implemented independently of other modules of the overall system.
A computer-readable recording medium on which program instructions for implementing the method for collecting user behavior characteristic data according to any one of claims 1 to 8 and 9 are recorded.

A system for collecting user behavior data using the interaction of a general purpose haptic device with an existing 2D graphical user interface environment, the system comprising:
A target environment communication unit for acquiring an output image and generating a simulated mouse button click input by communicating between processes to a 2D GUI (Graphic User Interface) based computing environment and an interactive application program that have been developed;
A haptic rendering processor for analyzing the acquired image based on a GUI element and calculating a feedback force for haptic rendering by calculating a height level of an arbitrary pixel of the acquired image in real time for each haptic frame;
An input processing unit for simulating a manipulation of a handle of the haptic device by a mouse button click input and collecting the manipulation pattern of the haptic device as data of a formatted format; And
And an interaction data manager for loading reference data and managing collected data to provide an interface configured to interpret the collected data in an arbitrary manner.
12. The system of claim 11,
Further comprising a haptic device driver as a device independent interface for supporting multiple types of general purpose haptic devices.
delete
The system of claim 11,
Wherein the haptic rendering processing unit determines the height level by determining the possibility of movement in four directions from the pixel toward the respective sides of the image and measuring the number of times the movement is blocked in each of the four directions.
12. The system of claim 11,
Wherein the input processing unit generates a mouse button click signal when the height level determined for the image pixel area in which the haptic cursor is currently located is equal to or greater than a predetermined value and the handle of the haptic device is manipulated beyond a predetermined depth in the depth direction or at a predetermined speed or more.
12. The system of claim 11,
Wherein, when a manipulation of the handle of the haptic device is attempted, the feedback force is calculated as a force for a dynamic reaction, which reflects a change in the height level felt on the surface of the output image of the application program, together with a guiding force for restricting unintentional movement in the horizontal direction.
12. The system of claim 11,
Wherein the input processing unit sets the interval at which the mouse button click input is attempted by the handle operation of the haptic device as the recording period of one data set, and collects data that reflects a user characteristic including at least one of a movement distance, a speed, and a feedback force of the handle.
12. The system of claim 11,
Wherein the interaction data management unit implements a code module that defines the format of the collected data and the script interface, and defines an interpretation method for the collected data in a dynamic script linked with the code module.
19. The system of claim 18,
Wherein the code module defining the format of the collected data and the script interface is implemented independently of other modules of the overall system.
KR1020150059664A 2015-04-28 2015-04-28 System and methods for applying haptic interaction of generic haptic devices to existing 2d gui environments and collecting data reflecting the user's movement behavior through the haptic interaction KR101635628B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150059664A KR101635628B1 (en) 2015-04-28 2015-04-28 System and methods for applying haptic interaction of generic haptic devices to existing 2d gui environments and collecting data reflecting the user's movement behavior through the haptic interaction


Publications (1)

Publication Number Publication Date
KR101635628B1 (en) 2016-07-20

Family

ID=56680153

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150059664A KR101635628B1 (en) 2015-04-28 2015-04-28 System and methods for applying haptic interaction of generic haptic devices to existing 2d gui environments and collecting data reflecting the user's movement behavior through the haptic interaction

Country Status (1)

Country Link
KR (1) KR101635628B1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050197800A1 (en) * 2002-04-26 2005-09-08 Sensable Technologies, Inc. 3-D selection and manipulation with a multiple dimension haptic interface
KR20100030737A (en) * 2008-09-11 2010-03-19 이필규 Implementation method and device of image information based mouse for 3d interaction


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108665513A (en) * 2017-03-27 2018-10-16 腾讯科技(深圳)有限公司 Drawing practice based on user behavior data and device
CN108665513B (en) * 2017-03-27 2021-04-06 腾讯科技(深圳)有限公司 Drawing method and device based on user behavior data
CN109829458A (en) * 2019-01-14 2019-05-31 上海交通大学 The method of the journal file of record system operatio behavior is automatically generated in real time
CN109829458B (en) * 2019-01-14 2023-04-04 上海交通大学 Method for automatically generating log file for recording system operation behavior in real time


Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190603

Year of fee payment: 4