KR101838939B1 - System for measuring the ability of ordinary performance

System for measuring the ability of ordinary performance

Info

Publication number
KR101838939B1
Authority
KR
South Korea
Prior art keywords
screen output
unit
sensing
task
output unit
Prior art date
Application number
KR1020150051607A
Other languages
Korean (ko)
Other versions
KR20160121858A (en)
Inventor
반호영
최용근
이수빈
유경환
노성준
박대관
Original Assignee
주식회사 네오펙트 (Neofect Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 네오펙트 (Neofect Co., Ltd.)
Priority to KR1020150051607A
Publication of KR20160121858A
Application granted
Publication of KR101838939B1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • G06Q50/22 - Social work

Abstract

The present invention relates to a task performance measurement system.
According to one aspect of the present invention, there is provided a system for measuring task performance, comprising: a sensing unit for sensing motion data of a specific task performed by a user; a calculation unit for calculating a real-time performance result of the specific task based on the sensed motion data and generating a real-time output image corresponding to that result; and a screen output unit arranged horizontally on the floor surface and displaying the real-time output image, wherein the task corresponds to a cooking operation performed on the screen, with the screen output unit serving as the task-performing space, and the output image shows the result of that cooking operation.
According to the present invention, a patient can perform rehabilitation training through tasks needed in daily life. In other words, allowing patients to practice cooking tasks that occur frequently in everyday life, under conditions similar to real situations, can speed up their readaptation to society.

Description

SYSTEM FOR MEASURING THE ABILITY OF ORDINARY PERFORMANCE

The present invention relates to a task performance measurement system and, more particularly, to a system that provides simulated rehabilitation training in tasks needed for real life and measures the results of that training.

With recent industrialization, rehabilitation training has become increasingly common, as industrial accidents, and the disabilities they cause, have risen sharply.

Conventionally, however, rehabilitation exercises consisted of generic actions such as simply lifting an object and moving it to the other side. As a result, patients had difficulty readapting to society, because after training they still could not perform the activities required in daily life.

In recent years, vocational rehabilitation has moved beyond simple physical treatment, aiming to prepare people with disabilities for future employment.

Although vocational rehabilitation requires active social support and considerable effort, the lack of a professional knowledge base and of research activity in the field has left occupational and social rehabilitation, which matters most in the lives of people with disabilities, in a passive state that merely adapts to present reality. As a result, many people with disabilities cannot find employment and remain jobless.

The present invention provides a task performance measurement system that enables a user, such as a rehabilitation patient, to carry out tasks needed in real life, such as cooking, as part of rehabilitation training, and that measures the results of that training.

According to one aspect of the present invention, there is provided a system for measuring task performance, comprising: a sensing unit for sensing motion data of a specific task performed by a user; a calculation unit for calculating a real-time performance result of the specific task based on the sensed motion data and generating a real-time output image corresponding to that result; and a screen output unit arranged horizontally on the floor surface and displaying the real-time output image, wherein the task corresponds to a cooking operation performed on the screen, with the screen output unit serving as the task-performing space, and the output image is a result image produced by the cooking operation.

The task may be at least one of a whisking (bubble-making) operation, a dough-expanding operation, a slicing operation, a flipping operation, a pouring operation, a sprinkling operation, a valve-adjusting operation, and a frying operation performed by the user.

The sensing unit may include a first inertial sensor to measure, as the motion data, at least one of a degree of tilt, a linear acceleration, a linear velocity, a rotational acceleration, a rotational speed, a number of rotations, and a degree of twist.

The sensing unit and the screen output unit may be included in a separate sensing device and screen output device, respectively; the screen output unit may include a pressure sensor for measuring the pressure that the user's task applies to the screen, and the calculation unit may generate a real-time output image reflecting the pressure intensity recognized by the screen output unit.

In addition, the sensing unit and the screen output unit may be included in a separate sensing device and screen output device, respectively, and the sensing device may include a weight whose height can be adjusted to set the degree of difficulty, so that the rotational acceleration and number of rotations of the performed task are measured at the set difficulty.

The screen output unit may measure a work point at which the spatial position of the sensing unit performing the task is vertically projected onto the screen, and the output image may be generated with the performance result applied at that work point.

In addition, the screen output unit may correspond to a touch screen and recognize touch input produced by the task.

The screen output unit may display a target value for the task on the screen, and the calculation unit may calculate the accuracy of the touch input relative to that target value.

When the task is a slicing operation, the sensing device may correspond to a knife-shaped device capable of contacting the screen; the calculation unit may calculate a performance result reflecting the task completion time or the accuracy, and may adjust the degree of difficulty by adjusting the spacing between a plurality of the target values.

When the task is a pouring operation of a liquid flow, the degree of pouring may be calculated from the tilt sensed by the sensing unit, and an image of the liquid spreading on the screen may be displayed.

Further, when the task is a flipping operation, the system may further include a manipulation object containing a second inertial sensor, and the second inertial sensor may recognize the placement state of the manipulation object on the screen output unit.

The screen output unit may include a plurality of task-performing spaces for presenting tasks, and the calculation unit may generate an output image that presents the plurality of tasks in an arbitrary order and determine, from the motion data, whether the task corresponding to the output image has been performed.

As described above, the present invention provides the following effects.

First, the patient can carry out rehabilitation training through tasks needed in daily life. In other words, allowing patients to practice cooking tasks that occur frequently in everyday life, under conditions similar to real situations, can speed up their readaptation to society.

Second, patients can take an interest in rehabilitation by performing activities drawn from daily life, instead of tedious exercises such as moving cups. Since motivation is essential for adapting to daily life, this also increases motivation for rehabilitation training.

Third, the system can train a variety of cooking tasks at once.

Fourth, the performance measurement system according to an embodiment of the present invention can evaluate the patient's achievement rather than merely presenting rehabilitation exercises. In addition, the difficulty level can be adjusted to how well the patient performs, providing training at a level suited to the patient.

FIG. 1 is an internal configuration diagram of a performance measuring system according to an embodiment of the present invention.
FIG. 2 is an exemplary view of performing a pouring operation of a liquid flow according to an embodiment of the present invention.
FIG. 3 is an exemplary view of performing a whisking (bubble-making) operation according to an embodiment of the present invention.
FIG. 4A is an exemplary view showing the state before a dough-expanding operation is performed according to an embodiment of the present invention.
FIG. 4B is an exemplary view showing the state after a dough-expanding operation is performed according to an embodiment of the present invention.
FIG. 5 is an exemplary view of performing a flipping operation on a manipulation object according to an embodiment of the present invention.
FIG. 6 is an exemplary diagram illustrating a slicing operation according to an embodiment of the present invention.
FIG. 7 is an exemplary diagram illustrating a fire-extinguishing (valve-turning) operation according to an embodiment of the present invention.
FIG. 8 is an exemplary diagram of recognizing the position of the sensing unit on the screen output unit and reflecting it in the output image, according to an embodiment of the present invention.
FIG. 9 is an exemplary diagram of recognizing the position of the sensing unit through a vision sensor according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described below in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Unless defined otherwise, all terms (including technical and scientific terms) used herein have the meaning commonly understood by one of ordinary skill in the art to which this invention belongs. Commonly used, predefined terms are not to be interpreted ideally or excessively unless explicitly defined otherwise.

The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In this specification, the singular includes the plural unless stated otherwise. The terms "comprises" and/or "comprising", as used in the specification, do not exclude the presence or addition of one or more elements other than those stated.


FIGS. 1 to 9 show a system 10, a calculation unit 100, a sensing unit 200, a rotation weight 210, a screen output unit 300, a target value 310, a guideline 320, a task-performing space 330, a vision sensor 340, and a manipulation object 400.

Hereinafter, a work performance measuring system according to embodiments of the present invention will be described with reference to the drawings.

FIG. 1 is an internal configuration diagram of a work performance measuring system 10 according to an embodiment of the present invention.

Referring to FIG. 1, a work performance measuring system 10 according to an embodiment of the present invention includes a sensing unit 200, a calculation unit 100, and a screen output unit 300.

The sensing unit 200 measures motion data for a specific task performed by the user. The sensing unit 200 may sense the user's task-performing motion either directly or indirectly. Direct sensing may include sensing the movement of a user holding the sensing unit 200, or sensing the user pushing or turning an operable component. Indirect sensing may include recognizing the motion of a user at a distance, for example through a camera. Direct and indirect sensing are not limited to these examples, however, and may include various ways of sensing a user's movement to acquire motion data.

The task may correspond to a cooking operation performed on the screen, with the screen output unit 300 serving as the task-performing space 330. That is, the work performance measuring system 10 lets rehabilitation patients perform cooking tasks that are needed for adapting to real life and that can be useful in finding a job, measures the results of the performance, and displays a corresponding result screen. The task may include the user's whisking (bubble-making), dough-expanding, slicing, flipping, pouring, sprinkling, valve-adjusting, and frying operations, among others.

The calculation unit 100 performs overall control and the information processing required by the system 10. That is, the calculation unit 100 may calculate a real-time performance result of the specific task based on the sensed motion data and generate a real-time output image corresponding to that result. In addition, the calculation unit 100 can determine the accuracy of the task performed by the user and calculate the user's task-performing ability based on that accuracy.

The screen output unit 300 is arranged horizontally on the floor surface and displays the real-time output image. That is, the screen output unit 300 may be laid flat on the floor and display an image corresponding to the result of the specific task the user performs on it. In addition, the screen output unit 300 can display the task-performing ability calculated by the calculation unit 100 on one side of the screen, in numeric or graphic form.

The output image may show the result of performing the cooking operation. For example, as shown in FIG. 2, when the task performed by the user is a pouring operation of a liquid flow, the amount of liquid poured can be calculated from the degree of tilt measured by the sensing unit 200, and the resulting degree of spread can be generated and displayed as the output image. More specifically, based on the motion data measured in real time by the sensing unit 200 (i.e., the degree of tilt), the calculation unit 100 can calculate how far the liquid corresponding to that tilt value spreads across the floor surface and display it in real time as the output image.
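
As a concrete illustration of this calculation, the following minimal sketch derives a pour rate from the sensed tilt and accumulates it into a spread radius; all names, thresholds, and the flat-disc model of the spilled liquid are hypothetical, not taken from the patent:

```python
import math

POUR_THRESHOLD_DEG = 30.0   # hypothetical: no liquid pours below this tilt
MAX_TILT_DEG = 120.0        # hypothetical: tilt at which the pour rate peaks

def pour_rate(tilt_deg: float) -> float:
    """Return liquid volume per tick (ml) for a given tilt angle."""
    if tilt_deg < POUR_THRESHOLD_DEG:
        return 0.0  # below the threshold, nothing pours (as with a real cup)
    # Scale linearly between the threshold and the maximum tilt.
    frac = min((tilt_deg - POUR_THRESHOLD_DEG) / (MAX_TILT_DEG - POUR_THRESHOLD_DEG), 1.0)
    return 5.0 * frac  # hypothetical peak rate of 5 ml per tick

def spread_radius(total_volume_ml: float, thickness_mm: float = 2.0) -> float:
    """Model the poured liquid as a flat disc of fixed thickness; return its radius in mm."""
    # volume (mm^3) = pi * r^2 * thickness  =>  r = sqrt(V / (pi * t))
    volume_mm3 = total_volume_ml * 1000.0
    return math.sqrt(volume_mm3 / (math.pi * thickness_mm))

# Per-frame loop: accumulate poured volume from real-time tilt samples.
total = 0.0
for tilt in [10.0, 35.0, 60.0, 90.0]:  # example tilt readings (degrees)
    total += pour_rate(tilt)
print(f"poured {total:.1f} ml -> spread radius {spread_radius(total):.1f} mm")
```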

The screen output unit 300 may also be implemented as a touch screen, in which case it can recognize touch input produced by the task. Touch input can come from the sensing unit 200 as well as from the user's body. For example, as shown in FIG. 6, during a slicing operation the touch screen (the screen output unit 300) recognizes the touch of the knife-shaped sensing unit 200, so that it can be measured whether the slicing is performed properly on the screen. For this purpose, the portion of the sensing unit 200 that contacts the screen output unit 300 may be made of a conductive material.

In addition, the screen output unit 300 may display a guideline 320 or target value 310 for the task on the screen. The guideline 320 and target value 310 correspond to the goal the user must reach to accomplish the task. For example, as shown in FIG. 2, for a liquid-pouring task the screen output unit 300 may display a circular guideline 320 corresponding to the circle the spreading liquid should fill. The user then performs the pouring operation so as to align the resulting figure exactly with the displayed guideline 320. Likewise, as shown in FIG. 6, the screen output unit 300 may display a straight guideline 320 for a slicing operation performed with the knife-shaped sensing unit 200.

When the screen output unit 300 displays the guideline 320 or target value 310 on the screen, the calculation unit 100 can calculate the accuracy of the touch input relative to it. For example, as shown in FIG. 6, the accuracy of the task can be measured by accumulating the deviation of the action the user performs along the guideline shown on the screen output unit 300. The accuracy of a slicing operation may be calculated by having the calculation unit 100 measure the area between the line along which the user sliced with the sensing unit 200 and the straight line of the guideline 320, or by accumulating the error between the user's line and the guideline at fixed sampling intervals.
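
Either accuracy method described above reduces to a short computation. The sketch below (the sampling interval and scoring scale are hypothetical, not from the patent) accumulates the error between the user's slicing path and a straight guideline sampled at fixed intervals:

```python
def slicing_accuracy(user_path, guideline_y, interval=5):
    """Accumulate the vertical error between a user's slicing path and a
    horizontal guideline y = guideline_y, sampled every `interval` points.

    user_path: list of (x, y) touch points recorded along the slice.
    Returns a score in [0, 1], where 1 means the path matched the line exactly.
    """
    samples = user_path[::interval]
    if not samples:
        return 0.0
    total_error = sum(abs(y - guideline_y) for _, y in samples)
    mean_error = total_error / len(samples)
    # Map mean error (pixels) to a score; 50 px off on average scores 0 (hypothetical scale).
    return max(0.0, 1.0 - mean_error / 50.0)

path = [(x, 100 + (x % 7) - 3) for x in range(0, 200)]  # wobbly slice near y=100
print(f"accuracy: {slicing_accuracy(path, guideline_y=100):.2f}")
```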

In addition, the screen output unit 300 may further include a pressure sensor. When the user's task applies force (pressure) to the screen, the pressure sensor can recognize how hard the user presses. Depending on the task being performed, the screen output unit 300 may need to measure whether a pressure above a specific intensity is applied to the floor surface. For example, as shown in FIG. 6, when the knife-shaped sensing unit 200 performs a slicing operation on the screen output unit 300, it can be determined whether the pressure applied to the screen is at or above a specific intensity, just as cutting a particular ingredient requires pressing with at least a certain force. If the touch screen itself can recognize the intensity of a touch, the pressure applied to the screen output unit 300 can be measured by the touch screen without a separate pressure sensor.

In addition, the screen output unit 300 may include a plurality of task-performing spaces 330 for presenting tasks. The screen output unit 300 may display an image corresponding to a different task in each of the task-performing spaces 330 and request that the user perform the tasks in an arbitrary order. The calculation unit 100 generates the output image presenting the tasks in that order, transmits it to the screen output unit 300 for display, receives the motion data from the sensing unit 200, and determines whether the user performs the action corresponding to the requested task. As described later, the screen output unit 300 can also recognize the work point of the sensing device, measure whether the action is performed at the point corresponding to the task-performing space 330 where the task was requested, and reflect this in the generation of the output image.

The sensing unit 200 and the screen output unit 300 may be included in one device, or in a separate sensing device and screen output device, respectively. When they are included in one device, the sensing unit 200 may measure the user's motion at a distance, as with the vision sensor 340, or recognize the operation of buttons or switches provided on the device. When they are in separate devices, the sensing device measures the motion data corresponding to the user's actions and transmits it to the screen output device. The calculation unit 100 may be included in either the sensing device or the screen output device, or in some cases configured as a separate device. For example, when the calculation unit 100 resides in the screen output device together with the screen output unit 300, the sensing unit 200 in the sensing device measures the user's motion data and transmits it over a wired link (e.g., a cable between the devices) or a wireless link (e.g., an indoor network such as a local area network or Wi-Fi). The screen output device then analyzes the motion data, calculates the performance result, and generates a corresponding real-time output image to display on the screen.
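
The division of labor described here (the sensing device measures, the screen output device computes and renders) implies a simple message flow. Below is a minimal sketch of the sender side, assuming a JSON-over-UDP link on a local network; the patent does not specify a transport or data format, and the address and field names are hypothetical:

```python
import json
import socket
import time

SCREEN_DEVICE_ADDR = ("192.168.0.10", 9000)  # hypothetical screen output device address

def send_motion_sample(sock, tilt_deg, lin_acc, rot_speed):
    """Serialize one motion-data sample and send it to the screen output device."""
    sample = {
        "t": time.time(),         # timestamp, so rendering stays real-time
        "tilt_deg": tilt_deg,     # degree of tilt
        "lin_acc": lin_acc,       # linear acceleration (m/s^2)
        "rot_speed": rot_speed,   # rotational speed (deg/s)
    }
    sock.sendto(json.dumps(sample).encode(), SCREEN_DEVICE_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_motion_sample(sock, tilt_deg=42.0, lin_acc=[0.1, 0.0, 9.8], rot_speed=15.0)
```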

The sensing unit 200 can measure, as motion data, the degree of tilt, linear acceleration, linear velocity, rotational acceleration, rotational speed, number of rotations, degree of twist, and the like. For example, when the sensing unit 200 is implemented as a separate sensing device, all or part of this motion data can be acquired from the device's movement. When the user performs a task after a specific task has been selected, the sensing unit 200 may measure only the motion data needed to calculate the result of that task.

The sensing unit 200 may include a first inertial sensor (IMU sensor) to acquire the motion data. The IMU sensor may be a MEMS (Micro-Electro-Mechanical Systems) based 9-axis IMU sensor, comprising a 3-axis acceleration sensor, a 3-axis gyroscope, and a 3-axis geomagnetic sensor. The 3-axis acceleration sensor measures linear inertia (acceleration) along the x-, y-, and z-axes; the 3-axis gyroscope measures rotational inertia (angular velocity) about the x-, y-, and z-axes; and the 3-axis geomagnetic sensor measures azimuth (the direction of the geomagnetic field) relative to the x-, y-, and z-axes. In another embodiment, the sensing unit 200 may include at least one of an acceleration sensor and a geomagnetic sensor instead of the IMU sensor.
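
By way of illustration, the tilt used throughout this description can be derived from such a sensor with a standard complementary filter, sketched below; the patent does not prescribe a fusion method, so the filter and its coefficient are assumptions:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate tilt about the x-axis (degrees) from the gravity direction alone."""
    return math.degrees(math.atan2(ay, az))

def complementary_filter(prev_tilt, gyro_rate, ax, ay, az, dt, alpha=0.98):
    """Fuse gyro integration (smooth, but drifts) with the accelerometer
    estimate (noisy, but absolute) into one tilt angle.

    prev_tilt: previous tilt estimate (degrees)
    gyro_rate: angular velocity about x (deg/s)
    dt:        sample period (s)
    """
    gyro_tilt = prev_tilt + gyro_rate * dt    # integrate the rotation rate
    accel_tilt = tilt_from_accel(ax, ay, az)  # absolute but noisy reference
    return alpha * gyro_tilt + (1 - alpha) * accel_tilt

tilt = 0.0
for _ in range(100):  # e.g. 1 s of samples at 100 Hz while the cup is tipped
    tilt = complementary_filter(tilt, gyro_rate=45.0, ax=0.0, ay=4.9, az=8.5, dt=0.01)
print(f"estimated tilt: {tilt:.1f} deg")
```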

In addition, the work performance measuring system 10 according to an embodiment of the present invention may further include a manipulation object 400. The manipulation object 400 may correspond to an object, containing the second inertial sensor, that is manipulated by means of the sensing unit 200 on the screen output unit 300. For example, as shown in FIG. 5, during a flipping operation the object placed on the screen output unit 300 (for example, a flat, pancake-like object) corresponds to the manipulation object 400. The user performs the flipping operation on the manipulation object 400 using a sensing device that includes the sensing unit 200 (for example, the turner-shaped device in FIG. 5), and the second inertial sensor measures the resulting placement state and transmits it to the calculation unit 100.

Likewise, when the task is a frying (roasting) operation, the placement state of a manipulation object 400 of a specific shape can be measured and transmitted to the calculation unit 100. The calculation unit 100 can then evaluate the user's performance based on how the placement state of the manipulation object 400 changes as the user manipulates it with the sensing device.

The second inertial sensor may likewise correspond to an IMU sensor, for example a MEMS-based 9-axis IMU sensor of the kind described above for the first inertial sensor.

Hereinafter, the process of measuring task-performing ability for various specific cooking operations according to an embodiment of the present invention, and the configurations added for each operation, will be described.

FIG. 2 is an exemplary view of performing a pouring operation of a liquid flow according to an embodiment of the present invention.

Referring to FIG. 2, according to an embodiment of the present invention, when the task performed by the user is a pouring operation of a liquid flow, the sensing unit 200 can measure the real-time tilt of the sensing device. The calculation unit 100 may calculate the amount of liquid poured from the measured tilt and generate the corresponding degree of spread as the output image. For example, when the user lifts and tilts the sensing device as if pouring liquid from a cup, the amount poured at each tilt angle is set to match the real situation, and no liquid is poured while the device remains below a certain tilt. When the user tilts the device further, the calculation unit 100 generates an output image reflecting the sensed tilt in real time, and the screen output unit 300 then displays the resulting spread of the liquid. As described later, motion data other than tilt, such as the position on the screen output unit 300 at which the task is performed, can also be reflected when measuring performance of the pouring task.

In addition, the calculation unit 100 may include in the output image a figure corresponding to the guideline 320 or target value 310 that the poured liquid area should reach or align with. The user tries to align the liquid area produced by the task with the guideline 320, and the calculation unit 100 can evaluate the user's task-performing ability from the execution time and from the difference between the resulting figure and the guideline 320. For example, when a circular liquid region spreads from the center of the circular target shape corresponding to the guideline 320, the radius of the guideline circle can be compared with the radius of the circle formed by the liquid area.

FIG. 3 is an exemplary view of performing a whisking (bubble-making) operation according to an embodiment of the present invention.

Referring to FIG. 3, the performance measuring system 10 according to an embodiment of the present invention can measure a hand-shaking motion such as a whisking (bubble-making) operation. The sensing device may include a weight 210 whose height can be adjusted to set the degree of difficulty. The sensing device can thus measure the rotational acceleration and number of rotations the patient achieves at the difficulty set by the height of the weight 210.

The calculation unit 100 may generate a real-time bubble image by applying the degree of bubble generation corresponding to the measured rotational acceleration and number of rotations. The screen output unit 300 receives and displays the generated image so that the user can see the result of their performance.
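
A minimal sketch of how the measured rotation count and rotational acceleration might map to a bubble-generation level for the output image; the weighting factors and the difficulty scaling are hypothetical, not from the patent:

```python
def bubble_level(rotation_count: int, rot_accel_deg_s2: float, difficulty: float = 1.0) -> float:
    """Map whisking effort to a bubble level in [0, 1].

    difficulty > 1 (e.g. with the weight raised) demands more effort
    for the same amount of on-screen foam.
    """
    effort = rotation_count * 0.02 + rot_accel_deg_s2 * 0.001  # hypothetical weights
    return min(effort / difficulty, 1.0)

# Same motion, two difficulty settings (weight lowered vs. weight raised).
print(bubble_level(rotation_count=30, rot_accel_deg_s2=200.0, difficulty=1.0))  # ~0.8
print(bubble_level(rotation_count=30, rot_accel_deg_s2=200.0, difficulty=2.0))  # ~0.4
```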

FIG. 4A is an exemplary view showing the state before a dough-expanding operation is performed according to an embodiment of the present invention. FIG. 4B is an exemplary view showing the state after a dough-expanding operation is performed according to an embodiment of the present invention.

In the dough-expanding operation, the first inertial sensor of the sensing unit 200 can recognize the degree of rolling, including the rotational acceleration, rotational speed, and number of rotations. The calculation unit 100 may expand the dough figure displayed on the screen based on the degree of rolling reported by the sensing unit 200. That is, before the sensing device containing the sensing unit 200 passes over the area corresponding to the dough, the dough figure is small (see FIG. 4A); as the rolling motion of the dough-expanding operation is applied (that is, as the sensing device is rolled over the dough area on the screen output unit 300), the dough figure expands (see FIG. 4B).

To measure the dough-expanding operation, the work performance measuring system 10 may further include a pressure sensor on the screen output unit 300. The pressure sensor measures the pressure applied to the screen output unit 300 while the dough is being rolled. That is, the pressure applied as the sensing unit 200 moves over the screen can be fed into the output image generation of the calculation unit 100, so that the dough spreads more or less depending on the strength of the applied force, as it would in a real situation.
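
The interplay of rolling passes and pressure might be modeled as in the following sketch (all coefficients and units are hypothetical, not taken from the patent): each pass of the rolling device expands the dough figure by an amount scaled by the pressure measured during that pass.

```python
def expand_dough(radius_mm: float, passes, base_growth_mm: float = 4.0,
                 ref_pressure: float = 10.0) -> float:
    """Grow the on-screen dough radius once per roll pass.

    passes: list of mean pressures (hypothetical units) measured by the
    pressure sensor during each pass of the rolling device over the dough area.
    Harder presses expand the dough more, as with real dough.
    """
    for pressure in passes:
        radius_mm += base_growth_mm * (pressure / ref_pressure)
    return radius_mm

# Light passes vs. firm passes over the same starting dough figure.
print(expand_dough(30.0, passes=[5.0, 5.0, 5.0]))     # 36.0 mm
print(expand_dough(30.0, passes=[15.0, 15.0, 15.0]))  # 48.0 mm
```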

FIG. 6 is an exemplary diagram illustrating a slicing operation according to an embodiment of the present invention.

Referring to FIG. 6, according to an embodiment of the present invention, task-performing ability can be measured when the user performs a slicing operation. The sensing device including the sensing unit 200 may correspond to a knife-shaped device capable of contacting the screen. The calculation unit 100 can generate the guideline 320 for the slicing operation in the output image and measure accuracy by tracking the path of the slice the user performs.

In addition, the calculation unit 100 may adjust the difficulty level by adjusting the spacing between the plurality of guidelines 320 in light of the task completion time or accuracy. For example, when the accuracy of the slicing the user performs along the guideline exceeds a set standard, subsequent guidelines may be placed closer together to make the task harder. The calculation unit 100 may also impose a time limit on the task, or change the shape of the slicing guideline to a more difficult form.
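
One way to realize this adaptive spacing is sketched below (the thresholds and bounds are hypothetical): after each slice, the guideline spacing shrinks when accuracy is high and grows when it is low, within fixed limits.

```python
def next_guideline_spacing(spacing_mm: float, accuracy: float,
                           min_mm: float = 10.0, max_mm: float = 60.0) -> float:
    """Adapt the spacing between successive guidelines to the user's accuracy.

    accuracy: score in [0, 1] from the last slicing operation.
    High accuracy -> tighter lines (harder); low accuracy -> wider lines (easier).
    """
    if accuracy > 0.8:
        spacing_mm *= 0.8   # raise the difficulty
    elif accuracy < 0.5:
        spacing_mm *= 1.25  # lower the difficulty
    return max(min_mm, min(max_mm, spacing_mm))

spacing = 40.0
for acc in [0.9, 0.9, 0.4]:  # two good slices, then a poor one
    spacing = next_guideline_spacing(spacing, acc)
    print(f"accuracy {acc:.1f} -> next spacing {spacing:.1f} mm")
```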

FIG. 7 is an exemplary diagram illustrating a fire-extinguishing (valve-turning) operation according to an embodiment of the present invention.

Referring to FIG. 7, according to an embodiment of the present invention, task-performing ability can be measured when the user's task is a fire-extinguishing (valve-turning) operation. The screen output unit 300 may display an output image containing a plurality of task-performing spaces 330, and the calculation unit 100 may request a task in a specific task-performing space 330 on the screen output unit 300. The user then operates the sensing unit 200 corresponding to the requested task-performing space 330. In this embodiment, the sensing unit 200 may correspond to a plurality of valves: by having the user rotate the valve corresponding to the specific task-performing space 330 that the screen output unit 300 lights up, the system can test the user's reaction speed to a sudden situation, among other abilities.
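
The reaction test described here reduces to timing the interval between lighting a work space and the matching valve rotation. A minimal sketch follows; the event handling is hypothetical, and the callable standing in for the valve sensor is an illustration only:

```python
import random
import time

NUM_VALVES = 4  # hypothetical number of valves / task-performing spaces

def run_valve_trial(wait_for_rotation):
    """Light one random work space and time the user's response.

    wait_for_rotation: callable that blocks until a valve is rotated and
    returns the index of that valve (stands in for the sensing unit).
    Returns (correct, reaction_time_s).
    """
    target = random.randrange(NUM_VALVES)
    print(f"work space {target} is lit")  # the screen output unit lights the space
    start = time.monotonic()
    rotated = wait_for_rotation()
    return rotated == target, time.monotonic() - start

# Simulated user who always turns valve 2 after 0.7 s.
def fake_user():
    time.sleep(0.7)
    return 2

correct, rt = run_valve_trial(fake_user)
print(f"correct: {correct}, reaction time: {rt:.2f} s")
```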

FIG. 8 is an exemplary diagram of recognizing the position of the sensing unit 200 on the screen output unit 300 and reflecting it in the output image, according to an embodiment of the present invention. FIG. 9 is an exemplary diagram of recognizing the position of the sensing unit 200 through the vision sensor 340 according to an embodiment of the present invention.

Referring to FIG. 8, the work performance measuring system 10 according to an embodiment of the present invention can measure the position on the screen output unit 300 at which the user performs the task and use it in generating the output image. The screen output unit 300 may measure the work point at which the position of the sensing unit 200 is vertically projected onto the screen. As shown in FIG. 8, when the user's hand moves from a previous work point (the hand drawn with a dotted line) to another work point (the hand drawn with a solid line), the screen output unit 300 tracks the position of the user's hand and the calculation unit 100 reflects it in the displayed output image. That is, as the hand moves, the center from which the liquid spreads on the screen output unit 300 also moves. In this way, it is possible to determine whether the user can hold a fixed position while performing the task.

In one embodiment, the screen output unit 300 may include a vision sensor 340, such as a camera, as shown in FIG. 9. The vision sensor 340 acquires an image of the task being performed on the screen output unit 300, and the calculation unit 100 receives the image and recognizes, through image analysis, the work point at which the task is performed. The calculation unit 100 can then generate an output image in which the performance result (i.e., the image generated from the motion data measured by the sensing unit 200) is applied at the work point measured in real time. The method of recognizing the work point on the screen output unit 300 is not limited to the vision sensor 340, however; for example, the position of an identification mark attached to the sensing unit 200 may be sensed and the data transmitted to the screen output device.
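
Whatever the recognition method, the result is a mapping from the device's spatial position to a work point in screen coordinates. A minimal sketch, assuming positions are already expressed in a frame aligned with the screen (a simplification; a real system would calibrate the camera, e.g. with a homography):

```python
def to_work_point(device_xyz, screen_w_px, screen_h_px, area_w_mm, area_h_mm):
    """Vertically project the sensing device's spatial position onto the screen.

    device_xyz: (x, y, z) position in mm over the screen plane; z (the height)
    is simply dropped, which is what a vertical projection amounts to.
    Returns pixel coordinates on the screen output unit.
    """
    x_mm, y_mm, _z_mm = device_xyz
    px = int(x_mm / area_w_mm * screen_w_px)
    py = int(y_mm / area_h_mm * screen_h_px)
    # Clamp so a hand just outside the tracked area still maps onto the screen.
    return min(max(px, 0), screen_w_px - 1), min(max(py, 0), screen_h_px - 1)

# A cup held 300 mm above the centre of a 600x400 mm work area on a 1920x1080 screen.
print(to_work_point((300.0, 200.0, 300.0), 1920, 1080, 600.0, 400.0))  # -> (960, 540)
```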


While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, it will be understood that the invention is not limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements. The above-described embodiments are therefore to be understood as illustrative in all respects and not restrictive.

10: system 100: calculation unit
200: sensing unit 210: weight
300: screen output unit 310: target value
320: guideline 330: task-performing space
340: vision sensor 400: manipulation object

Claims (12)

A task performance measurement system, comprising:
a sensing unit for sensing motion data of a specific task performed by a user;
a calculation unit for calculating a real-time performance result of the specific task based on the sensed motion data and generating a real-time output image corresponding to the task; and
a screen output unit arranged horizontally on the floor surface and displaying the real-time output image,
wherein the sensing unit includes:
a position sensing unit for acquiring first motion data corresponding to the position of the hand of the user performing the specific task on the screen output unit; and
a motion sensing unit for acquiring second motion data including at least one of a hand motion type, a linear acceleration, a linear velocity, a rotational acceleration, a rotational speed, and a tilt,
wherein the sensing unit, the calculation unit, and the screen output unit are included in a separate sensing device and screen output device,
wherein the sensing device includes the motion sensing unit and is operated while held in the user's hand,
wherein the screen output device includes the calculation unit, the position sensing unit, and the screen output unit, and displays an output image corresponding to the second motion data at a work point corresponding to the first motion data, and
wherein the first motion data is the point at which the spatial position of the sensing device is vertically projected onto the screen output unit.
The system according to claim 1,
wherein the task corresponds to a cooking operation performed on the screen, with the screen output unit serving as the task-performing space, and
wherein the output image is a result image produced by the cooking operation.
The system according to claim 1,
wherein the task corresponds to a pouring operation of a liquid flow,
wherein the calculation unit
calculates the degree of pouring of the liquid flow based on the tilt sensed by the motion sensing unit and displays a liquid image on the screen, and
wherein the liquid image is an image of the liquid flow spreading as it is poured.
The system of claim 3,
wherein the screen output unit
displays a guideline corresponding to the shape of the liquid flow, and
wherein the guideline is the target value of the task.
The system of claim 3,
wherein the calculation unit
recognizes movement from a first work point to a second work point through the second motion data, and
generates the liquid image by spreading the liquid flow about the second work point,
wherein the second work point is the point to which the working position of the user performing the specific task on the screen output unit has moved from the first work point.
The system of claim 4,
wherein the calculation unit
compares the liquid image with the guideline to calculate a task accuracy, and
evaluates the user's task-performing ability based on the task execution time and the task accuracy.
The system according to claim 6,
wherein the calculation unit
calculates the area between the guideline and the liquid image and determines the task accuracy to be lower as that area grows larger.
The system according to claim 1,
wherein the screen output device has the screen output unit arranged horizontally on the floor, and
wherein the position sensing unit is a vision sensor that recognizes the sensing device performing the task in the space above the screen output device.
The system according to claim 1,
wherein the motion sensing unit
includes a first inertial sensor to measure, as the second motion data, at least one of a degree of tilt, a linear acceleration, a linear velocity, a rotational acceleration, a rotational speed, a number of rotations, and a degree of twist.
Claims 10 to 12: (deleted)
KR1020150051607A 2015-04-13 2015-04-13 System for measuring the ability of ordinary performance KR101838939B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150051607A KR101838939B1 (en) 2015-04-13 2015-04-13 System for measuring the ability of ordinary performance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150051607A KR101838939B1 (en) 2015-04-13 2015-04-13 System for measuring the ability of ordinary performance

Publications (2)

Publication Number Publication Date
KR20160121858A KR20160121858A (en) 2016-10-21
KR101838939B1 2018-03-15

Family

ID=57257154

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150051607A KR101838939B1 (en) 2015-04-13 2015-04-13 System for measuring the ability of ordinary performance

Country Status (1)

Country Link
KR (1) KR101838939B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102418994B1 (en) * 2019-12-31 2022-07-11 주식회사 버넥트 Method for providng work guide based augmented reality and evaluating work proficiency according to the work guide


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009000156A (en) * 2007-06-19 2009-01-08 Cooking Mama Ltd Cooking game program

Also Published As

Publication number Publication date
KR20160121858A (en) 2016-10-21


Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant