KR101838939B1 - System for measuring the ability of ordinary performance - Google Patents
- Publication number
- KR101838939B1 (application KR1020150051607A)
- Authority
- KR
- South Korea
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/22—Social work
Abstract
The present invention relates to a task performance measurement system.
According to one aspect of the present invention, there is provided a system for measuring task performance, comprising: a sensing unit for sensing motion data for a specific task performed by a user; a calculation unit for calculating a real-time performance result of the specific task based on the sensed motion data and generating a real-time output image corresponding to the task; and a screen output unit arranged horizontally on the floor surface for displaying the real-time output image. The operation corresponds to a cooking operation performed on the screen, with the screen output unit serving as the task execution space, and the output image is a result image of performing the cooking operation.
According to the present invention, a patient can perform rehabilitation training through tasks necessary for daily life. In other words, the speed of social adaptation can be improved by allowing patients to perform cooking tasks that occur frequently in daily life, in a manner similar to actual situations.
Description
[0001] The present invention relates to a performance measuring system, and more particularly, to a system for providing a simulated rehabilitation training required for real life for rehabilitation of a patient and measuring the results of rehabilitation training.
With recent industrialization, rehabilitation has become increasingly important as industrial accidents, and the number of people disabled by them, rise rapidly.
However, in the past, rehabilitation exercises consisted of generic actions such as simply lifting objects and moving them to the other side. As a result, patients had difficulty adapting quickly to society, because even after rehabilitation training they could not perform the activities needed in daily life.
In recent years, vocational rehabilitation has been carried out with the aim of enabling people with disabilities to hold future jobs, going beyond simple physical rehabilitation.
Although rehabilitation through occupation requires active support from society and considerable effort, it remains passive in practice: because of the lack of a professional knowledge base and of research activity in vocational rehabilitation, rehabilitation through an occupation, which occupies the most important part of the life of a person with a disability, seldom goes beyond mere adaptation to reality. As a result, many people with disabilities cannot find work and remain unemployed.
The present invention provides a system for measuring task performance that enables a user, such as a rehabilitation patient, to perform during rehabilitation training the tasks necessary for real life, such as cooking, and that measures the results of that training.
According to one aspect of the present invention, there is provided a system for measuring task performance, comprising: a sensing unit for sensing motion data for a specific task performed by a user; a calculation unit for calculating a real-time performance result of the specific task based on the sensed motion data and generating a real-time output image corresponding to the task; and a screen output unit arranged horizontally on the floor surface for displaying the real-time output image. The operation corresponds to a cooking operation performed on the screen, with the screen output unit serving as the work performing space, and the output image is a result image obtained by the cooking operation.
The operation may be at least one of a bubbling operation, a dough expanding operation, a slicing operation, a flipping operation, a pouring operation, a spraying operation, a valve adjusting operation, and a frying operation of the user.
The sensing unit may include a first inertia sensor to measure, as the motion data, at least one of tilt degree, linear acceleration, linear velocity, rotation acceleration, rotation speed, number of rotations, and degree of twist.
The sensing unit and the screen output unit may be included in a separate sensing device and screen output device, respectively; the screen output unit may include a pressure sensor for measuring the pressure applied on the screen by the operation performed by the user, and the calculation unit may generate a real-time output image reflecting the intensity of the pressure recognized by the screen output unit.
In addition, the sensing unit and the screen output unit may be included in a separate sensing device and screen output device, respectively; the sensing device may include a weight whose height can be adjusted to change the degree of difficulty, and the rotation acceleration and the number of rotations of the performed operation can be measured.
The screen output unit may measure a work point at which the spatial position of the sensing unit performing the operation is vertically projected onto the screen output unit, and the output image may be generated by applying the work point.
In addition, the screen output unit may correspond to a touch screen and recognize the touch operation performed by the user.
The screen output unit may display the target value of the job on the screen, and the calculation unit may calculate the accuracy of the touch operation with respect to the target value.
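As an illustrative sketch of how such an accuracy score for a touch against a displayed target value might be computed (the linear fall-off model and the `tolerance` constant are assumptions, not taken from the specification):

```python
def touch_accuracy(touch, target, tolerance=50.0):
    """Score a touch in [0, 1]: 1.0 exactly on the target,
    falling linearly to 0.0 at `tolerance` pixels away.
    The linear model and tolerance are illustrative choices."""
    dx = touch[0] - target[0]
    dy = touch[1] - target[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return max(0.0, 1.0 - dist / tolerance)
```

A calculation unit could average this score over all touches in a task to obtain the overall task accuracy.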
In the case where the operation is a slicing operation, the sensing device corresponds to a knife-shaped device capable of the contact operation; the calculation unit may calculate the operation result reflecting the operation accomplishment time or the accuracy, and the degree of difficulty may be adjusted by adjusting the placement intervals of a plurality of the target values according to the degree of difficulty.
When the operation is a pouring operation of a liquid flow, the pouring degree of the liquid flow may be calculated on the basis of the degree of tilt sensed through the sensing unit, and an image of the liquid spreading on the screen may be displayed.
Further, when the operation is a flip operation, the system may further include an operation object including a second inertial sensor, and the second inertial sensor may recognize the arrangement state of the operation object on the screen output unit.
The screen output unit may include a plurality of work performing spaces for providing the jobs, and the calculation unit may generate output images that present the plurality of jobs in an arbitrary order and determine, based on the motion data, whether the operation corresponding to each output image has been performed.
According to the present invention as described above, the following various effects are obtained.
First, there is an effect that the patient can perform the rehabilitation training through the work necessary for daily life. In other words, it is possible to improve the speed of social adaptation by allowing patients to perform cooking tasks that are frequently used in daily life, in a similar manner to actual situations.
Second, patients can take an interest in rehabilitation exercises because they perform activities drawn from daily life, rather than uninteresting drills such as moving cups. Since motivation is needed to adapt to daily life, this also increases the motivation for rehabilitation training.
Third, this system has the effect of training various cooking tasks at once.
Fourth, the performance measuring system according to an embodiment of the present invention can evaluate the achievement of the rehabilitation training rather than simply presenting the rehabilitation training to the patient. In addition, the difficulty level can be adjusted according to the degree of rehabilitation training performed by the patient, thereby providing a level of rehabilitation training suitable for the patient.
FIG. 1 is an internal configuration diagram of a performance measuring system according to an embodiment of the present invention.
FIG. 2 is an exemplary view of performing a pouring operation of a liquid stream in accordance with one embodiment of the present invention.
FIG. 3 is an exemplary view of performing a foaming operation according to one embodiment of the present invention.
FIG. 4A is an exemplary view showing a state before performing a kneading expansion operation according to an embodiment of the present invention.
FIG. 4B is an exemplary view showing a state after performing a kneading expansion operation according to an embodiment of the present invention.
FIG. 5 is an exemplary view of performing a flip operation of an operating object according to an embodiment of the present invention.
FIG. 6 is an exemplary diagram illustrating a slicing operation according to an embodiment of the present invention.
FIG. 7 is an exemplary diagram illustrating a fire-off operation according to an embodiment of the present invention.
FIG. 8 is an exemplary diagram of recognizing a position of the sensing unit on the screen output unit and reflecting the position in an output image according to an embodiment of the present invention.
FIG. 9 is an exemplary diagram of recognizing a position of the sensing unit through a vision sensor according to an embodiment of the present invention.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described hereinafter in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.
Unless defined otherwise, all terms (including technical and scientific terms) used herein may be used in a sense commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used predefined terms are not ideally or excessively interpreted unless explicitly defined otherwise.
The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In this specification, the singular form includes the plural form unless otherwise specified. The terms "comprises" and/or "comprising" used in the specification do not exclude the presence or addition of one or more elements other than the stated elements.
Hereinafter, a work performance measuring system according to embodiments of the present invention will be described with reference to FIGS. 1 to 9.
FIG. 1 is an internal configuration diagram of a work performance measuring system 10 according to an embodiment of the present invention.
Referring to FIG. 1, a work performance measuring system 10 according to an embodiment of the present invention includes a calculation unit 100, a sensing unit 200, and a screen output unit 300.
The sensing unit 200 senses motion data for a specific task performed by the user. The operation may correspond to a cooking operation performed on the screen, with the screen output unit 300 serving as the work performing space.
The calculation unit 100 performs overall control and information processing for the work performance measuring system 10: it calculates a real-time performance result of the specific task based on the sensed motion data and generates a real-time output image corresponding to the task.
The screen output unit 300 is arranged horizontally on the floor surface and displays the real-time output image.
The output image may be a result of performing the cooking operation. For example, as shown in FIG. 2, when the operation performed by the user is a pouring operation of the liquid flow, an image of the liquid spreading by the amount calculated as poured, according to the degree of tilt measured by the sensing unit 200, may be displayed on the screen.
The sensing unit 200 may include a first inertia sensor to measure, as the motion data, at least one of tilt degree, linear acceleration, linear velocity, rotation acceleration, rotation speed, number of rotations, and degree of twist.
The sensing unit 200 and the screen output unit 300 may be included in a separate sensing device and screen output device, respectively; the sensing device is operated while held in the user's hand.
The screen output unit 300 may include a pressure sensor for measuring the pressure applied on the screen by the operation performed by the user, and the calculation unit 100 may generate a real-time output image reflecting the intensity of the recognized pressure.
The screen output unit 300 may correspond to a touch screen and recognize the touch operation performed by the user.
The screen output unit 300 may display a target value 310 of the job on the screen, and the calculation unit 100 may calculate the accuracy of the touch operation with respect to the target value 310.
The screen output unit 300 may measure a work point at which the spatial position of the sensing unit 200 performing the operation is vertically projected onto the screen output unit 300, and the output image may be generated by applying the work point.
In addition, when the operation is a flip operation, the system may further include an operation object 400 including a second inertial sensor, and the second inertial sensor may recognize the arrangement state of the operation object 400 on the screen output unit 300.
The second inertial sensor may correspond to an IMU sensor. The IMU sensor may be a MEMS (Micro Mechanical System) based 9-axis IMU sensor. The 9-axis IMU sensor includes a 3-axis acceleration sensor, a 3-axis gyroscope sensor, and a 3-axis terrestrial magnetism sensor. The 3-axis acceleration sensor measures the mobile inertia (acceleration) of the x-, y-, and z-axes. The 3-axis gyroscope sensor measures the rotational inertia (angular velocity) of the x, y, and z axes. The 3-axis geomagnetic sensor measures the azimuths of the x-axis, y-axis, and z-axis (direction of the geomagnetism).
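As a minimal sketch of how tilt could be recovered from such a 9-axis IMU's readings (the complementary-filter blend and its coefficient are assumptions for illustration; the patent does not specify a fusion method):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Tilt (pitch, roll) in degrees estimated from the measured
    gravity direction when the device is not accelerating."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyroscope rate (smooth but drifting) with
    the accelerometer estimate (noisy but drift-free) to track a tilt
    angle over time; alpha is an illustrative tuning constant."""
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# A device lying flat reports gravity on the z axis only:
pitch, roll = tilt_from_accel(0.0, 0.0, 9.81)
```

The geomagnetic (compass) axes, not used above, would additionally give the heading needed to count full rotations of a stirring motion.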
Hereinafter, a specific operation capability measurement process performed according to various cooking operations according to an embodiment of the present invention, and a configuration added thereto for the operation will be described.
Figure 2 is an exemplary view of performing a pour operation of a liquid stream in accordance with one embodiment of the present invention.
Referring to FIG. 2, according to an embodiment of the present invention, when the operation performed by the user is a pouring operation of the liquid flow, the calculation unit 100 calculates the pouring degree of the liquid flow based on the degree of tilt sensed through the sensing unit 200 and displays an image of the liquid spreading on the screen.
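The tilt-to-pour mapping described above could be sketched as follows; the pour threshold angle and the linear rate model are illustrative assumptions, not values from the specification:

```python
def pour_rate(tilt_deg, threshold_deg=30.0, max_rate=50.0):
    """ml/s poured as a function of container tilt: nothing pours
    below the threshold angle, and the rate grows linearly up to
    90 degrees. Threshold and max_rate are illustrative constants."""
    if tilt_deg <= threshold_deg:
        return 0.0
    frac = min((tilt_deg - threshold_deg) / (90.0 - threshold_deg), 1.0)
    return max_rate * frac

def poured_volume(tilt_samples, dt):
    """Integrate the pour rate over a series of periodic tilt readings
    to get the total volume, which drives the spreading-liquid image."""
    return sum(pour_rate(t) * dt for t in tilt_samples)
```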
In addition, the calculation unit 100 may display a guide line 320 corresponding to the shape of the liquid flow; the guide line 320 serves as the target value of the job, and the work accuracy may be calculated by comparing the liquid image with the guide line 320.
Figure 3 is an exemplary view of performing a foaming operation according to one embodiment of the present invention.
Referring to FIG. 3, the ability to perform a foaming (bubbling) operation can be measured according to an embodiment of the present invention. The sensing device including the sensing unit 200 is operated while held in the user's hand, and the rotation acceleration and the number of rotations of the whisking motion are measured.
The calculation unit 100 may generate the real-time bubble generation image by applying the degree of bubble generation corresponding to the measured rotation acceleration and number of rotations. The sensing device may also include a weight 210 whose height can be adjusted to control the degree of difficulty of the operation.
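The mapping from measured rotation to a degree of bubble generation could be sketched as below; the per-rotation and per-acceleration weights are illustrative assumptions, not values given in the specification:

```python
def bubble_level(rotations, rot_accel, max_level=100.0):
    """Map whisk rotations and rotational acceleration to a bubble
    level in [0, max_level]; 1 point per rotation and 0.1 per unit
    of acceleration are illustrative weights."""
    level = rotations * 1.0 + rot_accel * 0.1
    return min(level, max_level)
```

The resulting level could index into a series of progressively foamier output images.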
FIG. 4A is an exemplary view showing a state before performing a kneading expansion operation according to an embodiment of the present invention. FIG. 4B is an exemplary view showing a state after performing a kneading expansion operation according to an embodiment of the present invention.
In the kneading expansion operation, the first inertial sensor of the sensing device senses the user's motion, and the calculation unit 100 generates an output image in which the dough expands from the state shown in FIG. 4A to the state shown in FIG. 4B as the operation proceeds.
FIG. 6 is an exemplary diagram illustrating a slicing operation according to an embodiment of the present invention.
Referring to FIG. 6, the ability to perform a slicing operation can be measured according to an exemplary embodiment of the present invention. The sensing device including the sensing unit 200 corresponds to a knife-shaped device capable of the contact operation, and the calculation unit 100 calculates the operation result reflecting the operation accomplishment time or the accuracy.
In addition, the calculation unit 100 may adjust the difficulty level by adjusting the placement intervals of the plurality of target values 310 according to the degree of difficulty.
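One way such difficulty-dependent target spacing could work is sketched below; the base spacing and the halving rule per difficulty level are illustrative assumptions:

```python
def target_positions(n_targets, difficulty, base_spacing=80.0):
    """x coordinates (in pixels) of slicing targets along a line.
    Higher difficulty shrinks the spacing, so the cuts must be
    placed more precisely; spacing halves per level (illustrative)."""
    spacing = base_spacing / (2 ** (difficulty - 1))
    return [i * spacing for i in range(n_targets)]
```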
FIG. 7 is an exemplary diagram illustrating a fire-off operation according to an embodiment of the present invention.
Referring to FIG. 7, according to an exemplary embodiment of the present invention, the task performing ability can be measured when the task performed by the user is a fire-off (valve adjusting) operation. The degree of twist or rotation sensed through the sensing unit 200 may be reflected in the output image.
FIG. 8 is an exemplary diagram of recognizing a position of the sensing unit 200 on the screen output unit 300 and reflecting the position in an output image according to an embodiment of the present invention.
Referring to FIG. 8, the work performance measuring system 10 may measure a work point at which the spatial position of the sensing unit 200 performing the operation is vertically projected onto the screen output unit 300, and may generate the output image by applying the work point.
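Because the screen lies horizontally, the vertical projection described above amounts to discarding the height component of the device's 3-D position; a sketch (the coordinate convention and the nearest-target lookup are illustrative assumptions):

```python
def work_point(position_3d):
    """Vertically project the sensing device's 3-D position onto the
    horizontal screen plane: drop the height (z), keep (x, y)."""
    x, y, z = position_3d
    return (x, y)

def nearest_target(point, targets):
    """Pick the on-screen target closest to the projected work point,
    e.g. to decide which work space the user is acting in."""
    return min(targets, key=lambda t: (t[0] - point[0]) ** 2 + (t[1] - point[1]) ** 2)
```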
In one embodiment, as shown in FIG. 9, the screen output device may include a vision sensor 340 that recognizes the position of the sensing device performing work in the space above the screen output unit 300.
While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive.
10: system 100: calculation unit
200: sensing unit 210: weight
300: screen output unit 310: target value
320: Guideline 330: Work space
340: vision sensor 400: manipulated object
Claims (12)
A system for measuring task performance, comprising:
A sensing unit for sensing motion data for a specific task performed by a user;
A calculation unit for calculating a real time performance result of the specific task based on the sensed motion data and generating a real time output image corresponding to the task; And
And a screen output unit arranged horizontally on the bottom surface for displaying the real time output image,
The sensing unit
A position sensing unit for acquiring first motion data corresponding to a hand position of a user performing the specific task on the screen output unit; And
And a motion sensing unit for acquiring second motion data including at least one of a working type of a hand, a linear acceleration, a linear velocity, a rotation acceleration, a rotation speed and a slope,
Wherein the sensing unit, the calculation unit, and the screen output unit are included in a separate sensing device and a screen output device,
The sensing device includes the motion sensing unit and is operated in a state of being held in a user's hand,
Wherein the screen output device includes the calculation unit, the position sensing unit, and the screen output unit, and displays an output image corresponding to the second motion data on the basis of a working point corresponding to the first motion data,
Wherein the first motion data is a point at which the spatial position of the sensing device is vertically projected onto the screen output unit.
The operation corresponds to a cooking operation performed on the screen by using the screen output unit as a work performing space,
Wherein the output image is a result image obtained by the cooking operation.
Wherein the operation corresponds to the pouring operation of the liquid flow,
The calculation unit may calculate,
Calculating a pouring degree of the liquid flow based on the inclination sensed by the motion sensing unit, and displaying a liquid image on the screen,
Wherein the liquid image is an image that spreads as the liquid flow is poured.
Wherein the screen output unit comprises:
A guide line corresponding to the shape of the liquid flow is displayed,
Wherein the guideline is a target value of the job.
The calculation unit may calculate,
Recognizes movement from a first work point to a second work point through the second motion data,
Generating a liquid image by spreading a liquid flow on the basis of the second working point,
Wherein the second work point is a point at which a work position of a user performing the specific job on the screen output unit is changed from the first work point.
The calculation unit may calculate,
Comparing the liquid image with the guide line to calculate an operation accuracy,
A task performance measurement system for assessing a user's ability to perform a task based on the task execution time and the task accuracy.
The calculation unit may calculate,
Calculating an area between the guide line and the liquid image, and determining the work accuracy to be lower as the area is larger.
Wherein the screen output device has a screen output unit horizontally disposed on a floor,
Wherein the position sensing unit is a vision sensor that recognizes the sensing device in which work is performed in the space on the screen output device.
The motion sensing unit includes:
A first inertial sensor provided to measure, as the second motion data, at least one of an inclination degree, a linear acceleration, a linear velocity, a rotation acceleration, a rotation speed, a number of rotations, and a degree of twist.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150051607A KR101838939B1 (en) | 2015-04-13 | 2015-04-13 | System for measuring the ability of ordinary performance |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20160121858A KR20160121858A (en) | 2016-10-21 |
KR101838939B1 (en) | 2018-03-15
Family
ID=57257154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150051607A KR101838939B1 (en) | 2015-04-13 | 2015-04-13 | System for measuring the ability of ordinary performance |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101838939B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102418994B1 (en) * | 2019-12-31 | 2022-07-11 | 주식회사 버넥트 | Method for providng work guide based augmented reality and evaluating work proficiency according to the work guide |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009000156A (en) * | 2007-06-19 | 2009-01-08 | Cooking Mama Ltd | Cooking game program |
- 2015-04-13: KR application KR1020150051607A granted as patent KR101838939B1 (active, IP Right Grant)
Also Published As
Publication number | Publication date |
---|---|
KR20160121858A (en) | 2016-10-21 |
Legal Events
Code | Title
---|---
E902 | Notification of reason for refusal
E701 | Decision to grant or registration of patent right
GRNT | Written decision to grant