KR101990531B1 - Biometric information monitoring apparatus providing Biometric information and analysis information - Google Patents

Biometric information monitoring apparatus providing Biometric information and analysis information Download PDF

Info

Publication number
KR101990531B1
Authority
KR
South Korea
Prior art keywords
data
image
time bar
type
expression
Prior art date
Application number
KR1020170176715A
Other languages
Korean (ko)
Inventor
서영대
Original Assignee
(주)로임시스템
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)로임시스템
Priority to KR1020170176715A
Application granted
Publication of KR101990531B1

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/74: Details of notification to user or communication with user or patient; user input means
    • A61B5/742: Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/743: Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/74: Details of notification to user or communication with user or patient; user input means
    • A61B5/742: Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/7435: Displaying user selection data, e.g. icons in a graphical user interface
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/74: Details of notification to user or communication with user or patient; user input means
    • A61B5/742: Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/744: Displaying an avatar, e.g. an animated cartoon character

Abstract

The present invention relates to a biometric information monitoring apparatus that provides intuitive monitoring by displaying acquired biometric data, motion data, and analysis data as changes of color or shape. A time bar 110, in which a plurality of data are expressed side by side using an expression scheme that represents the variation of data as a continuous change of the expression type, and a locus image 120, in which data are expressed along a trajectory using the same scheme, are generated, and the time bar 110 and the locus image 120 are screen-divided on a single screen and output simultaneously.
The research related to the present invention was carried out with the support of the Ministry of Commerce, Industry and Energy as part of a program for fostering global design-specialized enterprises (development of wireless wearable band technology for composite bio-signal monitoring and enhancement of design capability).

Description

TECHNICAL FIELD [0001] The present invention relates to a biometric information monitoring apparatus that analyzes biometric information and provides both the biometric information and its analysis information.

The present invention relates to a biometric information monitoring apparatus that enables intuitive monitoring by displaying a time bar and a trajectory image in which acquired biometric data, motion data, and analysis data are expressed by changes of color or shape.

The research related to the present invention was carried out with the support of the Ministry of Commerce, Industry and Energy as part of a program for fostering global design-specialized enterprises (development of wireless wearable band technology for composite bio-signal monitoring and enhancement of design capability).

In order to monitor the activity of the human body, biometric data such as electromyogram (EMG), heartbeat, and brain waves, together with motion data such as position, velocity, and acceleration, are acquired; the biometric data and motion data are analyzed; and the results are displayed on a screen.

The data displayed on the monitoring screen has various uses, such as evaluating the activity status and correcting the activity.

In this regard, Korean Patent Application Publication No. 10-2014-0068403 allows the user to check the state of muscle power during exercise in real time, so that muscle-power exercise can be performed effectively.

However, with Korean Patent Application Publication No. 10-2014-0068403, it is difficult to monitor many kinds of data in real time in correlation with each other, because a large amount of information must be displayed on a narrow screen.

In particular, it is important to monitor the real-time variation of biometric data during exercise, and it should be possible to monitor it in association with the motion data. However, Korean Patent Application Publication No. 10-2014-0068403 merely simulates the motion data with a character and does not provide a monitoring screen on which the motion data can be analyzed in real time.

As another prior art, Korean Patent No. 10-1366077 detects an EMG signal, an acceleration signal, and a body fat signal to feed back whether the exercise posture is normal, the change in body fat, and the calorie consumption, so that information related to the exercise characteristics can be monitored.

In addition, the data obtained by measurement and analysis can be displayed as graphs, allowing real-time monitoring of changes in the data.

However, when a large amount of data is displayed on a limited screen as ordinary graphs, it is difficult for a general user unfamiliar with biometric data to visually check each data item, intuitive understanding becomes difficult, and the result of correlation analysis with the motion data is hard to recognize easily.

KR 10-2014-0068403 A (2014.06.09.)
KR 10-1366077 B1 (2014.02.14.)

Accordingly, a problem to be solved by the present invention is to provide a biometric information monitoring apparatus with which biometric data, motion data, and analysis data that vary over time can be easily recognized and compared with one another, so that they can be monitored with ease.

In order to achieve the above object, a biometric information monitoring apparatus according to the present invention comprises: a collecting unit (10) for collecting biometric data and motion data acquired in time synchronization with each other by sensors worn on a human body; an analysis data generating unit (20) for obtaining analysis data of the biometric data and motion data; a time bar generating unit (30) which adopts color or shape as the expression type of data and an expression scheme that represents the variation of data as a continuous change of the expression type, and generates a time bar (110) in which at least two of the biometric data, motion data, and analysis data are expressed side by side; an image generating unit (40) for generating a locus image (120) in which at least one of the biometric data, motion data, and analysis data is expressed, inheriting the expression scheme of the time bar generating unit (30), along a locus obtained based on position information of the motion data; and a graphic user interface unit (50) for displaying the time bar (110) and the locus image (120) simultaneously on a screen-divided single screen.

According to an embodiment of the present invention, the time bar generating unit 30 designates a different expression type for each data type, and the image generating unit 40 applies the expression type designated for each data type to the data expressed by the locus image 120, so that data of the time bar 110 and data of the locus image 120 indicated by the same expression type can be checked against each other.

According to an embodiment of the present invention, the image generating unit 40 generates a trajectory image 120 in which a plurality of data are expressed side by side along the trajectory.

According to an embodiment of the present invention, the graphic user interface unit 50 allows the data to be expressed by the locus image 120 to be selected by user input.

According to an embodiment of the present invention, the image generating unit 40 generates an avatar image 130 in which a figure indicating the position of the human body part related to each data item is displayed on an avatar shaped like a human, each figure inheriting the expression type designated for its data type, and the graphic user interface unit 50 either displays the time bar 110, the trajectory image 120, and the avatar image 130 simultaneously, or replaces the trajectory image 120 output together with the time bar 110 with the avatar image 130 according to user input.

According to an embodiment of the present invention, for each of the time bar 110 and the trajectory image 120 output on the screen, the graphic user interface unit 50 processes one swipe input as selecting a data type and another swipe input as changing the expression type, so that only data of the selected type is expressed in the changed expression type.

The present invention configured as above allows a plurality of data expressed in color or shape side by side in a narrow time bar 110 to be recognized more intuitively than with ordinary data graphs; the data of the trajectory image 120, which inherits the same expression scheme, can likewise be recognized intuitively and compared; and because comparison between the time bar 110 and the trajectory image 120 using the same expression type and scheme is easy, a comprehensive analysis result can be grasped at a glance while being displayed on a limited screen area.

Further, the present invention allows the data type and the expression type to be selected easily, so that correlation analysis results among many kinds of data can be examined from various angles.

In addition, the present invention displays the human body part associated with the data as an avatar image 130 linked with the time bar 110, rather than merely showing the data itself, allowing the data to be analyzed more easily.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a use-state diagram of a biometric information monitoring apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram of a biometric information monitoring apparatus according to an embodiment of the present invention.
FIG. 3 is an exemplary view of a time bar 110 and a trajectory image 120.
FIG. 4 is an exemplary view of an avatar image 130.
FIG. 5 is an exemplary view showing a screen configuration output from a biometric information monitoring apparatus according to an embodiment of the present invention.
FIG. 6 is an exemplary view of an output screen reconstructed in the biometric information monitoring apparatus according to an embodiment of the present invention.
FIG. 7 is an exemplary view of another output screen reconstructed in the biometric information monitoring apparatus according to an embodiment of the present invention.

A biometric information monitoring apparatus according to the present invention displays received biometric data and motion data, together with analysis data, on a screen so that they can be recognized and monitored intuitively. The apparatus designates a data expression type for each data type, and simultaneously outputs a time bar 110, with which the time-series variation of each data item can be easily recognized and compared, and a locus image 120, with which the variation of data over the stages of body movement can be easily recognized; moreover, the data displayed by the time bar 110 and the data displayed by the locus image 120 can be intuitively related to each other.

To this end, the time bar 110 is generated by creating, for each data type, a unit image that represents the variation of the data over time as changes of the expression type designated for that data type along a lengthwise time axis, and arranging the unit images side by side.

The trajectory image 120 then inherits the expression types and expression scheme used in the time bar 110, and expresses each data item along the trajectory obtained from the motion data.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described with reference to the accompanying drawings.

As shown in FIG. 1, the biometric information monitoring apparatus A according to an embodiment of the present invention collects biometric data and motion data from a wearable device B worn on the human body.

The biometric data may be any of an electromyogram, a heartbeat, an electrocardiogram, an electroencephalogram, and the like, and can be obtained by bio-signal sensors well known in the art.

The motion data may be position, velocity, or acceleration data, and can be obtained by motion sensors well known in the art (for example, a gyro sensor or an acceleration sensor).

The wearable device B is a device equipped with a bio-signal sensor for acquiring biometric data and a motion sensor for acquiring motion data, and is worn on the human body to obtain biometric data or motion data of the wearing part. As is well known, the wearable device B can be manufactured in various forms, so a detailed description of its wearable structure and internal configuration is omitted.

A plurality of wearable devices B may be worn by one person, for example to obtain biometric data from several human body parts at the same time, or when the part from which motion data is obtained differs from the part from which biometric data is obtained. For example, as shown in FIG. 1, EMG data of the upper arm muscles and the lower leg muscles may be obtained simultaneously for an athlete playing a soccer game; in this case the motion data may be obtained from one of the wearable devices B, as shown in FIG. 1.

The biometric data and the motion data simultaneously acquired by the wearable devices B are transmitted to the biometric information monitoring apparatus A and can be collected there as data synchronized with each other.

Meanwhile, the body part on which each sensor for obtaining the respective biometric data and motion data is worn is registered in the biometric information monitoring apparatus A, for example by user input.

Hereinafter, for convenience of explanation, the biometric data are assumed to be EMG data acquired from two leg muscles together with heart rate data, and the motion data are assumed to be position data.

As shown in FIG. 2, the biometric information monitoring apparatus A according to an embodiment of the present invention includes: a collecting unit 10 for collecting the EMG, heartbeat, and position data acquired in time synchronization with each other; an analysis data generating unit 20 for obtaining analysis data of the EMG, heartbeat, and position data; a time bar generating unit 30 for generating a time bar 110 representing the EMG, heartbeat, position, and analysis data; an image generating unit 40 for generating a trajectory image 120, which represents at least one data item along a trajectory obtained based on the position data, and an avatar image 130, in which data are expressed on an avatar shaped like a human; and a graphic user interface unit 50 which displays the time bar 110 and the trajectory image 120 simultaneously, additionally displays the avatar image 130 or replaces the trajectory image 120 with the avatar image 130 according to the user's request, and processes user input.

The collecting unit 10 may also store and manage the collected data. In the embodiment of the present invention, since position data are collected, velocity and acceleration data are obtained based on the position data, and the position, velocity, and acceleration together constitute the motion data. Of course, all of the position, velocity, and acceleration data may instead be obtained directly from the motion sensor and collected.
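As an illustrative, non-limiting sketch of the derivation described above, velocity and acceleration can be obtained from sampled position data by finite differences. The function and variable names below are assumptions for illustration, not part of the patent.

```python
# Hypothetical sketch: deriving velocity and acceleration from a sampled
# position series, as the collecting unit is described to do.

def derive_motion(positions, dt):
    """Finite-difference velocity and acceleration from positions sampled every dt seconds."""
    velocity = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    acceleration = [(b - a) / dt for a, b in zip(velocity, velocity[1:])]
    return velocity, acceleration

positions = [0.0, 1.0, 3.0, 6.0]   # metres, sampled every 0.5 s
v, a = derive_motion(positions, dt=0.5)
```

In practice a real implementation would also filter sensor noise before differentiating, since differentiation amplifies it.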

The analysis data generating unit 20 is a component for generating analysis data, which are results of correlating the EMG, heartbeat, and motion data. The analysis data include, for example, muscular efficiency, muscle fatigue, momentum, suitability of muscle power, suitability of heart rate, suitability of speed, suitability of acceleration, and the like. Since methods of obtaining such analysis data are well known, detailed description thereof is omitted.

Hereinafter, for convenience of explanation, the analysis data are limited to muscular efficiency and muscle fatigue.

In addition, the analysis data generating unit 20 generates an event by setting a threshold value in advance for at least one of the muscular efficiency, muscle fatigue, muscle power, momentum, and heartbeat data. For example, if the muscle power exceeds the threshold value, indicating that the muscle is being used excessively, an event is generated.

The analysis data generating unit 20 may also update the maximum or minimum value of at least one of the muscular efficiency, muscle fatigue, muscle power, momentum, and heartbeat data over time, and generate an event whenever a new maximum or minimum is reached.
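The two event rules described above (threshold crossing and new-maximum update) can be sketched as follows. This is an illustrative assumption of one possible implementation; the names are not from the patent.

```python
# Illustrative sketch of the two event rules: a threshold-crossing event
# and a new-maximum event, scanned over a time-ordered sample list.

def detect_events(samples, threshold):
    events = []
    running_max = float("-inf")
    for t, value in enumerate(samples):
        if value > threshold:
            events.append((t, "over_threshold", value))
        if value > running_max:
            running_max = value
            events.append((t, "new_max", value))
    return events

events = detect_events([10, 25, 18, 30], threshold=20)
```

Each event tuple carries the sample index, so the graphic user interface can later anchor the event to the corresponding position on the time bar or locus.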

In summary, in the embodiment of the present invention, the EMG of two human body parts and the heartbeat are used as biometric data; position, velocity, and acceleration are used as motion data; muscular efficiency and muscle fatigue are used as analysis data; and events are generated so that the data at the time of each event can be identified.

The time bar generating unit 30 adopts color or shape as the data expression type, so that the data magnitude can be recognized from the displayed color or shape, and adopts an expression scheme in which the variation of the data is represented as a continuous change of the expression type.

The time bar generating unit 30 generates, using the above expression scheme, a unit bar image for each of the biometric data, motion data, and analysis data, in which the data are expressed along a lengthwise time axis, and arranges the unit bar images side by side to generate the time bar 110.

Each unit bar image included in the time bar 110 is formed for each data type, the types being classified into biometric data, motion data, and analysis data. Here, data types are also distinguished by the body part from which the data are acquired, so there are two EMG types; and since a different expression type is designated for each data type, the expression type reveals which kind of data is represented.

The time bar 110 illustrated in FIG. 3(a) will now be described in detail.

The two EMG data and the heartbeat data, which are biometric data, the velocity and acceleration data, which are motion data, and the muscular efficiency and muscle fatigue data, which are analysis data, are each assigned a different expression type. The data magnitudes varying along the lengthwise time axis are represented by unit bar images 111, 112, 113, 114, 115, 116, and 117, each expressed as a continuous change of its assigned expression type, and these are arranged side by side to constitute the time bar 110.

Here, the position data among the motion data is not expressed in the time bar 110, but a unit bar image for the position data may be added. For example, when a dumbbell exercise is performed, the position data of the cuff may be acquired and displayed on the time bar 110, so that the bending angle of the arm can be known from the position of the cuff.

In the time bar 110 illustrated in FIG. 3(a), the data types are indicated according to the positions of the unit bar images 111, 112, 113, 114, 115, 116, and 117.

In FIG. 3(a), the different expression types of the unit bar images 111, 112, 113, 114, 115, 116, and 117 are drawn as different hatch patterns that change over short sections. This is merely to show that a different expression type is used for each data item and that the expression type varies along the lengthwise time axis; in practice, the expression type designated for each data item changes continuously.

For example, when the expression type is a color, a gradation scheme in which the brightness or chroma varies continuously along the time axis can be used. When the expression type is a shape, a scheme that continuously changes the thickness of the shape along the time axis can be used.
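The gradation scheme can be sketched as mapping each normalized data value to a brightness level of a base hue, so that a unit bar image becomes a strip of colors. This is a minimal sketch under assumed names; the patent does not prescribe a specific color model.

```python
# Sketch of the gradation scheme: a data value normalized to [0, 1] maps
# to the brightness of a base hue (here shades of red).

def to_brightness(value, lo, hi):
    """Map value in [lo, hi] to a brightness level 0..255, clamped."""
    frac = (value - lo) / (hi - lo)
    frac = max(0.0, min(1.0, frac))
    return round(frac * 255)

def unit_bar_colors(samples, lo, hi, base_hue=(1.0, 0.0, 0.0)):
    """RGB strip for one data type's unit bar image."""
    return [tuple(round(c * to_brightness(s, lo, hi)) for c in base_hue)
            for s in samples]

strip = unit_bar_colors([0, 50, 100], lo=0, hi=100)
```

A different base hue per data type would then serve as the designated expression type, while the brightness gradient carries the magnitude.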

Since biometric data are signal waveforms representing the temporal variation of a biological signal, it is preferable to convert them into meaningful values before expressing them with the above expression types. For example, the EMG can be converted into muscle power and the heartbeat into a heart rate, so that the muscle power and heart rate can be read from the expression at each point on the unit bar image.
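The conversions mentioned above are standard signal-processing steps; a common choice (assumed here, not specified by the patent) is root-mean-square power for the EMG and beats-per-minute from the inter-beat interval for the heartbeat.

```python
# Sketch of converting raw signals into displayable values:
# EMG window -> RMS power, inter-beat interval -> heart rate in bpm.
import math

def emg_rms(window):
    """Root-mean-square of an EMG sample window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def heart_rate_bpm(beat_interval_s):
    """Heart rate from the interval between successive beats, in seconds."""
    return 60.0 / beat_interval_s

rms = emg_rms([3.0, -4.0, 3.0, -4.0])
bpm = heart_rate_bpm(0.8)
```

Each converted value can then be fed to the expression scheme (color or shape) of its unit bar image.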

Although arranging the plurality of unit bar images 111, 112, 113, 114, 115, 116, and 117 side by side may appear complicated, the data type of each unit bar image can be perceived intuitively from its expression type, and the data magnitude fluctuating along the time axis can be easily recognized from the variation of the expression type. In addition, since each point along the length of the time bar 110 represents a point in time, the correlation between the various data at the same instant can be easily grasped by comparing the parallel, aligned expression types.

As another embodiment of the above expression type and scheme, a color table in which various colors are arranged according to brightness and saturation may be divided into a plurality of sections, a different section being assigned to each data type; the continuous variation of each data item is then displayed using only the colors in its section.
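The sectioned color-table variant can be sketched as follows: one shared table is split into contiguous sections, one per data type, and each type's values index only into its own section. The names and the equal-size split are illustrative assumptions.

```python
# Sketch of the sectioned color-table variant described above.

def assign_sections(color_table, data_types):
    """Split a color table into equal contiguous sections, one per data type."""
    size = len(color_table) // len(data_types)
    return {dtype: color_table[i * size:(i + 1) * size]
            for i, dtype in enumerate(data_types)}

def color_for(value, lo, hi, section):
    """Pick the color in the section corresponding to a normalized value."""
    frac = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return section[min(int(frac * len(section)), len(section) - 1)]

table = list(range(12))                       # stand-in for 12 colors
sections = assign_sections(table, ["emg", "heart_rate", "speed"])
c = color_for(55, 0, 100, sections["emg"])
```

Because each data type only ever draws from its own section, the viewer can identify the data type from the color family alone.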

On the other hand, when the expression type is a shape, the hatch patterns of FIG. 3(a) may be used; alternatively, a different figure may be assigned to each data type, and the successive variation of the data may be represented by changing the size of each figure.

Although a time bar 110 including all the data can be generated, according to the embodiment of the present invention a time bar 110 composed only of the data types selected by the user may be displayed on the screen. Here, at least two mutually correlated data types are selected, so that the time bar 110 containing two or more unit bar images output on the screen can be viewed and interpreted relative to each other.

In addition, the expression type assigned to each data item can be changed to a form the user prefers.

The image generating unit 40 inherits, for each data type, the expression type and expression scheme used by the time bar generating unit 30.

Then, the image generating unit 40 obtains a locus based on the position data of the motion data, and generates a trajectory image 120 in which at least one of the biometric data, motion data, and analysis data is represented along the locus.

That is, since each point on the locus represents position information, the data represented by the expression type and scheme can be matched to the position information and recognized.

In addition, since the trajectory image 120 represents data in the same expression form as the time bar 110, the data represented by the time bar 110 and by the trajectory image 120 can be viewed and contrasted with each other. That is, the data variation along the time axis can be intuitively correlated with the data variation along the locus, which indicates spatial position.
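Pairing time-synchronized data samples with locus points, so that each position carries the expression of the data at that instant, can be sketched as below. It reuses the gradation idea of the time bar; the names are illustrative assumptions.

```python
# Sketch: attach a 0..255 intensity (the inherited expression) to each
# (x, y) locus point, using the data sample synchronized with that point.

def trajectory_image(locus_points, samples, lo, hi):
    """Zip locus points with their synchronized data samples."""
    out = []
    for (x, y), value in zip(locus_points, samples):
        frac = max(0.0, min(1.0, (value - lo) / (hi - lo)))
        out.append((x, y, round(frac * 255)))
    return out

locus = [(0, 0), (1, 1), (2, 1)]
traj = trajectory_image(locus, samples=[0, 50, 100], lo=0, hi=100)
```

Rendering each `(x, y, intensity)` triple then yields a trajectory whose coloring varies with the data, matching the corresponding unit bar image on the time bar.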

The locus image 120 illustrated in FIG. 3(b) will now be described in detail.

As shown in FIG. 3(b), a locus image 120 is formed by two parallel unit locus images 126 and 127 displayed along the locus obtained based on the position data of the motion data.

At this time, the two unit locus images 126 and 127 are formed using the above-mentioned expression scheme, so that the magnitude of the data corresponding to each position on the locus can be recognized from the expression at that position.

Since the two unit locus images 126 and 127 inherit the expression types designated for their respective data types when the time bar 110 was generated, the data they represent can be distinguished.

That is, as with the time bar 110, even when the two unit locus images 126 and 127 are arranged side by side, the type of data each represents can be distinguished by its expression type.

It is also easy to find the two unit bar images 116 and 117 of the time bar 110 that use the same expression types as the two unit locus images 126 and 127 shown in FIG. 3(b). Therefore, the unit bar images 116 and 117 and the unit locus images 126 and 127 using the same expression types can be easily recognized, checked against each other, and their data variations observed.

In the embodiment of the present invention, the user is allowed to select data to be represented by the locus image 120 among biometric data, motion data, and analysis data.

The image generating unit 40 is configured to generate an avatar image 130 in addition to the trajectory image 120.

The avatar image 130 is an image in which a specific figure is displayed on an avatar that shapes a human being.

Here, the position of each figure displayed on the avatar represents the human body part related to the data. That is, for biometric data, the body part wearing the bio-signal sensor that acquires the data is marked by a figure on the avatar; for motion data, the body part wearing the motion sensor is marked likewise; and for analysis data, the body part wearing the sensor that provided the analyzed biometric or motion data is marked by a figure on the avatar.

Each figure inherits the expression type of its data type, so that the figures are distinguished from each other by expression type, and the data magnitude is represented by the size of the figure.

The avatar image 130 illustrated in FIG. 4 will now be described in detail.

In the avatar image 130 of FIG. 4, circular figures 131 and 132 are displayed at positions corresponding to the two human body parts from which the EMG was detected. Each of the figures 131 and 132 inherits the expression type of its data, and the size of each figure corresponds to the real-time power of the corresponding EMG, so that both the body part from which the data represented by each unit bar image of the time bar 110 was obtained and the data magnitude can be perceived intuitively.

Here, the sizes of the figures 131 and 132 may carry various meanings depending on the data type; for example, they may represent an accumulated momentum obtained from the analysis data.
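Sizing an avatar figure in proportion to a data value, clamped to a readable radius range so small values remain visible and large values do not overwhelm the avatar, can be sketched as below. The names and the radius range are illustrative assumptions.

```python
# Sketch: map a data value (e.g. real-time EMG power) to the radius of
# the circle drawn on the avatar, clamped between r_min and r_max pixels.

def figure_radius(value, lo, hi, r_min=4, r_max=30):
    """Linear value-to-radius mapping with clamping at both ends."""
    frac = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return round(r_min + frac * (r_max - r_min))

r_small = figure_radius(10, 0, 100)   # weak activity -> small circle
r_large = figure_radius(90, 0, 100)   # strong activity -> large circle
```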

In the embodiment of the present invention, the user can select data to be displayed as graphics 131 and 132 on the avatar image 130 as described later.

Of course, to express data whose human body parts overlap, figures may be superimposed. Alternatively, only one figure may be displayed on the avatar, while the other figures for the same body part are displayed outside the avatar image 130, with leader lines indicating the body part.

The time bar 110, the trajectory image 120 and the avatar image 130 thus generated are displayed on the screen by the graphic user interface unit 50.

The graphic user interface unit 50 includes an output unit 51 for the output screen and an input unit 52 for receiving user input. It divides the screen into a region for outputting the time bar 110 and a region for outputting the locus image 120, and displays both on the screen at the same time.

The graphic user interface unit 50 also requests the time bar generating unit 30 and the image generating unit 40 to reconstruct the time bar 110 and the trajectory image 120 according to user input.

In addition, the graphic user interface unit 50 replaces the trajectory image 120 with the avatar image 130 according to user input and outputs the same.

The graphic user interface unit 50 displays the events generated by the analysis data generator 20 on the time bar 110 and the trajectory image 120.

FIG. 5 is a diagram showing an exemplary screen configuration.

As shown in FIG. 5, the graphic user interface unit 50 screen-divides the time bar 110 generated by the time bar generating unit 30 and the locus image 120 generated by the image generating unit 40 on one screen 100 and outputs them simultaneously. As shown in the figure, when an image 140 captured by a camera is input to the collecting unit 10, it may also be displayed on the same screen 100.

In this way, the time bar 110 and the locus image 120, to which the same expression scheme, representing data variation as a continuous change of the expression type designated for each data type, is applied, are displayed simultaneously. The variation of the various data over time can be perceived intuitively from the time bar 110, and the correlation between the various data and the position data can be recognized intuitively from the trajectory image 120. Furthermore, the variation of the data over time shown by the time bar 110 and the variation of the data with position shown on the locus image 120 can easily be viewed and compared, so that a comprehensive analysis result can be grasped at a glance.

In addition, event windows 110a and 120a are displayed on the split screens showing the time bar 110 and the trajectory image 120, respectively. The two event windows 110a and 120a are windows for outputting event information, and each indicates the corresponding position on the time bar 110 or on the locus image 120.

As shown in the drawing, it is preferable that not all event information be displayed in the event windows 110a and 120a, but only the information related to the data displayed in the object each window points to.

For example, as shown in the drawing, among the event information such as heart rate, speed, acceleration, muscular efficiency, and muscle fatigue, the heart rate, speed, and acceleration data values are output to the event window 110a of the time bar 110 expressing them, while the muscular efficiency and muscle fatigue data values are output to the event window 120a of the locus image 120 expressing muscular efficiency and muscle fatigue.

In addition, the graphic user interface unit 50 allows the user to change the type of data displayed on the time bar 110 and the trajectory image 120 and the expression type to be used for data presentation.

The graphic user interface unit 50 may include a touch screen as its input and output means to enable swipe input, and, for each of the time bar 110 and the locus image 120, processes one swipe input as an input for changing the data type and another swipe input as an input for changing the expression type.

A swipe is an input technique in which the points traversed by a finger are tracked, without the finger being lifted, and the resulting trace is processed as input.

Using such swipe input, the present embodiment adopts a method in which the selectable options (data types or expression types) are presented differently according to the gesture (for example, its travel distance), so that the user visually confirms each option as it appears. However, the present invention is not limited thereto, and various known swipe input methods may be applied. As another example, for the time bar, a method may be used in which the user selects the unit bar image over which the swipe gesture has passed.
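The travel-distance method described above can be sketched as stepping through the option list as the gesture lengthens, with the option visible at finger-lift becoming the selection. The step size of 40 pixels and the function name are illustrative assumptions.

```python
def option_for_swipe(options, travel_px, step_px=40):
    """Step through the selectable options (data types or
    expression types) one per step_px of swipe travel, wrapping
    around; the option displayed when the finger is lifted
    becomes the selection."""
    steps = int(travel_px // step_px)
    return options[steps % len(options)]

# Data types available in the embodiment of FIG. 6.
DATA_TYPES = ["EMG", "heart_rate", "speed", "acceleration",
              "muscle_efficiency", "muscle_load"]
print(option_for_swipe(DATA_TYPES, 95))  # 95 // 40 = 2 -> "speed"
```

As the text notes, this is only one of several workable swipe schemes; a per-unit-bar selection would map touch x-coordinates to unit bar indices instead.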

FIGS. 6 and 7 are exemplary diagrams showing that the screen 100 of FIG. 5 can be reconfigured by swipe input.

Referring to FIG. 6, one swipe input 110b on the time bar 110 selects only some of the data among the electromyogram, heart rate, speed, acceleration, muscle efficiency, and muscle load to be displayed on the time bar 110, the other swipe input 110c changes the expression type, and the information displayed in the event window 110a of the time bar 110 is updated according to the type of data displayed.

The unit bar images of the time bar 110 shown in FIG. 6 are line graphs in which the data magnitude is indicated with the width boundaries as the maximum and minimum values, but they may be changed to bar graphs. That is, the time bar 110 can be constructed by concatenating unit bar images compressed into graph form using various conventional expression types. Here, in order to identify the data type, the area between the graph line and the boundary line of the unit bar image is rendered according to the expression type, but the data type may instead be distinguished by different kinds of line.
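Compressing one time slice of samples into a unit bar image, with the bar's width boundaries standing for the maximum and minimum values, can be sketched as a normalization step. The function name and the fixed-width layout are illustrative assumptions.

```python
def unit_bar_points(samples, bar_width, vmin, vmax):
    """Compress one time slice of samples into the polyline of a
    unit bar image: x positions are spread across bar_width, and
    y is normalized so that the bar's boundaries correspond to
    vmin (0.0) and vmax (1.0)."""
    n = len(samples)
    span = (vmax - vmin) or 1.0   # avoid division by zero
    return [(i * bar_width / max(n - 1, 1),
             (s - vmin) / span) for i, s in enumerate(samples)]

# Three heart-rate samples scaled into a 20-pixel-wide unit bar.
print(unit_bar_points([60, 130, 200], 20, 60, 200))
```

Concatenating one such polyline per time slice, side by side, yields the full time bar; swapping the polyline renderer for a filled-bar renderer gives the bar-graph variant the text mentions.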

In the example of FIG. 6, only a single swipe input 120b for changing the data type expressed by the trajectory image 120 is used, so only the unit trajectory images 124 and 126 corresponding to the changed data types are included in the trajectory image 120, and the information displayed in the event window 120a is modified in accordance with the changed data. Of course, although not shown in the drawing, the expression type may also be changed by a swipe input on the other side.

Referring to FIG. 7, a horizontal swipe input 120d, distinct from the vertical swipe inputs used to change the expressed data and the expression type, is allowed on the split screen that outputs the trajectory image 120, and switches the trajectory image 120 to the avatar image 130.

That is, the graphic user interface unit 50 causes one of the trajectory image 120 and the avatar image 130 to be selected by the horizontal swipe input 120d, and displays the selected image on the screen together with the time bar 110.
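The dispatch just described distinguishes gestures by their dominant axis: a horizontal swipe toggles the view, while vertical swipes change the data type or the expression type depending on which side of the pane they occur. The following sketch shows that classification; the left/right assignment of the two vertical actions is an illustrative assumption.

```python
def handle_swipe(pane_side, dx, dy):
    """Classify a swipe on the trajectory pane by its dominant
    axis. A horizontal swipe (120d) toggles between the trajectory
    image and the avatar image; a vertical swipe changes the data
    type on one side of the pane and the expression type on the
    other (which side maps to which is assumed here)."""
    if abs(dx) > abs(dy):
        return "toggle_trajectory_avatar"   # horizontal swipe 120d
    if pane_side == "left":
        return "change_data_type"           # vertical swipe, one side
    return "change_expression_type"         # vertical swipe, other side

print(handle_swipe("left", 120, 10))   # horizontal-dominant gesture
```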

Accordingly, the human body parts related to the data displayed by the time bar 110 can be recognized from the positions of the figures 131 and 132 on the avatar image 130, and the data magnitudes can be known from the sizes of the figures 131 and 132.

Although not shown in the drawing, even when the avatar image 130 is output, the data type and the expression type of the figures 131 and 132 may be changed by the vertical swipe inputs on both sides.

In the description of the embodiment of the present invention, the trajectory image 120 is replaced by the avatar image 130 on the split screen that outputs the trajectory image 120, but the time bar 110, the trajectory image 120, and the avatar image 130 may also be output simultaneously on divided screens.

Meanwhile, as shown in FIG. 1, the embodiment of the present invention acquires biometric data and motion data from a player playing soccer to provide a monitoring screen; however, various modifications are possible when it is applied to a sport other than soccer.

For example, in order to monitor a swing in golf, the electromyogram and heartbeat data can be obtained from the arm muscles and the motion data from the wrist. Based on the electromyogram, heartbeat, and motion data obtained in this way, the subject's muscle use and heartbeat fitness are analyzed, and the electromyogram, heart rate, motion data, and analysis data are expressed on the time bar 110, the trajectory image 120, and the avatar image 130. Here, since the trajectory image 120 will be a swing trajectory, it is generated as a three-dimensional image.

It is preferable that the avatar image 130 reflects the wrist position data obtained from the motion data and displays it as a moving image.

That is, the biometric information monitoring apparatus according to the present invention may allow a sport to be selected, so that analysis data suitable for that sport is obtained and the avatar image 130 corresponding to that sport is used.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention. Accordingly, such modifications are deemed to be within the scope of the present invention, and the scope of the present invention should be determined by the following claims.

A: Biometric information monitoring device
10: Collecting section
20: Analysis data generating section
30: time bar generator
40: Image generating unit
50: Graphic user interface unit
51: output section 52: input section
B: Wearable device
100: Screen
110: Time bar
110a: Event window 110b, 110c: Swipe input
111, 112, 113, 114, 115, 116, 117: unit bar image
120: Trajectory image
120a: Event window 120b, 120d: Swipe input
124, 126, 127: unit locus image
130: Avatar image
131, 132: figure
140: Video

Claims (6)

  1. A biometric information monitoring apparatus comprising:
    a collection unit (10) for collecting biometric data and motion data obtained in mutual time synchronization from sensors worn on a human body;
    an analysis data generating unit (20) for obtaining analysis data from the biometric data and the motion data;
    a time bar generating unit (30) which designates a color or shape as an expression type for each data type, adopts an expression scheme in which the variation of the data is shown as a continuous change of the expression type, and generates a time bar (110) in which at least two of the biometric data, the motion data, and the analysis data are expressed side by side;
    an image generating unit (40) for generating a trajectory image (120) in which at least one of the biometric data, the motion data, and the analysis data is successively expressed, according to the expression scheme of the time bar generating unit (30), along a trajectory obtained based on position information of the motion data; and
    a graphic user interface unit (50) for simultaneously displaying the time bar (110) and the trajectory image (120) on a screen.
  2. The biometric information monitoring apparatus according to claim 1, wherein
    the time bar generating unit (30) designates a different expression type for each data type, and
    the image generating unit (40) applies the expression type designated for each data type, together with the expression scheme, to the data expressed by the trajectory image (120), so that data of the same type is indicated by the same expression type on both the time bar (110) and the trajectory image (120).
  3. The biometric information monitoring apparatus according to claim 1, wherein
    the image generating unit (40) generates a trajectory image (120) in which a plurality of data are expressed side by side along the trajectory.
  4. The biometric information monitoring apparatus according to claim 1, wherein
    the graphic user interface unit (50) causes the data to be expressed by the trajectory image (120) to be selected by user input.
  5. The biometric information monitoring apparatus according to claim 2, wherein
    the image generating unit (40) generates an avatar image (130) in which figures indicating the positions of the human body parts related to the data are displayed on an avatar shaped like a human being, the figures being formed according to the expression type designated for each data type and varied continuously according to the data, and
    the graphic user interface unit (50) displays the time bar (110), the trajectory image (120), and the avatar image (130) simultaneously, or switches the trajectory image (120), output together with the time bar (110), to the avatar image (130) according to user input.
  6. The biometric information monitoring apparatus according to claim 2, wherein
    the graphic user interface unit (50) causes, for the time bar (110) or the trajectory image (120) output on the screen, the data type to be selected by one swipe input and the expression type used to express the data to be changed by the other swipe input, so that only data of the selected type is displayed in the changed form.
KR1020170176715A 2017-12-21 2017-12-21 Biometric information monitoring apparatus providing Biometric information and analysis information KR101990531B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020170176715A KR101990531B1 (en) 2017-12-21 2017-12-21 Biometric information monitoring apparatus providing Biometric information and analysis information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020170176715A KR101990531B1 (en) 2017-12-21 2017-12-21 Biometric information monitoring apparatus providing Biometric information and analysis information

Publications (1)

Publication Number Publication Date
KR101990531B1 true KR101990531B1 (en) 2019-06-18

Family

ID=67103089

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020170176715A KR101990531B1 (en) 2017-12-21 2017-12-21 Biometric information monitoring apparatus providing Biometric information and analysis information

Country Status (1)

Country Link
KR (1) KR101990531B1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090059497A (en) * 2007-12-06 2009-06-11 삼성전자주식회사 Module for measuring character linked to exercise, system for analyzing character linked to exercise with the module, and method for applying the module
KR20140068403A (en) 2012-11-28 2014-06-09 주식회사 솔미테크 Muscle training guide system
KR20160063126A (en) * 2014-11-26 2016-06-03 삼성전자주식회사 Exercise information providing method and electronic device supporting the same
JP2017000454A (en) * 2015-06-10 2017-01-05 セイコーエプソン株式会社 Exercise guidance system, guidance content generation method, exercise guidance device and guidance content generation device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090059497A (en) * 2007-12-06 2009-06-11 삼성전자주식회사 Module for measuring character linked to exercise, system for analyzing character linked to exercise with the module, and method for applying the module
KR101366077B1 (en) 2007-12-06 2014-02-21 삼성전자주식회사 Module for measuring character linked to exercise, system for analyzing character linked to exercise with the module, and method for applying the module
KR20140068403A (en) 2012-11-28 2014-06-09 주식회사 솔미테크 Muscle training guide system
KR20160063126A (en) * 2014-11-26 2016-06-03 삼성전자주식회사 Exercise information providing method and electronic device supporting the same
JP2017000454A (en) * 2015-06-10 2017-01-05 セイコーエプソン株式会社 Exercise guidance system, guidance content generation method, exercise guidance device and guidance content generation device

Similar Documents

Publication Publication Date Title
Chen et al. A survey of depth and inertial sensor fusion for human action recognition
US10209773B2 (en) Methods and systems for obtaining, aggregating, and analyzing vision data to assess a person's vision performance
US20190333629A1 (en) Methods for the diagnosis and treatment of neurological disorders
Gruebler et al. Design of a wearable device for reading positive expressions from facial emg signals
Aung et al. The automatic detection of chronic pain-related expression: requirements, challenges and the multimodal EmoPain dataset
JP6268193B2 (en) Pulse wave measuring device, portable device, medical device system, and biological information communication system
Hernandez et al. Bioglass: Physiological parameter estimation using a head-mounted wearable device
JP5982392B2 (en) Fatigue index and its use
CN104379056B (en) For the collection of musculation and the system of analysis and operational approach thereof
Greene et al. A survey of affective computing for stress detection: Evaluating technologies in stress detection for better health
US10300371B2 (en) Method and system for interacting with a virtual environment
JP5609973B2 (en) Method and system for examining or training eye movements and body movements, and method for examining or training visual ability and intention
CN105338890B (en) The method and apparatus for determining life parameters
Manera et al. Cooperation or competition? Discriminating between social intentions by observing prehensile movements
Neuper et al. Imagery of motor actions: Differential effects of kinesthetic and visual–motor mode of imagery in single-trial EEG
JP6207510B2 (en) Apparatus and method for analyzing golf swing
EP2695645B1 (en) Running form diagnostic system and method for scoring running form
US20140288874A1 (en) Information processing device, sensor device, information processing system, and storage medium
JP5899289B2 (en) Method of operating a system for determining a physiological state of a person
US8790255B2 (en) Computer interfaces including physiologically guided avatars
Lange et al. Visual perception of biological motion by form: A template-matching analysis
US10092220B2 (en) System and method for motion capture
US20150079560A1 (en) Wearable Monitoring and Training System for Focus and/or Mood
KR101549761B1 (en) Method and system for automated personal training
Ghasemzadeh et al. Coordination analysis of human movements with body sensor networks: A signal processing model to evaluate baseball swings

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant