KR20160143036A - Mobile terminal and method for correcting a posture using the same - Google Patents
Mobile terminal and method for correcting a posture using the same
- Publication number
- KR20160143036A (application KR1020150079066A)
- Authority
- KR
- South Korea
- Prior art keywords
- user
- relative coordinate
- coordinate value
- sensing
- posture
- Prior art date
Links
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4538—Evaluating a particular part of the muscoloskeletal system or a particular medical condition
- A61B5/4561—Evaluating static posture, e.g. undesirable back curvature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6898—Portable consumer electronic devices, e.g. music players, telephones, tablet computers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- H04M1/72522—
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Animal Behavior & Ethology (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Dentistry (AREA)
- Physical Education & Sports Medicine (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Rheumatology (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Telephone Function (AREA)
Abstract
Description
BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a mobile terminal, and more particularly, to a mobile terminal capable of correcting a posture of a user and a posture correcting method using the same.
A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.
The functions of mobile terminals are diversified. For example, there are data and voice communication, photographing and video shooting through a camera, voice recording, music file playback through a speaker system, and outputting an image or video to the display unit.
Such a terminal has various functions and is implemented in the form of a multimedia device having multiple functions such as capturing photographs and moving pictures, reproducing music or video files, playing games, and receiving broadcasts.
On the other hand, as shown in Fig. 1, when the face is bent forward with respect to a vertical line (or reference line) passing through the shoulder and such a posture lasts for a long time, the neck bones may become bent. Such problems are collectively referred to as turtle neck syndrome.
When the neck bone is bent like this, it is very difficult to be corrected back to the original position.
In recent years, with the explosive spread of mobile terminals, the usage time of mobile terminals has also increased sharply. When the user uses the mobile terminal for a long time, the user's face unknowingly approaches the mobile terminal while the shoulders stay in place; if this is repeated, a disorder such as turtle neck syndrome, in which the neck bone becomes bent, may occur.
The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to provide a mobile terminal capable of correcting a user's posture and a posture correcting method using the same.
According to an aspect of the present invention, there is provided a mobile terminal including a user sensing sensor for acquiring image data and depth data on a plurality of sensing areas of a user, and a control unit for calculating relative coordinate values of each of the sensing areas based on the depth data, checking the amount of change of the relative coordinate value of each of the sensing areas, and controlling an alarm to be generated when the user's posture is bad according to the result of the check.
According to another aspect of the present invention, there is provided a method of correcting a posture using a mobile terminal, the method comprising: acquiring image data and depth data on a plurality of sensing regions of a user using a user sensing sensor; calculating relative coordinate values of each of the sensing areas based on the depth data; checking a change amount of the relative coordinate value of each of the sensing regions; and generating an alarm when the user's posture is bad according to the result of the check.
Effects of the mobile terminal and the control method according to the present invention will be described as follows.
According to at least one of the embodiments of the present invention, relative coordinate values are calculated based on the image data and the depth data obtained from a plurality of sensing areas covering specific parts of the user, for example, the upper body, neck, and face. Whether the user's posture is bad is determined based on the calculated amount of change of the relative coordinate value for each sensing area, and an alarm is provided so that the user returns to the normal posture when the posture is bad. This has the advantage of preventing various disorders such as turtle neck syndrome caused by unwittingly taking a bad posture while using the mobile terminal.
In addition, according to at least one embodiment of the present invention, the checking continues until the user returns to the normal posture, so that the user can be more effectively managed to return to the normal posture.
In addition, according to at least one embodiment of the present invention, the user posture detection sensor is driven only when the terminal posture is at a specific position, for example, a predetermined reference position, thereby eliminating the inconvenience of the user posture sensor being driven indiscriminately.
Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.
1 is a view for explaining turtle neck syndrome.
2a is a block diagram for explaining a mobile terminal related to the present invention.
FIG. 2B and FIG. 2C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention in different directions.
FIG. 3 illustrates a method of acquiring image data and depth data from a plurality of sensing areas of a user using a user attitude sensor.
4 is a flowchart illustrating a method of correcting a user's posture using the mobile terminal according to the first embodiment of the present invention.
5 is a flowchart illustrating S210 of the flowchart of FIG. 4 in more detail.
FIG. 6 shows a plurality of sensing areas having relative coordinate values.
7 is a flowchart illustrating S220 of the flowchart of FIG. 4 in more detail.
Fig. 8 shows a change in the attitude of the user.
FIG. 9 shows the coordinate values of the sensing areas varying according to the user's attitude change.
FIG. 10 is a flowchart illustrating a method of correcting a user's posture using a mobile terminal according to a second embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "part" for the components used in the following description are given or mixed in consideration of ease of drafting the specification, and do not have their own meaning or role. In the following description of the embodiments of the present invention, a detailed description of related arts will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, and are intended to cover modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no other elements in between.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, the terms "comprises", "having", and the like are used to specify the presence of a stated feature, number, step, operation, element, component, or combination thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The mobile terminal described in this specification includes a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
FIG. 2A is a block diagram for explaining a mobile terminal related to the present invention, and FIGS. 2B and 2C are conceptual diagrams illustrating an example of the mobile terminal according to the present invention, viewed from different directions.
The
The
The
The
The
The terminal posture detection sensor 143 may be, for example, an acceleration sensor. The terminal posture detection sensor 143 can detect the direction in which the terminal is tilted relative to the ground.
The user
The user
The
According to the present invention, the user
The user's sensing area may be divided into a number of sensing areas corresponding to a plurality of cells included in the user posture detection sensor.
The plurality of sensed image data and depth data may be provided to the control unit.
Based on the image data, the user's upper body, neck, face and background area can be distinguished. Here, the distinction between the user's upper body, neck, and face and background region can be performed by a widely known edge detection algorithm. The relative coordinate value of each sensing area can be calculated with respect to the upper body, neck and face of the user thus divided. This coordinate value calculation will be described below.
As shown in Fig. 3, depth data can be obtained, for example, from each of the neck area (a), the nose area (b) and forehead area (c) of the face, and the shoulder area (d).
In this case, the depth data obtained from the neck area (a) includes the first distance data (d1) and a first angle (α1), and the depth data obtained from the nose area (b) of the face includes the second distance data (d2) and a second angle (α2).
The depth data obtained from the forehead region (c) of the face includes the third distance data (d3) and a third angle (α3).
The
The coordinate values can be expressed as (x, y, z). Here, the x-axis and the y-axis lie in the plane of the user's face facing the terminal; for example, the x-axis may be a vertical line and the y-axis a horizontal line. The z-axis is orthogonal to the plane formed by the x-axis and the y-axis and can be directed, for example, from the user toward the terminal.
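As a rough illustration of the geometry just described, the sketch below converts one sensing area's depth data (a distance value and an angle relative to the ground) into an (x, y, z) coordinate in the axes defined above. This is a hypothetical simplification, not the patent's implementation: the angle is assumed to lie in the vertical x-z plane, so the horizontal y component is ignored.

```python
import math

def depth_to_xyz(distance, angle_deg):
    """Convert one sensing area's depth data (distance, tilt angle)
    into an (x, y, z) coordinate in the axes described above.
    Hypothetical simplification: the angle lies in the vertical
    x-z plane, so the horizontal (y) component is not modeled."""
    angle = math.radians(angle_deg)
    x = distance * math.sin(angle)   # vertical offset of the area
    y = 0.0                          # horizontal offset not modeled
    z = distance * math.cos(angle)   # depth from the user toward the terminal
    return (x, y, z)

# A sensing area measured 0.4 m away at a 30-degree angle (made-up values):
x, y, z = depth_to_xyz(0.4, 30.0)
```

Because each area's coordinate is a direct function of its distance and angle, any change in either value immediately shifts the coordinate, which is what the change-amount check later relies on.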
Accordingly, when the coordinate value of the specific area is changed, the relative coordinate values of the remaining areas are also changed.
For example, when the face of the user moves toward the terminal, the nose region (b), taken here as the specific region, comes closer to the terminal, so the distance data and angle data between the specific region and the terminal change. Since the distance data between the specific area and the terminal changes, the coordinate value of the specific area changes. As the coordinate value of the specific region changes in this manner, the relative coordinate values of the remaining regions, that is, the neck region (a), the forehead region (c) of the face, and the shoulder region (d), also change.
In another embodiment, there may be more cells than corresponding regions of the user. For example, each region is divided into a plurality of sub-regions, and each cell can acquire image data and depth data for one sub-region. In this case, the distance data and angle data included in the depth data of the plurality of sub-regions in each area are averaged, and the average coordinate value of the corresponding area is calculated based on the averaged distance and angle data. In this way, even when the number of cells is large, the load required to calculate a coordinate value for every cell can be avoided. In addition, to determine a user's bad posture, which is the problem to be solved by the present invention, it is more efficient to calculate a coordinate value for each area rather than for each sub-region.
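The averaging step described in this paragraph can be sketched as follows; the (distance, angle) pairs and the four-sub-region split are illustrative values, not taken from the patent.

```python
def average_region_depth(sub_regions):
    """Average the (distance, angle) pairs of all sub-regions in one
    sensing area so that a single coordinate is computed per area
    rather than per cell, reducing the per-cell computation load."""
    n = len(sub_regions)
    avg_distance = sum(d for d, _ in sub_regions) / n
    avg_angle = sum(a for _, a in sub_regions) / n
    return avg_distance, avg_angle

# Four hypothetical sub-region readings (distance in m, angle in degrees):
d, a = average_region_depth([(0.40, 29.0), (0.41, 30.0),
                             (0.39, 31.0), (0.40, 30.0)])
```

The averaged pair then feeds the same coordinate calculation as a single-cell reading would.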
The
The
The
In addition, the
In addition to the operations related to the application program, the
In addition, the
The
At least some of the components may operate in cooperation with one another to implement a method of operation, control, or a control method of a mobile terminal according to various embodiments described below. The method of operation, control, or control method of the mobile terminal may also be implemented on the mobile terminal by driving at least one application program stored in the memory.
Referring to FIGS. 2B and 2C, the disclosed
Here, the terminal body can be understood as a concept of referring to the
The
A
In some cases, electronic components may also be mounted on the
As shown, when the
These
The
Meanwhile, the
The
2B and 2C, a
However, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed, or placed on different planes. For example, the
Meanwhile, as described above, the user
Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
4 is a flowchart illustrating a method of correcting a user's posture using the mobile terminal according to the first embodiment of the present invention.
Referring to FIGS. 1 and 4, a user attitude can be sensed by the user attitude sensor 144 (S210).
The user's
Specifically, as shown in FIG. 5, the user
When the user's
The user may position the front face of the mobile terminal facing the user, and use various information provided by the mobile terminal by manipulating a plurality of icons displayed on the display unit.
Thus, when the user uses the mobile terminal, the front face of the mobile terminal is positioned to face the user. In this case, when the user
The depth data may include distance data and angle data. The distance data is a distance value between the mobile terminal and each sensing area, and the angle data may be an inclination angle with respect to a horizontal plane, that is, the ground, but the present invention is not limited thereto.
The user's multiple sensing areas can be divided into the upper body including the shoulders, the neck, and the face. For example, as shown in FIG. 6, the face may be divided into 12 sensing areas, the neck into 4 sensing areas, and the upper body into 24 sensing areas.
The user's plurality of sensing areas may include, but are not limited to, an upper body, neck and face, as well as peripheral background areas.
The acquired image data and depth data are provided to the control unit.
The control unit receives the image data and the depth data for each sensing area, and can distinguish the upper body, neck, and face of the user from the background area based on the image data for each sensing area (S216). This distinction can be performed by a well-known edge detection algorithm.
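Any standard edge-detection approach can serve for this segmentation step. As a minimal stand-in (not the algorithm the patent relies on), the sketch below flags sharp horizontal jumps in a depth map, which is where the user's silhouette meets the background; the depth values are made up for illustration.

```python
def edge_mask(depth_map, threshold):
    """Mark positions where depth jumps sharply between horizontal
    neighbors; a toy stand-in for the edge detection that separates
    the user's upper body, neck, and face from the background."""
    rows, cols = len(depth_map), len(depth_map[0])
    mask = [[False] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols - 1):
            if abs(depth_map[i][j + 1] - depth_map[i][j]) > threshold:
                mask[i][j] = True
    return mask

# User at ~0.4 m, background wall at ~2.0 m (hypothetical values):
depth = [[0.4, 0.4, 2.0, 2.0],
         [0.4, 0.4, 2.0, 2.0]]
mask = edge_mask(depth, 0.5)
```

A production implementation would use a proper operator (for example Sobel or Canny) on the image data as well, as the text's reference to a well-known edge detection algorithm suggests.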
The control unit may calculate the relative coordinate values of the respective sensing regions based on the depth data (S218).
As described above, since the depth data has distance data and angle data, the relative coordinate values of each sensing area can be calculated using the distance data and the angle data. Since the algorithm for calculating the relative coordinate value is well known, a detailed description thereof will be omitted.
The relative coordinate value means that, once the coordinate value of a specific sensing area is calculated, the coordinate values of the other sensing areas are calculated relative to it. Accordingly, the relative coordinate values can be calculated from the coordinate value of the specific sensing region together with the distance data and angle data of each sensing region.
As shown in FIG. 6, the user's upper body, neck, face, and background region can be distinguished through the image data. In addition, the relative coordinate values a1 to a40 of the sensing regions can be calculated based on the depth data obtained from the sensing regions included in the upper body, neck, face, and background region thus distinguished. Each of a1 to a40 may be composed of an x value, a y value, and a z value.
For example, the coordinate value a7 is calculated on the basis of the distance data and the angle data included in the depth data obtained in a specific sensing area, here a part of the face, and the coordinate value a7 thus calculated can be set as the reference coordinate value.
In this case, the relative coordinate values (a1 to a6 and a8 to a40) for each of the remaining sensing regions are calculated based on the reference coordinate value a7 and the distance data and angle data of the depth data obtained in each sensing region.
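The relationship between the reference coordinate a7 and the remaining relative coordinates can be sketched as a component-wise subtraction; the coordinate values below are made up for illustration and are not from the patent.

```python
def relative_coordinates(coords, ref_key):
    """Express every sensing area's coordinate relative to one
    reference sensing area (a7 in the description) by subtracting
    the reference coordinate component-wise."""
    ref = coords[ref_key]
    return {key: tuple(c - r for c, r in zip(value, ref))
            for key, value in coords.items()}

# Hypothetical absolute coordinates for two sensing areas:
coords = {"a7": (0.00, 0.10, 0.40),   # reference area (part of the face)
          "a1": (0.05, 0.20, 0.42)}
rel = relative_coordinates(coords, "a7")
# rel["a7"] becomes the origin; rel["a1"] is its offset from a7.
```

Expressed this way, any movement of the reference area propagates into every other area's relative value, which matches the behavior described for the face moving toward the terminal.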
Which sensing area among the plurality of sensing areas provides the reference coordinate value may be set by the posture correction application developer, the manufacturer of the mobile terminal, or the user.
Referring again to FIG. 4, when the user's posture is sensed, the controller can check whether the sensed user's posture deviates from a certain range (S220).
More specifically, as shown in FIG. 7, the control unit can check whether the relative coordinate value of each sensing area changes (S222).
As shown in FIG. 8, for example, while the user is using the mobile terminal in the normal posture, the user's face may move toward the mobile terminal after a certain period of time, so that the user's neck becomes bent.
In this case, it is assumed that the relative coordinate values a1 to a40 for the respective sensing regions, obtained and calculated in the user's normal state, are as shown in FIG. 6.
The relative coordinate values (b1 to b40) for each sensing area obtained after the user's posture change are then as shown in FIG. 9.
As the face of the user moves toward the mobile terminal, the reference coordinate value a7 in FIG. 6 may change to b7 in FIG. 9. Here, a7 and b7 are different coordinate values. Specifically, as the user moves from the normal posture toward the mobile terminal, the z value of b7 becomes smaller than the z value of a7.
Since the reference coordinate value changes from a7 to b7 as the user's face moves toward the mobile terminal, the relative coordinate values (b1 to b6 and b8 to b40) obtained from the remaining sensing regions can also change from the corresponding relative coordinate values (a1 to a6 and a8 to a40) shown in FIG. 6.
How the x value, y value, and z value of each relative coordinate value change depends on the direction and extent of the user's movement.
The controller may check whether the amount of change between the calculated relative coordinate value of each sensing region and the reference relative coordinate value exceeds the threshold change amount (S224).
The reference relative coordinate value may mean the relative coordinate value in the user's normal state. The reference relative coordinate value may be set by the manufacturer of the posture correction application developer, the manufacturer of the mobile terminal, or the user.
The control unit can confirm whether the user's posture has deviated from a certain range based on whether the amount of change between the relative coordinate value calculated for each sensing area and the reference relative coordinate value exceeds the threshold change amount.
The reference relative coordinate value can be set for all sensing areas. Therefore, when the relative coordinate value calculated from each sensing area coincides one-to-one with the reference relative coordinate value for every corresponding sensing area, the user can be regarded as being perfectly in the normal posture.
If the relative coordinate values calculated from some sensing regions do not match the reference relative coordinate values, the amount of change between the relative coordinate values calculated from those sensing regions and the reference relative coordinate values can be checked against the threshold change amount (S226).
For example, if the amount of change between the relative coordinate value calculated for the shoulder and the reference relative coordinate value does not exceed the threshold change amount, whereas the amount of change between the relative coordinate value calculated for the face and the reference relative coordinate value exceeds the threshold change amount, it can be judged that the posture defect is of the turtle neck type, which occurs when the user's face moves forward.
For example, if the amount of change between the relative coordinate value calculated for the left side of the upper body and the reference relative coordinate value does not exceed the threshold change amount, while the amount of change between the relative coordinate value calculated for the right side of the upper body and the reference relative coordinate value exceeds the threshold change amount, it may be judged that the user's upper body is tilted to the left.
For example, if the amount of change between the relative coordinate value calculated for the right side of the face and the reference relative coordinate value does not exceed the threshold change amount, while the amount of change between the relative coordinate value calculated for the left side of the face and the reference relative coordinate value exceeds the threshold change amount, it may be judged that the user's face is tilted to the right.
The above-described threshold change amount may be set by the posture correction application developer, the manufacturer of the mobile terminal, or the user.
As described above, when various threshold change amounts corresponding to respective defective postures are set, the particular posture defect whose threshold change amount is exceeded can be identified.
In addition to the situations described above, the user's posture defects can be determined through various threshold change amounts.
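The pattern matching in the examples above can be sketched as a per-region threshold comparison. The region names, change amounts, and threshold values below are illustrative assumptions, not values from the patent.

```python
def classify_posture(changes, thresholds):
    """Compare each region's coordinate change amount against its
    threshold change amount and name the matching defect pattern."""
    exceeded = {region for region in changes
                if changes[region] > thresholds[region]}
    if not exceeded:
        return "normal"
    # Face moved but shoulder stayed: the turtle-neck pattern.
    if "face" in exceeded and "shoulder" not in exceeded:
        return "turtle neck (face moved forward)"
    # Right side of the upper body moved while the left did not.
    if "upper_right" in exceeded and "upper_left" not in exceeded:
        return "upper body tilted left"
    return "other posture defect"

# Face moved 8 cm forward while the shoulder barely moved:
state = classify_posture({"face": 0.08, "shoulder": 0.01},
                         {"face": 0.05, "shoulder": 0.05})
```

Each additional defect pattern described in the text would become one more rule over the set of regions whose change amount exceeded its threshold.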
Referring again to FIG. 4, when the detected user's posture deviates from a predetermined range, the controller may transmit an alarm to the user using the mobile terminal (S230).
Alarms can be delivered using voice, vibration, text, and so on.
For example, through the
For example, the text "Please move your face backward so that it is in line with your shoulders" may be displayed through the display unit.
For example, vibration may be generated for several minutes so that the user cannot use the mobile terminal until the user moves his or her face back and returns to the normal posture.
The control unit can confirm whether the user's posture deficiency is solved (S240).
When the user moves to return to the normal posture in response to the alarm, image data and depth data are obtained again from the user's plurality of sensing areas by the user posture detection sensor, and it can be confirmed in the same manner as above whether the user has returned to the normal posture.
If it is determined that the user has returned to the normal posture, the control unit may release the alarm being transmitted to the user (S250).
For example, the voice alarm being transmitted to the user can be stopped.
For example, the text for the alarm being displayed on the display unit can be removed.
For example, the vibration for the alarm that is continuously occurring in the mobile terminal can be stopped.
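Steps S230 to S250 amount to a check-and-release loop. The sketch below mirrors that flow; the `sample_posture` callable is a hypothetical stand-in for the whole sensing-and-comparison path described above.

```python
def monitor_until_corrected(sample_posture, max_checks=100):
    """Keep the alarm active while repeatedly re-checking the user's
    posture (S240); release the alarm once the posture returns to
    normal (S250). Returns the final alarm state."""
    alarm_on = True                     # S230: alarm already delivered
    for _ in range(max_checks):
        if sample_posture():            # True once posture is normal again
            alarm_on = False            # S250: release the alarm
            break
    return alarm_on

# Posture returns to normal on the third re-check:
readings = iter([False, False, True])
still_alarming = monitor_until_corrected(lambda: next(readings))
```

The `max_checks` bound is an illustrative safeguard; the description itself implies checking continues indefinitely until the posture is restored.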
According to the first embodiment of the present invention, based on the image data and the depth data obtained by the user posture detection sensor from the user's plurality of sensing areas, it is determined whether the user's posture is bad, and an alarm is provided so that the user returns to the normal posture, thereby preventing disorders such as turtle neck syndrome caused by unwittingly taking a bad posture while using the mobile terminal.
Furthermore, according to the first embodiment of the present invention, by continuously checking until the user returns to the normal posture, the user can be more effectively managed to return to the normal posture.
FIG. 10 is a flowchart illustrating a method of correcting a user's posture using a mobile terminal according to a second embodiment of the present invention.
S330 to S370 in the second embodiment are the same as S210 to S250 described in the first embodiment. Therefore, S330 to S370 can be easily understood from the first embodiment, and a detailed description thereof will be omitted.
Referring to FIG. 10, the terminal attitude sensor 143 may detect the terminal attitude (S310).
The terminal posture detection sensor 143 may be an acceleration sensor, which has been described in detail in the foregoing, and a further description thereof will be omitted.
The terminal attitude detection sensor 143 can detect the direction in which the terminal is tilted relative to the ground.
In the present invention, the reference posture position for the terminal can be set in advance. Here, the reference posture for the terminal means that the terminal is tilted at a certain angle (θ) with respect to the ground, as shown in the drawing. The angle θ can be set in the range of 0 to 45 degrees with respect to the ground, where 0 degrees means a state parallel to the ground.
Therefore, the reference posture position for the terminal can be set within the range of 0° to 45° with respect to the ground.
In the following description, it is assumed that the reference posture position for the terminal is set to 30 degrees with respect to the ground, but the present invention is not limited to this.
The controller checks whether the terminal posture detected by the terminal posture detection sensor 143 is at the reference position, for example, tilted at 30 degrees with respect to the ground, and if so, drives the user posture detection sensor 144 (S320).
That is, when the terminal posture is at the reference posture position, the controller can control the user posture detection sensor to be driven.
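The tilt check gating the posture sensor can be sketched from raw accelerometer readings. The axis convention (z pointing out of the screen) and the 5-degree tolerance are assumptions for this sketch, not values from the patent; only the 30-degree reference comes from the description.

```python
import math

def terminal_tilt_deg(ax, ay, az):
    """Estimate the terminal's tilt relative to the ground from the
    accelerometer's gravity components. 0 degrees means lying flat.
    The axis convention (z out of the screen) is an assumption."""
    in_plane = math.hypot(ax, ay)
    return math.degrees(math.atan2(in_plane, abs(az)))

def should_sense_posture(ax, ay, az, ref_deg=30.0, tol_deg=5.0):
    """Drive the user posture detection sensor only when the terminal
    is near the preset reference posture (30 degrees in the text);
    the 5-degree tolerance is an illustrative choice."""
    return abs(terminal_tilt_deg(ax, ay, az) - ref_deg) <= tol_deg

# Terminal tilted about 30 degrees: gravity splits into sin/cos parts.
tilted = should_sense_posture(4.905, 0.0, 8.496)   # roughly 30 degrees
flat = should_sense_posture(0.0, 0.0, 9.81)        # 0 degrees, lying flat
```

Gating on the tilt this way is what prevents the depth sensor from being driven indiscriminately, as the second embodiment's stated advantage describes.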
The subsequent operations, that is, S330 to S370 can be easily understood from S210 to S250 of the first embodiment, and further explanation will be omitted.
According to the second embodiment of the present invention, the user posture detection sensor is driven only when the terminal posture is at a specific position, for example, the reference position, thereby eliminating the inconvenience of the user posture detection sensor being driven indiscriminately.
The present invention described above can be embodied as computer-readable codes on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include the control unit 180 of the terminal.
100: mobile terminal 110: wireless communication unit
120: Input unit
140: sensing unit 150: output unit
160: interface unit 170: memory
180: control unit 190: power supply unit
Claims (15)
Calculating relative coordinate values of each of the sensing areas based on the depth data,
A change amount of a relative coordinate value of each of the sensing regions is checked,
And controlling the alarm to be generated when the user's attitude is bad according to the confirmation result.
Wherein the sensing region includes at least an upper body, a neck and a face including a shoulder.
Wherein,
And separates the upper body, the neck and the face from the background area based on the image data.
Wherein the depth data includes distance data and angle data,
Wherein the controller calculates a relative coordinate value of each of the sensing areas based on the distance data and the angle data.
Wherein the relative coordinate value is a relative coordinate value calculated in a remaining sensing area based on a reference coordinate value calculated in a specific sensing area among the plurality of sensing areas.
Wherein,
determines whether the amount of change between the relative coordinate value of each of the sensing regions and the reference relative coordinate value exceeds a threshold change amount, and
determines that the user's posture is bad when the amount of change between the relative coordinate value of each of the sensing regions and the reference relative coordinate value exceeds the threshold change amount,
Wherein the reference relative coordinate value is a relative coordinate value in a normal state of the user.
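The comparison described here, between the current relative coordinates and the baseline "normal state" coordinates against a threshold amount of change, can be sketched as follows. The Euclidean distance metric and the sample values are assumptions for illustration; the claims do not fix a particular metric:

```python
import math

def posture_is_poor(current, baseline, threshold):
    """Return True if any sensing region has drifted from its baseline
    relative coordinate by more than the threshold amount of change."""
    for region, (bx, by) in baseline.items():
        cx, cy = current[region]
        if math.hypot(cx - bx, cy - by) > threshold:
            return True
    return False

# Baseline captured while the user sits in a normal posture (illustrative).
baseline = {"neck": (0.0, 5.0), "shoulder": (10.0, 8.0)}
slouched = {"neck": (0.0, 1.0), "shoulder": (10.0, 7.5)}  # neck dropped 4 units
```

With a threshold of 2.0, `posture_is_poor(slouched, baseline, 2.0)` flags the dropped neck, while the unchanged baseline does not trigger.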
Wherein the controller continuously generates the alarm until the user's posture returns to the normal state,
And releases the alarm when the user's posture returns to the normal state.
Further comprising a terminal posture sensor for sensing a posture of the terminal,
Wherein the controller drives the user sensing sensor when the terminal is in a reference posture,
Wherein the reference posture of the terminal is a posture tilted at a predetermined angle with respect to the ground.
Wherein the user posture sensor is a depth sensor,
Wherein the terminal posture sensor is an acceleration sensor.
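The acceleration-sensor claim implies deriving the terminal's tilt from the gravity vector and driving the depth sensor only when that tilt matches the reference posture. A hedged sketch, where the 30°–60° window and the axis convention (z normal to the screen, readings in units of g) are assumptions not stated in the claims:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Tilt of the terminal relative to the ground, in degrees, from a
    3-axis accelerometer reading (z axis normal to the screen)."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

def in_reference_posture(ax, ay, az, lo=30.0, hi=60.0):
    """Drive the user sensing (depth) sensor only inside the tilt window."""
    return lo <= tilt_from_gravity(ax, ay, az) <= hi

# Flat on a table: (0, 0, 1) gives 0 degrees; held upright: (0, 1, 0) gives 90.
```

A device lying flat or held fully upright falls outside the window, so the depth sensor stays off until the terminal is tilted at the predetermined angle.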
Calculating a relative coordinate value of each of the sensing regions based on the depth data;
Checking an amount of change in the relative coordinate value of each of the sensing regions; And
Generating an alarm when the user's posture is determined to be poor based on the result of the checking.
Wherein the sensing regions comprise at least a shoulder, an upper body, a neck, and a face.
Wherein the depth data includes distance data and angle data,
Wherein the step of calculating the relative coordinate value comprises:
Calculating the relative coordinate value of each of the sensing regions based on the distance data and the angle data.
The method of claim 1,
Checking whether an amount of change between the relative coordinate value of each of the sensing regions and a reference relative coordinate value exceeds a threshold amount of change; And
Determining that the user's posture is poor when the amount of change exceeds the threshold amount of change,
Wherein the reference relative coordinate value is a relative coordinate value in a normal posture state of the user.
Continuously generating the alarm until the user's posture returns to the normal state; And
Canceling the alarm when the user's posture returns to the normal state.
Further comprising: Sensing a posture of the terminal; And
Driving the user sensing sensor when the terminal is in a reference posture,
Wherein the reference posture of the terminal is a posture tilted at a predetermined angle with respect to the ground.
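Putting the method claims together, one monitoring cycle could look like the following sketch. The class name, the tilt window, the distance metric, and the idea of feeding one sensor sample per step are hypothetical scaffolding, not part of the disclosure:

```python
import math

class PostureMonitor:
    """Stateful step function for the claimed method: each call takes one
    accelerometer sample and one set of per-region relative coordinates,
    and returns whether the alarm should currently sound."""

    def __init__(self, baseline, threshold, tilt_window=(30.0, 60.0)):
        self.baseline = baseline        # relative coordinates in the normal state
        self.threshold = threshold      # threshold amount of change
        self.tilt_window = tilt_window  # assumed reference-posture tilt range
        self.alarm_on = False

    def step(self, accel, depth_regions):
        ax, ay, az = accel
        tilt = math.degrees(math.atan2(math.hypot(ax, ay), az))
        lo, hi = self.tilt_window
        if not (lo <= tilt <= hi):
            # Terminal is not in the reference posture: the user sensing
            # sensor is not driven, so the alarm state is left unchanged.
            return self.alarm_on
        # Alarm stays on until every region returns within the threshold.
        self.alarm_on = any(
            math.hypot(cx - bx, cy - by) > self.threshold
            for region, (bx, by) in self.baseline.items()
            for (cx, cy) in [depth_regions[region]]
        )
        return self.alarm_on
```

For example, a step with a dropped neck coordinate turns the alarm on, a step outside the tilt window leaves it on, and a step back within the threshold releases it, mirroring the continuous-alarm and release clauses.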
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150079066A KR20160143036A (en) | 2015-06-04 | 2015-06-04 | Mobile terminal and method for correting a posture using the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150079066A KR20160143036A (en) | 2015-06-04 | 2015-06-04 | Mobile terminal and method for correting a posture using the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160143036A true KR20160143036A (en) | 2016-12-14 |
Family
ID=57575717
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150079066A KR20160143036A (en) | 2015-06-04 | 2015-06-04 | Mobile terminal and method for correting a posture using the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20160143036A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101851099B1 (en) * | 2017-01-19 | 2018-04-20 | 신소현 | Method and System for Preventing Software-based Forward Head Posture Using Image Processing and Machine Learning |
KR20210000513A (en) * | 2019-06-25 | 2021-01-05 | 수원대학교산학협력단 | Forward head posture protection device and controlling method thereof |
CN113239748A (en) * | 2021-04-26 | 2021-08-10 | 深圳市安星数字系统有限公司 | Radar monitoring method, device, equipment and storage medium |
KR102381542B1 (en) * | 2020-11-11 | 2022-03-31 | 숙명여자대학교산학협력단 | System including algorithm for determining real-time turtle neck posture, responsive cradle interlocking with the system, and control method thereof |
WO2022103406A1 (en) * | 2020-11-16 | 2022-05-19 | Google Llc | Posture detection system |
WO2024080579A1 (en) * | 2022-10-11 | 2024-04-18 | 삼성전자주식회사 | Wearable device for guiding user's posture and method thereof |
2015
- 2015-06-04 KR KR1020150079066A patent/KR20160143036A/en unknown
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR20160143036A (en) | Mobile terminal and method for correting a posture using the same | |
CN106899801B (en) | Mobile terminal and control method thereof | |
KR20170137476A (en) | Mobile device and method for controlling thereof | |
CN112907725B (en) | Image generation, training of image processing model and image processing method and device | |
KR101783773B1 (en) | Dual camera module and mobile terminal comprising the camera | |
CN109558837B (en) | Face key point detection method, device and storage medium | |
KR20160008372A (en) | Mobile terminal and control method for the mobile terminal | |
KR20180040409A (en) | Mobile terminal and method for controlling the same | |
CN110933452B (en) | Method and device for displaying lovely face gift and storage medium | |
US9916004B2 (en) | Display device | |
KR101718043B1 (en) | Mobile terminal and method of controlling the same | |
CN112470450A (en) | Mobile terminal | |
KR20170057058A (en) | Mobile terminal and method for controlling the same | |
KR102218919B1 (en) | Mobile terminal | |
CN108317992A (en) | A kind of object distance measurement method and terminal device | |
US11032467B2 (en) | Mobile terminal and control method thereof for obtaining image in response to the signal | |
KR20190054727A (en) | Smart mirror device and and method the same | |
KR20160038409A (en) | Mobile terminal and method for controlling the same | |
WO2020078277A1 (en) | Structured light support and terminal device | |
KR20210068877A (en) | Method and electronic device for correcting image according to camera switch | |
CN111753606A (en) | Intelligent model upgrading method and device | |
KR20170036489A (en) | Camera module and mobile terminal communicatively therewith | |
KR20160005862A (en) | Mobile terminal and method for controlling the same | |
CN111829651B (en) | Method, device and equipment for calibrating light intensity value and storage medium | |
KR102084161B1 (en) | Electro device for correcting image and method for controlling thereof |