KR20160143036A - Mobile terminal and method for correting a posture using the same - Google Patents

Mobile terminal and method for correting a posture using the same Download PDF

Info

Publication number
KR20160143036A
Authority
KR
South Korea
Prior art keywords
user
relative coordinate
coordinate value
sensing
posture
Prior art date
Application number
KR1020150079066A
Other languages
Korean (ko)
Inventor
김도관
최윤창
김주현
오승근
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150079066A priority Critical patent/KR20160143036A/en
Publication of KR20160143036A publication Critical patent/KR20160143036A/en

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/45 For evaluating or diagnosing the musculoskeletal system or teeth
    • A61B5/4538 Evaluating a particular part of the muscoloskeletal system or a particular medical condition
    • A61B5/4561 Evaluating static posture, e.g. undesirable back curvature
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • H04M1/72522

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Dentistry (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Rheumatology (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Telephone Function (AREA)

Abstract

A mobile terminal includes: a user posture detection sensor configured to obtain image data and depth data in multiple sensing regions of a user; and a control unit configured to calculate a relative coordinate value of each sensing region based on the depth data, check the variation of the relative coordinate value of each of the sensing regions, and generate a notification when the user's posture is poor according to the check result. The present invention can prevent various aftereffects such as forward head posture (FHP).

Description

TECHNICAL FIELD [0001] The present invention relates to a mobile terminal and a posture correction method using the mobile terminal.

BACKGROUND OF THE INVENTION The present invention relates to a mobile terminal, and more particularly, to a mobile terminal capable of correcting a user's posture and a posture correcting method using the same.

A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.

The functions of mobile terminals are diversifying. Examples include data and voice communication, photo and video capture through a camera, voice recording, music file playback through a speaker system, and output of images or video to the display unit 151. Some terminals additionally offer electronic game play or multimedia player functions. In particular, modern mobile terminals can receive multicast signals providing visual content such as broadcasts and video or television programs.

As such functions diversify, the terminal is implemented in the form of a multimedia device with complex functions such as capturing photos and videos, playing music or video files, gaming, and receiving broadcasts.

On the other hand, as shown in FIG. 1, when the face is bent forward with respect to a vertical reference line through the shoulder and such a posture is maintained for a long time, the neck bones gradually become deformed. These problems are collectively referred to as turtle neck syndrome.

Once the neck bones are bent in this way, it is very difficult to correct them back to their original position.

In recent years, with the explosive spread of mobile terminals, the time spent using them has also increased sharply. When the user uses the mobile terminal for a long time, the face unconsciously approaches the terminal while the shoulders stay in place; if this is repeated, a disorder such as turtle neck syndrome, in which the neck bones become bent, may occur.

The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to provide a mobile terminal capable of correcting a user's posture and a posture correcting method using the same.

According to an aspect of the present invention, there is provided a mobile terminal including: a user posture detection sensor for acquiring image data and depth data on a plurality of sensing areas of a user; and a control unit for calculating a relative coordinate value of each of the sensing areas based on the depth data, checking the amount of change of the relative coordinate value of each of the sensing areas, and controlling an alarm to be generated when the user's posture is poor according to the check result.

According to another aspect of the present invention, there is provided a method of correcting a posture using a mobile terminal, the method comprising: acquiring image data and depth data on a plurality of sensing areas of a user using a user posture detection sensor; calculating a relative coordinate value of each of the sensing areas based on the depth data; checking the amount of change of the relative coordinate value of each of the sensing areas; and generating an alarm when the user's posture is poor according to the check result.

Effects of the mobile terminal and the control method according to the present invention will be described as follows.

According to at least one of the embodiments of the present invention, relative coordinate values are calculated for a plurality of sensing areas covering specific parts of the user, for example, the upper body, neck, and face, based on the image data and depth data obtained from those areas; whether the user's posture is poor is determined from the calculated amount of change of the relative coordinate value for each sensing area; and, when the posture is poor, an alarm prompts the user to return to the normal posture. This has the advantage of preventing various disorders, such as turtle neck syndrome, caused by unwittingly taking a bad posture while using the mobile terminal.

In addition, according to at least one embodiment of the present invention, checking continues until the user returns to the normal posture, so the user can be managed more effectively to return to the normal posture.

In addition, according to at least one embodiment of the present invention, the user posture detection sensor is driven only when the terminal posture is at a specific position, for example, a predetermined reference position, thereby eliminating the inconvenience of the sensor being driven indiscriminately.

Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.

FIG. 1 is a view for explaining turtle neck syndrome.
FIG. 2A is a block diagram for explaining a mobile terminal related to the present invention.
FIGS. 2B and 2C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention, seen from different directions.
FIG. 3 illustrates a method of acquiring image data and depth data from a plurality of sensing areas of a user using the user posture detection sensor.
FIG. 4 is a flowchart illustrating a method of correcting a user's posture using the mobile terminal according to the first embodiment of the present invention.
FIG. 5 is a flowchart illustrating S210 of FIG. 4 in more detail.
FIG. 6 shows a plurality of sensing areas having relative coordinate values.
FIG. 7 is a flowchart illustrating S220 of FIG. 4 in more detail.
FIG. 8 shows a change in the posture of the user.
FIG. 9 shows the coordinate values of the sensing areas varying according to the user's posture change.
FIG. 10 is a flowchart illustrating a method of correcting a user's posture using a mobile terminal according to a second embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. Like reference numerals are used to designate identical or similar elements, and redundant description thereof is omitted. The suffixes "module" and "part" used for components in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves carry distinct meanings or roles. In describing the embodiments disclosed herein, detailed descriptions of related known art are omitted when they are deemed likely to obscure the gist of the embodiments. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by them and should be understood to cover all modifications, equivalents, and alternatives falling within its spirit and scope.

Terms including ordinals, such as first and second, may be used to describe various elements, but the elements are not limited by these terms. These terms are used only to distinguish one component from another.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The mobile terminals described in this specification may include mobile phones, smartphones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, slate PCs, tablet PCs, ultrabooks, and wearable devices such as smartwatches, smart glasses, and head mounted displays (HMDs).

FIG. 2A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 2B and 2C are conceptual diagrams illustrating an example of the mobile terminal according to the present invention, seen from different directions.

The mobile terminal 100 includes a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 2A are not essential for implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 includes a camera 121 or an image input unit for inputting a video signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user. Voice data or image data collected by the input unit 120 may be analyzed and processed into a user's control command.

The sensing unit 140 may include at least one sensor for sensing at least one of information in the mobile terminal, surrounding environment information of the mobile terminal, and user information. For example, the sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, a terminal posture detection sensor 143, a user posture detection sensor 144, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared sensor, a fingerprint scan sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), the microphone 122, a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). Meanwhile, the mobile terminal disclosed in this specification can combine and utilize information sensed by at least two of these sensors.

The terminal posture detection sensor 143 may be, for example, an acceleration sensor. The terminal posture detection sensor 143 can detect the direction in which the terminal is tilted relative to the ground.

The user posture detection sensor 144 may be a depth sensor. The user posture detection sensor 144 may be disposed on a part of the front surface of the terminal body, as shown in FIG. 2B.

The user posture detection sensor 144 can acquire image data and depth data from the user. To this end, it may include a plurality of cells capable of measuring distance while capturing an image. Each cell may obtain image data for the corresponding sensing area of the user, together with depth data between the user posture detection sensor 144 and that sensing area.

The control unit 180 may generate a three-dimensional image based on the image data and the depth data.

According to the present invention, the user posture detection sensor 144 can acquire image data and depth data on the upper body including the user's shoulders, the neck, and the face. Here, the upper body including the shoulders, the neck, and the face may be referred to as the user's sensing area.

The user's sensing area may be divided into a number of sensing areas corresponding to the plurality of cells included in the user posture detection sensor 144. Accordingly, image data and depth data can be obtained for each of the plurality of sensing areas corresponding to those cells.

The image data and depth data sensed from the plurality of sensing areas may be provided to the control unit 180. Here, the depth data may include distance data and angle data: the distance data is the distance between the mobile terminal and each sensing area, and the angle data may be the inclination angle relative to the horizontal plane, i.e., the ground (FIG. 3).

Based on the image data, the user's upper body, neck, and face can be distinguished from the background area. This distinction can be performed by a widely known edge detection algorithm; a sketch of one such detector follows. The relative coordinate value of each sensing area can then be calculated for the upper body, neck, and face separated in this way. This coordinate value calculation is described below.
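The patent leaves the edge detection algorithm unspecified. The sketch below illustrates one widely known choice, a Sobel gradient; the function name, the NumPy dependency, and the threshold value are illustrative assumptions, not part of the patent.

```python
import numpy as np

def sobel_edge_mask(gray: np.ndarray, threshold: float = 0.25) -> np.ndarray:
    """Boolean edge mask for a grayscale image normalized to [0, 1]."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal gradient
    ky = kx.T                                                         # vertical gradient
    padded = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    gx = np.empty((h, w))
    gy = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    # A strong gradient magnitude marks the silhouette separating the
    # user's upper body, neck, and face from the background.
    return np.hypot(gx, gy) > threshold
```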

As shown in FIG. 3, depth data can be obtained, for example, from each of the neck area (a), the nose area of the face (b), the forehead area of the face (c), and the shoulder area (d).

In this case, the depth data obtained from the neck area (a) includes first distance data d1 and a first angle α1, and the depth data obtained from the nose area (b) includes second distance data d2 and a second angle α2.

The depth data obtained from the forehead area (c) of the face includes third distance data d3 and a third angle α3, and the depth data obtained from the shoulder area (d) includes fourth distance data d4 and a fourth angle α4.

The controller 180 may calculate relative coordinate values based on the distance data and angle data included in each piece of depth data. A relative coordinate value is the coordinate value of an area calculated relative to that of a specific reference area; for example, the coordinate values of the neck area (a), the forehead area (c), and the shoulder area (d), calculated relative to the nose area (b) of the face, may be referred to as relative coordinate values.

The coordinate values can be expressed as (x, y, z). Here, the x-axis and the y-axis lie in the plane of the user facing the terminal; for example, the x-axis may be a vertical line and the y-axis a horizontal line. The z-axis is orthogonal to the plane formed by the x-axis and the y-axis and may point, for example, from the user toward the terminal.

Accordingly, when the coordinate value of a specific area changes, the relative coordinate values of the remaining areas also change.

For example, when the user's face moves toward the terminal, the distance between the nose area (b), taken as the specific area, and the terminal decreases, so the distance data and angle data of that area change. Since the distance data between the specific area and the terminal changes, the coordinate value of the specific area changes; and since the coordinate value of the specific area changes, the relative coordinate values of the remaining areas, i.e., the neck area (a), the forehead area (c), and the shoulder area (d), also change.
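For concreteness, here is a minimal sketch of how one cell's distance/angle pair could be turned into an (x, y, z) coordinate in the frame just described (x vertical, y horizontal, z from the user toward the terminal). The trigonometric decomposition and all sample values are assumptions for illustration; the patent states only that coordinates are computed from distance and angle data.

```python
import math

def cell_coordinate(distance: float, angle_deg: float, y: float = 0.0):
    """Map one cell's (distance, tilt angle vs. the ground) to (x, y, z)."""
    a = math.radians(angle_deg)
    x = distance * math.sin(a)  # vertical offset of the sensed area
    z = distance * math.cos(a)  # depth from the sensed area toward the terminal
    return (x, y, z)

# Hypothetical samples for the four regions of FIG. 3: neck (a), nose (b),
# forehead (c), shoulder (d). Distances in meters, angles in degrees.
samples = {"a_neck": (0.42, 8.0), "b_nose": (0.40, 3.0),
           "c_forehead": (0.41, -4.0), "d_shoulder": (0.47, 15.0)}
coords = {name: cell_coordinate(d, ang) for name, (d, ang) in samples.items()}
```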

In another embodiment, the sensor may include more cells than there are regions of the user. For example, each region may be divided into a plurality of sub-regions, with each cell acquiring image data and depth data for one sub-region. In this case, the distance data and angle data included in the depth data of the sub-regions belonging to a region are averaged, and the average coordinate value of that region is calculated based on the averaged distance and angle data. This avoids the computational load of calculating a separate coordinate value for every cell when the number of cells is large. Moreover, for determining a poor user posture, which is the problem addressed by the present invention, it is more efficient to calculate the coordinate value of each region rather than of each sub-region, as sketched below.
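A short sketch of this sub-region averaging; the function name and data layout are illustrative assumptions. Averaging the raw distance/angle pairs first means only one coordinate conversion per region instead of one per cell.

```python
import math

def region_coordinate(subcells, y: float = 0.0):
    """subcells: list of (distance, angle_deg) pairs from one region's cells."""
    n = len(subcells)
    mean_d = sum(d for d, _ in subcells) / n   # averaged distance data
    mean_a = sum(a for _, a in subcells) / n   # averaged angle data
    a = math.radians(mean_a)
    return (mean_d * math.sin(a), y, mean_d * math.cos(a))  # same frame as above
```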

The control unit 180 determines whether the user's posture is poor based on the relative coordinate values of the respective areas, and can control a corresponding action to be performed, for example, delivering an alarm through text or voice. A more detailed description is given later.

The output unit 150 is for generating output related to the visual, auditory, or tactile senses and includes at least one of a display unit 151, a sound output unit 152, a haptic module 153, and a light output unit 154. The display unit 151 may form a mutual layer structure with a touch sensor or be formed integrally with it, thereby realizing a touch screen. Such a touch screen may function as the user input unit 123 providing an input interface between the mobile terminal 100 and the user, and may at the same time provide an output interface between the mobile terminal 100 and the user.

The interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio I/O port, a video I/O port, and an earphone port. In the mobile terminal 100, appropriate control related to a connected external device can be performed in response to the external device being connected to the interface unit 160.

In addition, the memory 170 stores data supporting the various functions of the mobile terminal 100. The memory 170 may store a plurality of application programs (or applications) driven in the mobile terminal 100, as well as data and commands for the operation of the mobile terminal 100. At least some of these application programs may be downloaded from an external server via wireless communication. Also, at least some of them may exist on the mobile terminal 100 from the time of shipment for the basic functions of the mobile terminal 100 (e.g., call receiving and placing functions, message receiving and sending functions). Meanwhile, an application program may be stored in the memory 170, installed on the mobile terminal 100, and driven by the control unit 180 to perform an operation (or function) of the mobile terminal.

In addition to operations related to application programs, the control unit 180 typically controls the overall operation of the mobile terminal 100. The control unit 180 may process signals, data, information, and the like that are input or output through the above-described components, or drive an application program stored in the memory 170, thereby providing or processing information or functions appropriate for the user.

In addition, the control unit 180 may control at least some of the components illustrated in FIG. 2A in order to drive an application program stored in the memory 170. Furthermore, the control unit 180 may operate at least two of the components included in the mobile terminal 100 in combination with each other in order to drive the application program.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power to the components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be an internal battery or a replaceable battery.

At least some of the components may operate in cooperation with one another to implement the operation, control, or control method of a mobile terminal according to the various embodiments described below. The operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.

Referring to FIGS. 2B and 2C, the disclosed mobile terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto and can be applied to various structures, such as watch type, clip type, glasses type, folder type, flip type, slide type, swing type, and swivel type, in which two or more bodies are coupled so as to be movable relative to each other. A description relating to a particular type of mobile terminal, although relevant to that type, is generally applicable to other types of mobile terminals as well.

Here, the terminal body can be understood as a concept referring to the mobile terminal 100 regarded as at least one assembly.

The mobile terminal 100 includes a case (for example, a frame, a housing, a cover, and the like) that forms an appearance. As shown, the mobile terminal 100 may include a front case 101 and a rear case 102. Various electronic components are disposed in the inner space formed by the combination of the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102.

A display unit 151 is disposed on the front surface of the terminal body to output information. As shown, a window 151a of the display unit 151 may be mounted on the front case 101 to form the front surface of the terminal body together with the front case 101.

In some cases, electronic components may also be mounted on the rear case 102. Electronic parts that can be mounted on the rear case 102 include detachable batteries, an identification module, a memory card, and the like. In this case, a rear cover 103 for covering the mounted electronic components can be detachably coupled to the rear case 102. Therefore, when the rear cover 103 is separated from the rear case 102, the electronic parts mounted on the rear case 102 are exposed to the outside.

As shown, when the rear cover 103 is coupled to the rear case 102, a side portion of the rear case 102 may be exposed. In some cases, the rear case 102 may be completely covered by the rear cover 103 when coupled. Meanwhile, the rear cover 103 may be provided with an opening for exposing a camera 121b and a sound output unit 152b to the outside.

These cases 101, 102, and 103 may be formed by injection molding of synthetic resin or may be formed of metal such as stainless steel (STS), aluminum (Al), titanium (Ti), or the like.

Unlike the above example, in which a plurality of cases provide an internal space accommodating various electronic components, the mobile terminal 100 may be configured such that one case provides the internal space. In this case, a unibody mobile terminal 100 in which synthetic resin or metal extends from the side surface to the rear surface can be realized.

Meanwhile, the mobile terminal 100 may include a waterproof unit (not shown) for preventing water from penetrating into the terminal body. For example, the waterproof unit may include a waterproof member provided between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, sealing the internal space when these parts are coupled.

The mobile terminal 100 may be provided with a display unit 151, first and second sound output units 152a and 152b, a proximity sensor 141, an illuminance sensor 142, a light output unit 154, first and second cameras 121a and 121b, first and second operation units 123a and 123b, a microphone 122, an interface unit 160, and the like.

Hereinafter, as shown in FIGS. 2B and 2C, the description takes as an example a mobile terminal 100 in which the display unit 151, the first sound output unit 152a, the proximity sensor 141, the illuminance sensor 142, the light output unit 154, the first camera 121a, and the first operation unit 123a are disposed on the front surface of the terminal body; the second operation unit 123b, the microphone 122, and the interface unit 160 are disposed on a side surface of the terminal body; and the second sound output unit 152b and the second camera 121b are disposed on the rear surface.

However, these components are not limited to this arrangement. They may be excluded or replaced as needed, or disposed on other surfaces. For example, the first operation unit 123a may not be provided on the front surface of the terminal body, and the second sound output unit 152b may be provided on a side surface rather than the rear surface of the terminal body.

Meanwhile, as described above, the user posture detection sensor 144 may be disposed on a part of the front surface of the terminal body. The user posture detection sensor 144 may acquire image data and depth data from a plurality of sensing areas covering the user's upper body, neck, and face. Based on the image data and depth data obtained from the respective areas, relative coordinate values for the areas are calculated, and whether the user's posture is poor can be determined from the calculated amount of change in those relative coordinate values.

Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

FIG. 4 is a flowchart illustrating a method of correcting a user's posture using the mobile terminal according to the first embodiment of the present invention.

Referring to FIGS. 1 and 4, the user's posture can be sensed by the user posture detection sensor 144 (S210).

The user posture detection sensor 144 may be a depth sensor; it was described in detail above, so further explanation is omitted.

Specifically, as shown in FIG. 5, the user posture detection sensor 144 may first be driven (S212). The user posture detection sensor 144 may be driven as soon as the mobile terminal is powered on, or may be driven by the user, but the present invention is not limited thereto.

When the user posture detection sensor 144 is driven, it may acquire image data and depth data on the sensing areas of the user located in front of the mobile terminal (S214).

The user may position the front surface of the mobile terminal to face him or her, and use various information provided by the mobile terminal by manipulating a plurality of icons displayed on the display unit 151 disposed on that front surface.

Thus, while the user uses the mobile terminal, its front surface is positioned to face the user. In this case, when the user posture detection sensor 144 disposed on the front surface of the mobile terminal is driven, it can obtain the image data and depth data from the user's plurality of sensing areas.

The depth data may include distance data and angle data. The distance data is the distance between the mobile terminal and each sensing area, and the angle data may be the inclination angle with respect to the horizontal plane, i.e., the ground, but the present invention is not limited thereto.

The user's multiple sensing areas can be divided among the upper body including the shoulders, the neck, and the face. For example, as shown in FIG. 6, the face may be divided into 12 sensing areas, the neck into 4 sensing areas, and the upper body into 24 sensing areas.

The user's plurality of sensing areas may include not only the upper body, neck, and face but also the surrounding background area, although the present invention is not limited thereto.
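For later illustration, the 40 areas of FIG. 6 (12 for the face, 4 for the neck, 24 for the upper body) can be labeled as cell ids. Which id belongs to which region is not specified in the patent, so the assignment below is a purely illustrative assumption.

```python
# Hypothetical labeling of the 40 sensing areas of FIG. 6:
# a1..a12 for the face, a13..a16 for the neck, a17..a40 for the upper body.
FACE = [f"a{i}" for i in range(1, 13)]
NECK = [f"a{i}" for i in range(13, 17)]
UPPER_BODY = [f"a{i}" for i in range(17, 41)]
GROUPS = {"face": FACE, "neck": NECK, "upper_body": UPPER_BODY}
```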

The acquired image data and depth data are provided to the control unit.

The control unit receives the image data and the depth data for each sensing area, and can distinguish the upper body, neck, and face of the user from the background area based on the image data for each sensing area (S216). This distinction can be performed by a well-known edge detection algorithm.

The control unit may calculate the relative coordinate values of the respective sensing regions based on the depth data (S218).

As described above, since the depth data has distance data and angle data, the relative coordinate values of each sensing area can be calculated using the distance data and the angle data. Since the algorithm for calculating the relative coordinate value is well known, a detailed description thereof will be omitted.

A relative coordinate value means that, once the coordinate value of a specific sensing area has been calculated, the coordinate values of the other sensing areas are calculated relative to it. Accordingly, the relative coordinate values can be calculated in consideration of the coordinate value of the specific sensing area together with the distance data and angle data of each sensing area.

As shown in FIG. 6, the user's upper body, neck, and face can be distinguished from the background area through the image data. In addition, the relative coordinate values a1 to a40 of the sensing areas can be calculated based on the depth data obtained from the sensing areas included in the separated upper body, neck, and face. Each of a1 to a40 may consist of an x value, a y value, and a z value.

For example, the coordinate value a7 is calculated based on the distance data and angle data included in the depth data obtained in a specific sensing area, here a part of the face, and the coordinate value a7 thus calculated can be set as the reference coordinate value.

In this case, the relative coordinate values a1 to a6 and a8 to a40 for each of the remaining sensing areas are calculated based on the reference coordinate value a7 together with the distance data and angle data of the depth data obtained in each sensing area.

Which of the plurality of sensing areas provides the reference coordinate value may be set by the posture correction application developer, the manufacturer of the mobile terminal, or the user. A sketch of this reference-based computation follows.
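The sketch below reads "relative coordinate value" as an offset from the reference cell's coordinate (a7 in FIG. 6); the dictionary layout and the subtraction-based definition are illustrative assumptions.

```python
def relative_coordinates(absolute, reference_id="a7"):
    """absolute: dict of cell id -> (x, y, z); returns offsets from the reference."""
    rx, ry, rz = absolute[reference_id]
    return {cell: (x - rx, y - ry, z - rz)
            for cell, (x, y, z) in absolute.items()}

# In the reference cell itself the relative value is (0, 0, 0); any movement
# of the face changes the reference and therefore every other cell's value.
```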

Referring again to FIG. 4, when the user's posture is sensed, the controller can check whether the sensed user's posture deviates from a certain range (S220).

More specifically, as shown in FIG. 7, the control unit can check whether the relative coordinate value of each sensing area changes (S222).

As shown in FIG. 8, for example, a user who starts in the normal posture while using the mobile terminal may, after a certain time, move the face toward the mobile terminal so that the neck becomes bent.

In this case, it is assumed that the relative coordinate values a1 to a40 for the respective sensing areas, obtained and calculated by the user posture detection sensor 144 in the normal posture, are as shown in FIG. 6.

Then, the relative coordinate values b1 to b40 for each sensing area, obtained by the user posture detection sensor 144 after the user's face has moved toward the mobile terminal, are as shown in FIG. 9.

As the user's face moves toward the mobile terminal, the reference coordinate value a7 in FIG. 6 may change to b7 in FIG. 9. Here, a7 and b7 are different coordinate values; specifically, as the user moves from the normal posture toward the mobile terminal, the z value of b7 becomes smaller than the z value of a7.

Since the reference coordinate value changes from a7 to b7 as the user's face moves toward the mobile terminal, the relative coordinate values b1 to b6 and b8 to b40 obtained from the remaining sensing areas can also change from the corresponding relative coordinate values a1 to a6 and a8 to a40 shown in FIG. 6.

How the x, y, and z values of each relative coordinate value change depends on how far, and in which direction, the user moves.

The control unit may check whether the amount of change between the calculated relative coordinate value of each sensing area and the corresponding reference relative coordinate value exceeds a threshold change amount (S224).

The reference relative coordinate value means the relative coordinate value in the user's normal posture. The reference relative coordinate value may be set by the posture correction application developer, the manufacturer of the mobile terminal, or the user.

The control unit can confirm whether the user's posture has deviated from the certain range based on whether the amount of change between the relative coordinate value calculated for each sensing area from the user posture detection sensor 144 and the corresponding reference relative coordinate value exceeds the threshold change amount.

A reference relative coordinate value can be set for every sensing area. Therefore, when the relative coordinate value calculated from each sensing area matches the reference relative coordinate value one-to-one for all corresponding sensing areas, the user is in a perfectly normal posture.

If the relative coordinate values calculated from some of the sensing areas do not match the corresponding reference relative coordinate values, the degree of change between the relative coordinate values calculated from those sensing areas and the reference relative coordinate values is checked against the threshold change amount (S226).

For example, if the amount of change between the relative coordinate values calculated for the upper body and the reference relative coordinate values does not exceed the threshold change amount, whereas the amount of change between the relative coordinate values calculated for the face and the reference relative coordinate values exceeds it, the posture defect can be judged to be of the turtle neck type, which occurs when the user's face moves forward.

For example, if the amount of change between the relative coordinate values calculated for the left side of the upper body and the reference relative coordinate values does not exceed the threshold change amount, while the amount of change between the relative coordinate values calculated for the right side of the upper body and the reference relative coordinate values exceeds it, the posture defect can be judged to be the user's upper body leaning to the left.

Similarly, if the amount of change between the relative coordinate values calculated for the right side of the face and the reference relative coordinate values does not exceed the threshold change amount, while the amount of change between the relative coordinate values calculated for the left side of the face and the reference relative coordinate values exceeds it, the posture defect can be judged to be the user's face tilting to the right.

The threshold change amount described above can be set by the posture correction application developer, the manufacturer of the mobile terminal, or the user.

As described above, when various threshold change amounts corresponding to the respective defective postures are set, the specific posture defect can be determined from which threshold change amount has been exceeded.

In addition to the situations described above, the user's posture defects can be determined through various threshold change amounts.
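Steps S224 and S226 can be pictured with the sketch below, which measures each region's change as a Euclidean distance and maps exceeded thresholds to the defect types described above, using group names like those in the GROUPS mapping sketched earlier. The grouping of cells, the metric, and the threshold value are all illustrative assumptions.

```python
import math

def change_amount(current_xyz, reference_xyz):
    """Per-cell change between current and reference relative coordinates."""
    return math.dist(current_xyz, reference_xyz)  # Euclidean distance, Python 3.8+

def classify_posture(current, reference, groups, threshold=0.05):
    """groups: dict of group name ('face', 'upper_left', ...) -> list of cell ids."""
    exceeded = {name: any(change_amount(current[c], reference[c]) > threshold
                          for c in cells)
                for name, cells in groups.items()}
    if exceeded.get("face") and not exceeded.get("upper_body"):
        return "turtle neck posture (face moved forward)"
    if exceeded.get("upper_right") and not exceeded.get("upper_left"):
        return "upper body leaning to the left"
    if exceeded.get("face_left") and not exceeded.get("face_right"):
        return "face tilted to the right"
    return "posture within range" if not any(exceeded.values()) else "posture changed"
```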

Referring again to FIG. 4, when the detected user's posture deviates from a predetermined range, the controller may transmit an alarm to the user using the mobile terminal (S230).

Alarms can be delivered using voice, vibration, text, and so on.

For example, a voice may be output through the sound output unit 152 saying, "Your current posture may cause turtle neck syndrome. Move your face backward so that it is in line with your shoulders."

For example, the text "Your current posture may cause turtle neck syndrome. Please move your face backward so that it is in line with your shoulders." may be displayed through the display unit 151.

For example, vibration may be generated for several minutes so that the user cannot use the mobile terminal until he or she moves the face back and returns to the normal posture.

The control unit can then confirm whether the user's posture defect has been resolved (S240).

When the user moves back toward the normal posture in response to the alarm, the user posture detection sensor 144 again obtains image data and depth data from the user's plurality of sensing areas, and the control unit calculates the relative coordinate value for each sensing area based on the acquired image data and depth data. The control unit then checks the amount of change between the relative coordinate value for each sensing area and the corresponding reference relative coordinate value; when that amount of change is zero or close to zero, it can be determined that the user has returned to the normal posture.

If it is determined that the user has returned to the normal posture, the control unit may release the alarm being delivered to the user (S250).

For example, the voice alarm being output through the sound output unit 152 may be stopped.

For example, the text alarm being displayed on the display unit 151 may disappear.

For example, the vibration alarm continuously occurring in the mobile terminal can be stopped.
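The S230 to S250 loop (alarm, re-check, release) could look like the sketch below, with the sensor read-out and the voice/text/vibration channels stubbed behind callbacks; every name here is an illustrative assumption.

```python
import math
import time

def monitor_until_normal(read_relative_coords, reference, alert, release,
                         eps=0.01, poll_seconds=1.0):
    """Hold the alarm until every sensing area is back near its reference."""
    alert("Your current posture may cause turtle neck syndrome. "
          "Move your face backward so that it is in line with your shoulders.")
    while True:
        current = read_relative_coords()  # fresh relative coordinates (S240)
        if all(math.dist(current[c], reference[c]) <= eps for c in reference):
            release()  # stop voice, text, and vibration alarms (S250)
            return
        time.sleep(poll_seconds)
```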

According to the first embodiment of the present invention, relative coordinate values are calculated for each of a plurality of sensing areas covering specific parts of the user, for example, the upper body, neck, and face, based on the image data and depth data obtained by the user posture detection sensor 144 from those areas; whether the user's posture is poor is determined from the amount of change between the calculated relative coordinate value of each sensing area and the reference relative coordinate value representing the normal posture; and an alarm is provided so that the user returns to the normal posture. This makes it possible to prevent various disorders, such as turtle neck syndrome, caused by the user unwittingly taking a bad posture while using the mobile terminal.

Furthermore, according to the first embodiment of the present invention, by continuously checking until the user returns to the normal posture, the user can be managed more effectively to return to the normal posture.

FIG. 10 is a flowchart illustrating a method of correcting a user's posture using a mobile terminal according to a second embodiment of the present invention.

S330 to S370 of the second embodiment are the same as S210 to S250 described in the first embodiment. Therefore, S330 to S370 can be easily understood from the first embodiment, and their detailed description is omitted.

Referring to FIG. 10, the terminal posture detection sensor 143 may detect the terminal posture (S310).

The terminal posture detection sensor 143 may be an acceleration sensor; it was described in detail above, so further description is omitted.

The terminal posture detection sensor 143 can detect the direction in which the terminal is tilted relative to the ground.

In the present invention, the reference posture for the terminal can be set in advance. Here, the reference posture for the terminal means a state in which the terminal is tilted at a certain angle θ with respect to the ground. The angle θ can be set in the range of 0° to 45° with respect to the ground, where 0° means a state parallel to the ground.

Therefore, the reference posture for the terminal can be set within the range of 0° to 45° with respect to the ground.

In the following description, it is assumed that the reference posture for the terminal is set at 30° with respect to the ground, but the present invention is not limited thereto.

The control unit checks whether the terminal posture detected by the terminal posture detection sensor 143 is at the reference posture, for example, tilted 30° with respect to the ground, and if so drives the user posture detection sensor 144 (S320).

That is, when the terminal posture is at the reference posture, the control unit can drive the user posture detection sensor 144 to detect the user's posture (S320).
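A sketch of the S310 to S320 gate: the tilt is derived from the gravity vector reported by the acceleration sensor, and the user posture detection sensor is driven only near the reference posture (30° here). The gravity-based tilt formula and the tolerance are illustrative assumptions.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Tilt of the terminal vs. the ground: 0 deg lying flat, 90 deg upright."""
    return math.degrees(math.atan2(math.hypot(ax, ay), abs(az)))

def should_drive_posture_sensor(accel_xyz, reference_deg=30.0, tolerance_deg=5.0):
    """Drive the user posture detection sensor only near the reference posture."""
    return abs(tilt_from_gravity(*accel_xyz) - reference_deg) <= tolerance_deg

# Example: gravity split between the screen plane and its normal,
# corresponding to a terminal held at roughly 30 degrees.
print(should_drive_posture_sensor((0.0, 4.9, 8.5)))  # ~30 deg -> True
```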

The subsequent operations, S330 to S370, can be easily understood from S210 to S250 of the first embodiment, and further explanation is omitted.

According to the second embodiment of the present invention, the user posture detection sensor 144 is driven only when the terminal posture reaches a specific position, for example, the reference posture, thereby solving the problem of the sensor being driven unnecessarily and indiscriminately.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include the control unit 180 of the mobile terminal 100. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

100: mobile terminal 110: wireless communication unit
120: Input unit
140: sensing unit 150: output unit
160: interface unit 170: memory
180: control unit 190: power supply unit

Claims (15)

A mobile terminal comprising:
a user posture detection sensor for acquiring image data and depth data on a plurality of sensing areas of a user; and
a control unit configured to calculate a relative coordinate value of each of the sensing areas based on the depth data, check a change amount of the relative coordinate value of each of the sensing areas, and control an alarm to be generated when the user's posture is poor according to the check result.
The mobile terminal according to claim 1, wherein the sensing areas include at least an upper body including shoulders, a neck, and a face.
The mobile terminal according to claim 2, wherein the control unit separates the upper body, the neck, and the face from a background area based on the image data.
The mobile terminal according to claim 1, wherein the depth data includes distance data and angle data, and the control unit calculates the relative coordinate value of each of the sensing areas based on the distance data and the angle data.
The mobile terminal according to claim 4, wherein the relative coordinate value of each remaining sensing area is calculated based on a reference coordinate value calculated in a specific sensing area among the plurality of sensing areas.
The mobile terminal according to claim 1, wherein the control unit checks whether a change amount between the relative coordinate value of each of the sensing areas and a reference relative coordinate value exceeds a threshold change amount, and determines that the user's posture is poor when the change amount exceeds the threshold change amount, the reference relative coordinate value being the relative coordinate value in a normal posture of the user.
The mobile terminal according to claim 1, wherein the control unit continuously generates the alarm until the user's posture returns to the normal state, and releases the alarm when the user's posture returns to the normal state.
The mobile terminal according to claim 1, further comprising a terminal posture detection sensor for detecting a terminal posture, wherein the control unit drives the user posture detection sensor when the terminal is in a reference posture, the reference posture of the terminal being a position tilted at a predetermined angle with respect to the ground.
The mobile terminal according to claim 8, wherein the user posture detection sensor is a depth sensor and the terminal posture detection sensor is an acceleration sensor.
A method of correcting a posture using a mobile terminal, the method comprising: acquiring image data and depth data on a plurality of sensing areas of a user using a user posture detection sensor; calculating a relative coordinate value of each of the sensing areas based on the depth data; checking a change amount of the relative coordinate value of each of the sensing areas; and generating an alarm when the user's posture is poor according to the check result.
The method according to claim 10, wherein the sensing areas comprise at least an upper body including shoulders, a neck, and a face.
The method according to claim 10, wherein the depth data includes distance data and angle data, and the calculating of the relative coordinate value comprises calculating the relative coordinate value of each of the sensing areas based on the distance data and the angle data.
The method according to claim 10, wherein the checking comprises: confirming whether a change amount between the relative coordinate value of each of the sensing areas and a reference relative coordinate value exceeds a threshold change amount; and determining that the user's posture is poor when the change amount exceeds the threshold change amount, the reference relative coordinate value being the relative coordinate value in a normal posture of the user.
The method according to claim 10, further comprising: continuously generating the alarm until the user's posture returns to normal; and canceling the alarm when the user's posture returns to the normal state.
The method according to claim 10, further comprising: sensing a terminal posture; and driving the user posture detection sensor when the terminal is in a reference posture, wherein the reference posture of the terminal is a position tilted at a predetermined angle with respect to the ground.
KR1020150079066A 2015-06-04 2015-06-04 Mobile terminal and method for correting a posture using the same KR20160143036A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150079066A KR20160143036A (en) 2015-06-04 2015-06-04 Mobile terminal and method for correting a posture using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150079066A KR20160143036A (en) 2015-06-04 2015-06-04 Mobile terminal and method for correting a posture using the same

Publications (1)

Publication Number Publication Date
KR20160143036A true KR20160143036A (en) 2016-12-14

Family

ID=57575717

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150079066A KR20160143036A (en) 2015-06-04 2015-06-04 Mobile terminal and method for correting a posture using the same

Country Status (1)

Country Link
KR (1) KR20160143036A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101851099B1 (en) * 2017-01-19 2018-04-20 신소현 Method and System for Preventing Software-based Forward Head Posture Using Image Processing and Machine Learning
KR20210000513A (en) * 2019-06-25 2021-01-05 수원대학교산학협력단 Forward head posture protection device and controlling method thereof
CN113239748A (en) * 2021-04-26 2021-08-10 深圳市安星数字系统有限公司 Radar monitoring method, device, equipment and storage medium
KR102381542B1 (en) * 2020-11-11 2022-03-31 숙명여자대학교산학협력단 System including algorithm for determining real-time turtle neck posture, responsive cradle interlocking with the system, and control method thereof
WO2022103406A1 (en) * 2020-11-16 2022-05-19 Google Llc Posture detection system
WO2024080579A1 (en) * 2022-10-11 2024-04-18 삼성전자주식회사 Wearable device for guiding user's posture and method thereof


Similar Documents

Publication Publication Date Title
KR20160143036A (en) Mobile terminal and method for correting a posture using the same
CN106899801B (en) Mobile terminal and control method thereof
KR20170137476A (en) Mobile device and method for controlling thereof
CN112907725B (en) Image generation, training of image processing model and image processing method and device
KR101783773B1 (en) Dual camera module and mobile terminal comprising the camera
CN109558837B (en) Face key point detection method, device and storage medium
KR20160008372A (en) Mobile terminal and control method for the mobile terminal
KR20180040409A (en) Mobile terminal and method for controlling the same
CN110933452B (en) Method and device for displaying lovely face gift and storage medium
US9916004B2 (en) Display device
KR101718043B1 (en) Mobile terminal and method of controlling the same
CN112470450A (en) Mobile terminal
KR20170057058A (en) Mobile terminal and method for controlling the same
KR102218919B1 (en) Mobile terminal
CN108317992A (en) A kind of object distance measurement method and terminal device
US11032467B2 (en) Mobile terminal and control method thereof for obtaining image in response to the signal
KR20190054727A (en) Smart mirror device and and method the same
KR20160038409A (en) Mobile terminal and method for controlling the same
WO2020078277A1 (en) Structured light support and terminal device
KR20210068877A (en) Method and electronic device for correcting image according to camera switch
CN111753606A (en) Intelligent model upgrading method and device
KR20170036489A (en) Camera module and mobile terminal communicatively therewith
KR20160005862A (en) Mobile terminal and method for controlling the same
CN111829651B (en) Method, device and equipment for calibrating light intensity value and storage medium
KR102084161B1 (en) Electro device for correcting image and method for controlling thereof