WO2020138840A1 - Display device for posture correction and control method therefor - Google Patents

Display device for posture correction and control method therefor

Info

Publication number
WO2020138840A1
WO2020138840A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
user
display device
unit
distance
Prior art date
Application number
PCT/KR2019/018076
Other languages
English (en)
Korean (ko)
Inventor
김승훈
류지헌
송현아
정현주
홍보람
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2020138840A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/366: Image reproducers using viewer tracking
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1601: Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/002: Specific input/output arrangements not covered by G06F 3/01 - G06F 3/16
    • G06F 3/005: Input arrangements through a video camera
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • An embodiment of the present disclosure relates to a display device for posture correction and a control method thereof.
  • an embodiment of the present disclosure relates to a display device capable of correcting a posture of a user watching or using the display device, and a control method thereof.
  • the display device may include a computer monitor, a digital television, a tablet PC, a notebook-type display, and the like.
  • A configuration for outputting visual data will be referred to as a 'display'.
  • Turtle neck syndrome increasingly occurs in people who spend long hours looking at displays, especially those looking down at a monitor positioned below eye level.
  • When the display is below eye level, the user gazes at the display for long periods with the neck bent downward. In this case, the user is more likely to develop turtle neck syndrome.
  • Typical symptoms of turtle neck syndrome include pain, headaches, and fatigue in the back of the neck and the shoulders. If left untreated, turtle neck syndrome can cause a continuing decline in work and learning efficiency.
  • An object of the present disclosure is to provide a display device capable of correcting a user's posture and a control method thereof.
  • an embodiment of the present disclosure aims to provide a display device and a control method thereof to correct the posture of a user to prevent the occurrence of turtle neck syndrome.
  • a display device includes a display;
  • a driving unit connected to a rear surface of the display, including a tilt adjusting unit for adjusting a tilt of the display and a distance adjusting unit including a plurality of columns that overlap one another and drive linearly in both directions;
  • a sensing unit acquiring first information indicating a user's posture;
  • and a control unit including at least one processor executing at least one instruction, the control unit controlling the display to move, based on the first information, in a first direction that increases a vertical distance between the user and the front of the display while decreasing a horizontal distance between the user and the front of the display, and controlling the display to move in a second direction opposite to the first direction when the distance the display has moved in the first direction reaches a first limit value.
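The back-and-forth movement described in the claim above (advance in the first direction until a first limit value, then reverse toward a second limit value, at a fixed first speed) can be sketched as a simple control loop. This is an illustrative sketch only; the limit values, step size, and names (`FIRST_LIMIT_MM`, `simulate`, etc.) are assumptions, not values from the disclosure.

```python
FIRST_LIMIT_MM = 100   # assumed travel limit in the first direction
SECOND_LIMIT_MM = 0    # assumed travel limit in the second direction
STEP_MM = 5            # fixed displacement per control tick (fixed first speed)

def next_direction(position_mm: int, direction: int) -> int:
    """Reverse the travel direction when either limit is reached."""
    if direction == +1 and position_mm >= FIRST_LIMIT_MM:
        return -1
    if direction == -1 and position_mm <= SECOND_LIMIT_MM:
        return +1
    return direction

def simulate(ticks: int) -> list:
    """Simulate the display's travel along the movement axis over time."""
    position, direction = 0, +1
    path = []
    for _ in range(ticks):
        direction = next_direction(position, direction)
        position += direction * STEP_MM
        path.append(position)
    return path
```

Running the simulation shows the display oscillating between the two limits rather than drifting past either of them, which matches the claimed reversal behavior.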
  • a display device and a control method thereof according to an embodiment of the present disclosure can correct a user's posture without disturbing a user's work while the user uses the display device.
  • the display device and a control method thereof may induce a correct posture of the user by allowing the user to perform neck movement unconsciously while using the display device.
  • 1 is a view showing a posture of a user using a display device.
  • FIG. 2 is a block diagram illustrating a display device according to an exemplary embodiment of the present disclosure.
  • FIG. 3 is another block diagram illustrating a display device according to an exemplary embodiment of the present disclosure.
  • FIG. 4 is a flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 5 is a view for explaining the operation of the display device according to an embodiment of the present disclosure.
  • 6 and 7 are diagrams for explaining the structure of a display device according to an embodiment of the present disclosure.
  • FIG. 8 is a flowchart illustrating in detail a first information acquisition operation among control methods of a display device.
  • FIG. 9 is a view for explaining the operation of the display device according to an embodiment of the present disclosure.
  • FIG. 10 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • 11 is a view for explaining a user's viewing angle.
  • FIG. 12 is a diagram for describing a movement of a display according to an embodiment of the present disclosure.
  • FIG. 13 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 14 is another diagram for describing movement of a display according to an embodiment of the present disclosure.
  • 15 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • 16 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • 17 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 18 is a diagram illustrating a specific configuration of a driving unit in a display device according to an exemplary embodiment of the present disclosure.
  • FIG. 19 is another diagram illustrating a specific configuration of a driving unit in a display device according to an exemplary embodiment of the present disclosure.
  • 20 is another block diagram illustrating a display device according to an exemplary embodiment of the present disclosure.
  • The first information may include information indicating the distance from the front of the display to each of the user's neck and face.
  • The control unit may control the driving unit to adjust the tilt of the display such that the front of the display inclines toward the user as the horizontal distance decreases.
  • The control unit may generate, based on the first information, a movement trajectory of the display over time corresponding to the user, and control movement of the display according to the movement trajectory.
  • The control unit may control the driving unit such that the display moves at a first speed having a fixed value.
  • The control unit may control the display to move in the first direction when the distance the display has moved in the second direction reaches a second limit value.
  • The first limit value may be a value corresponding to the distance moved by the display at which the distance between the front of the display and the user reaches a proximity limit value.
  • The tilt adjusting unit may include a plurality of gears rotating at a first gear ratio having a fixed value, so that the tilt of the display increases in proportion to the distance the plurality of columns move.
  • The distance adjusting unit may include the plurality of columns, which have a telescopic column shape and move linearly in both directions, including the first direction and the second direction.
  • The control unit may determine whether an obstacle exists in front of the display based on the sensing result of the sensing unit, and control the movement of the display according to whether the obstacle exists.
  • The display device may further include a notification unit that outputs at least one of a visual notification and an audio notification. When it is determined that an obstacle exists, the control unit may control the notification unit to output a notification signal indicating the presence of the obstacle.
  • The control unit may control the movement of the display so that the display moves repeatedly in the first direction and the second direction.
  • The control unit may control the first information to be obtained periodically, periodically determine whether the user's posture requires correction based on the first information, and control the movement of the display based on the determination result.
  • The sensing unit may include a TOF camera.
  • A control method of a display device includes: obtaining first information indicating a user's posture; controlling, based on the first information, a display included in the display device to move in a first direction such that a vertical distance between the user and the front of the display increases while a horizontal distance between the user and the front of the display decreases; and, when the distance the display has moved in the first direction reaches a first limit value, controlling the display to move in a second direction opposite to the first direction.
  • Some embodiments may be represented by functional block configurations and various processing steps. Some or all of these functional blocks may be implemented with various numbers of hardware and/or software configurations that perform particular functions.
  • the functional blocks of the present disclosure can be implemented by one or more processors or microprocessors, or by circuit configurations for a given function.
  • functional blocks of the present disclosure may be implemented in various programming or scripting languages.
  • the functional blocks can be implemented with an algorithm running on one or more processors.
  • the present disclosure may employ conventional techniques for electronic environment setting, signal processing, and/or data processing. Terms such as module and configuration may be widely used, and are not limited to mechanical and physical configurations.
  • connection lines or connection members between the components shown in the drawings are merely illustrative of functional connections and/or physical or circuit connections. In an actual device, connections between components may be represented by various functional connections, physical connections, or circuit connections that are replaceable or added.
  • An embodiment of the present disclosure relates to a display device and a control method thereof.
  • the display device may be any electronic device that outputs visual data to the user.
  • the display device may exist in various forms such as a computer, a computer monitor, a TV, and a terminal for digital broadcasting.
  • Hereinafter, a display device used as a computer monitor will be described and illustrated as an example of the display device according to an embodiment of the present disclosure.
  • 1 is a view showing a posture of a user using a display device.
  • the user 101 can perform a task using the display device 110 for a long time.
  • The display device 110 shown in FIG. 1 is an example in which the height cannot be adjusted and the screen is output at a fixed position.
  • Accordingly, the user 101 bends his or her back or bows the neck toward the display 111 in order to view the screen output from the display 111. As shown in the figure, when the display device 110 is used for a long time in a posture with the back bent or the neck leaning forward, pain may occur in the user's neck and lower back. In addition, working in this posture for a long period increases the likelihood of conditions such as turtle neck syndrome.
  • a display device and a control method thereof are provided to prevent or correct a bad posture of a user using the display device 110.
  • a display device and a control method thereof according to an embodiment of the present disclosure will be described in detail with reference to FIGS. 2 to 20 below.
  • FIG. 2 is a block diagram illustrating a display device according to an exemplary embodiment of the present disclosure.
  • the display device 200 includes a display 210, a driving unit 220, a sensing unit 230, and a control unit 240.
  • the display device 200 may be any electronic device for outputting visual data to a user.
  • the display device 200 may be any electronic device including a display, such as a computer monitor, laptop, TV, and the like.
  • an electronic device according to an embodiment of the present disclosure will be described and illustrated as an example of a display device used as a monitor of a computer.
  • the display 210 may output visual data. Specifically, the display 210 may output an image corresponding to the video data through a display panel included internally so that the user can visually recognize the video data.
  • The display 210 can be classified as either a flat display, whose display panel is formed as a flat surface without curvature, or a curved display, whose display panel is formed with a curvature. Further, the display 210 may include various types of display panels corresponding to various light-emitting elements, for example, an OLED panel, an LED panel, an LCD panel, or a PDP panel.
  • The driving unit 220 is connected to the rear surface of the display 210, and includes a tilt adjusting unit 221 for adjusting the tilt of the display 210 and a distance adjusting unit 222 including a plurality of columns that overlap one another and drive linearly in both directions.
  • The driving unit 220 is a component for moving the display 210; it can move the display 210 vertically or tilt the display 210 to a certain slope.
  • the sensing unit 230 acquires first information indicating a user's posture.
  • the first information may include distance information indicating a posture of a user located in front of the display 210.
  • For example, the first information may include information indicating the distance from the front surface of the display 210 to each of the user's neck and face.
  • Specifically, the first information may include the distance from the front of the display 210 to the front of the user's face, together with the distance from the front of the user's face to the user's neck.
  • As another example, the first information may include the distance from the front of the display 210 to the front of the user's face and the distance from the front of the display 210 to the front of the user's neck.
  • the sensing unit 230 may include at least one sensor capable of detecting the distance between the user's face and neck, which is information indicating the user's posture.
  • the sensing unit 230 may include a distance sensor (not shown) capable of measuring the distance to the target object.
  • For example, the sensing unit 230 may include a time-of-flight (TOF) sensor, a three-dimensional camera, an ultrasonic sensor, and the like.
  • The TOF sensor detects or calculates the distance to an object by emitting light, for example infrared light, toward the object being measured and measuring the time until the light is reflected back.
  • a device that outputs a depth image using a TOF sensor may be referred to as a TOF camera or a depth camera.
  • The TOF camera may output a depth image indicating the distance of the object measured by TOF technology. That is, as a result of sensing by the TOF sensor, a distance or depth value from the sensor to the object may be obtained.
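Because a TOF sensor measures the round-trip travel time of emitted light, the corresponding one-way distance follows directly from the speed of light: the light covers twice the sensor-to-object distance. A minimal sketch of that relation (the function name and the sample timing value are illustrative, not taken from the disclosure):

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """One-way sensor-to-object distance from a measured round-trip time.

    The emitted pulse travels to the object and back, so the distance is
    half the round-trip time multiplied by the speed of light.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

For instance, a round-trip time of about 4 nanoseconds corresponds to a sensor-to-object distance of roughly 0.6 m, a plausible monitor viewing distance.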
  • Here, depth refers to the distance from the sensor to the object, for example, the user of the display device 200; the terms distance value and depth value may be used interchangeably.
  • The 3D camera refers to an image acquisition device that photographs a subject such that objects in the captured image have a sense of depth.
  • The 3D camera may include a stereo camera, and may acquire 3D information about a subject (e.g., a user) present in the scene.
  • the stereo camera may include a plurality of cameras for acquiring left and right eye images respectively.
  • the stereo camera may include an L camera (not shown) for acquiring a left-eye image and an R camera (not shown) for acquiring a right-eye image.
  • the stereo camera may capture a user of the display device 200 as a subject and obtain left eye data corresponding to the left eye image and right eye data corresponding to the right eye image.
  • The controller 240 may obtain 3D information about the user (for example, distance values for the user's face and neck) using the left-eye data and the right-eye data obtained from the stereo camera.
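A stereo camera of the kind described above typically recovers depth by triangulation from the disparity between corresponding points in the left-eye and right-eye images: Z = f · B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity in pixels. This is a hedged sketch of the standard relation under a pinhole-camera model; the parameter values are assumptions, as the disclosure gives no calibration data.

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from stereo disparity: Z = f * B / d (pinhole-camera model).

    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the L and R camera centers, in meters
    disparity_px: horizontal shift of the same point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

With an assumed 700 px focal length and 6 cm baseline, a disparity of 70 px corresponds to a depth of 0.6 m; nearer objects (such as a face leaning toward the display) produce larger disparities and hence smaller depth values.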
  • Hereinafter, the case in which the sensing unit 230 includes a TOF camera (not shown) capable of measuring depth using TOF technology will be described as an example.
  • The control unit 240 may control a predetermined operation to be performed using at least one processor that executes at least one instruction. Specifically, based on the first information, the control unit 240 controls the display 210 to move in the first direction, in which the horizontal distance between the user and the front surface of the display 210 decreases while the vertical distance increases. Then, when the distance the display 210 has moved in the first direction reaches the first limit value, the control unit 240 controls the display 210 to move in the second direction, opposite to the first direction.
  • control unit 240 may include at least one processor (not shown) that executes at least one instruction. Then, the controller 240 may execute at least one command to control certain operations to be performed. Specifically, here, each of the at least one processor (not shown) may execute a predetermined operation by executing at least one of the one or more instructions.
  • the at least one processor described above may be included in the control unit 240.
  • the controller 240 may include an internal memory (not shown) and at least one processor (not shown) executing at least one stored program.
  • the internal memory (not shown) of the control unit 240 may store one or more commands.
  • at least one processor (not shown) included in the control unit 240 may execute at least one of one or more commands stored in the internal memory (not shown) of the control unit 240 to execute a predetermined operation.
  • The controller 240 may include a RAM (not shown) that stores signals or data input from outside the display device 200 or is used as a storage area for various operations performed in the display device 200, a ROM (not shown) in which a control program for controlling the display device 200 and/or a plurality of instructions are stored, and at least one processor (not shown).
  • the processor (not shown) may include a graphic processor (not shown) for graphic processing corresponding to video.
  • the processor (not shown) may be implemented as a system on chip (SoC) that integrates a core (not shown) and a GPU (not shown).
  • The processor (not shown) may include a single core, dual cores, triple cores, quad cores, or a multiple thereof.
  • At least one processor (not shown) included in the control unit 240 may control operations performed in the display device 200, and may control other components included in the display device 200 to perform predetermined operations. Accordingly, even where the controller 240 is described as controlling certain operations, it will be apparent that at least one processor (not shown) included in the controller 240 performs that control.
  • FIG. 3 is another block diagram illustrating a display device according to an exemplary embodiment of the present disclosure.
  • the same configuration as in the display device 200 illustrated in FIG. 2 is illustrated using the same reference numerals. Therefore, in describing the display device 300 illustrated in FIG. 3, detailed descriptions overlapping with those in FIG. 2 are omitted.
  • the display device 300 may further include at least one of a memory 250, a user interface 260, an input/output unit 270, and a notification unit 280, compared to the display device 200.
  • The memory 250 may include at least one program necessary for the display device 300 to operate, or at least one instruction necessary for executing the at least one program. The above-described operations may be performed by at least one processor (not shown).
  • the memory 250 may store the first information acquired by the sensing unit 230 under the control of the control unit 240.
  • the user interface 260 may receive predetermined data or predetermined commands from the user. Also, the user interface 260 may be formed as a touch screen formed integrally with the display 210. As another example, the user interface 260 may include a user input device such as a pointer, mouse, and keyboard.
  • The input/output unit 270 may receive, under the control of the control unit 240, video (e.g., moving images), audio (e.g., voice, music, etc.), and additional information (e.g., an EPG, etc.) from outside the display device 300.
  • The input/output unit 270 may include at least one of an HDMI port (High-Definition Multimedia Interface port, not shown), a component jack (not shown), a PC port (not shown), and a USB port (not shown).
  • the input/output unit 270 may include a combination of an HDMI port (not shown), a component jack (not shown), a PC port (not shown), and a USB port (not shown).
  • the input/output unit 270 may include a port for connecting at least one of a mouse and a keyboard.
  • the notification unit 280 may include a notification element capable of outputting a visual and audible notification to the user.
  • the notification unit 280 may include at least one LED element (not shown) that can output different colors.
  • the notification unit 280 may output a notification signal indicating whether the posture of the user is correct based on the first information under the control of the control unit 240.
  • The control unit 240 may determine whether an obstacle exists in the direction of movement of the display 210 based on the sensing result of the sensing unit 230, and may control a notification signal to be output when it is determined that an obstacle exists.
  • Specifically, the notification unit 280 may output a notification signal indicating that there is an obstacle in the direction of movement of the display 210, under the control of the control unit 240.
  • FIG. 4 is a flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 4 may be a flow chart for describing operations performed in the display device 200 or 300 according to the exemplary embodiment of the present disclosure described with reference to FIGS. 2 to 3.
  • the control method 400 of the display device according to the exemplary embodiment of the present disclosure may be performed through the display device 200 or 300 described with reference to FIGS. 2 to 3. Therefore, the control method 400 of the display device may include the same configuration features as the display device 200 or 300 described above.
  • control method 400 of the display device will be described in detail with reference to the display device 200 described with reference to FIG. 2.
  • the control method 400 of the display device acquires first information indicating a user's posture (S410).
  • the operation of step S410 may be performed through the control unit 240 and the detection unit 230.
  • For example, a TOF camera included in the sensing unit 230 may, under the control of the control unit 240, emit light, for example an infrared signal, toward the front of the display 210, and measure the time it takes for the emitted infrared signal to be reflected by the subject and return.
  • the control unit 240 may obtain the first information indicating the user's posture based on the result detected by the detection unit 230, for example, time information (S410).
  • FIG. 5 is a view for explaining the operation of the display device according to an embodiment of the present disclosure.
  • the same configuration as in FIG. 2 is illustrated using the same reference numerals.
  • the user 501 may perform a document task or a predetermined task using the display device 200.
  • the user 501 may perform a predetermined task while operating the keyboard 530 while looking at the display device 200 which is the monitor of the computer.
  • the keyboard 530 may be included in the user interface 260 described above.
  • the keyboard 530 may be a separate external device from the display device 200, and in this case, the keyboard 530 may be connected to the display device 200 through the input/output unit 270.
  • At least one sensor included in the sensing unit 230 may be a TOF camera.
  • For example, the TOF camera 230 may photograph the area in front of the display 210 and capture a depth image corresponding to the space in front of the display 210.
  • In FIG. 5, an example is illustrated in which the sensing unit 230 is attached to the stand 290 supporting the display 210 included in the display device 200.
  • FIG. 9 is a view for explaining the operation of the display device according to an embodiment of the present disclosure.
  • the sensing unit 230 may be formed at the bottom of the front of the display 210 so as to photograph the posture of the user 501.
  • the sensing unit 230 may be formed at a predetermined position 230-1 on the front top of the display 210 so as to photograph the posture of the user 501.
  • the sensing unit 230 may be disposed at various positions to photograph the front surface of the display 210 in addition to the positions shown in FIGS. 5 and 9.
  • The first information obtained in step S410 may include information indicating the distance from the front surface of the display 210 to each of the neck and face of the user 501.
  • the first information may be a depth image acquired by the sensing unit 230.
  • As another example, the first information may include the distance between the display 210 and the neck of the user 501 and the distance between the display 210 and the face of the user 501, each calculated based on the depth image.
  • As another example, the first information may include the distance between the user's neck and face, which serves as a criterion for determining whether the user's posture is good or bad. Specifically, when the user's face protrudes farther forward than the user's shoulders, the posture may be determined to be bad. That is, as the distance between the front of the face 502 and the area 503 below the neck increases, the user's posture becomes one likely to induce turtle neck syndrome. Accordingly, the first information may include the distance between the user's neck and face, and the control unit 240 may determine, based on the first information, whether to move the display 210 to correct the user's posture.
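The good-posture criterion described above (the face protruding forward of the neck line) can be sketched as a simple threshold test on the two depth values d1 (to the face) and d2 (to the area below the neck): a face that sits closer to the display than the neck by more than some margin indicates a forward-head posture. The threshold and function names are assumptions for illustration; the disclosure does not specify numeric values.

```python
FORWARD_HEAD_THRESHOLD_MM = 40  # assumed margin, not from the disclosure

def needs_correction(face_depth_mm: float, neck_depth_mm: float) -> bool:
    """Return True when the face protrudes past the neck by more than the
    threshold, i.e. a forward-head ('turtle neck') posture.

    face_depth_mm: measured depth d1 from the sensor to the user's face
    neck_depth_mm: measured depth d2 from the sensor to below the neck
    """
    d3 = neck_depth_mm - face_depth_mm  # face closer to the display => d3 > 0
    return d3 > FORWARD_HEAD_THRESHOLD_MM
```

For example, a face 60 mm in front of the neck line would be flagged as needing correction, while a 20 mm protrusion would not.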
  • As described above, the controller 240 may generate, based on the first information, a movement trajectory of the display over time corresponding to the user.
  • The movement of the display 210 may then be controlled according to the generated movement trajectory. The operation of generating the movement trajectory based on the first information is described in detail with reference to FIG. 8 below.
  • FIG. 8 is a flowchart illustrating in detail a first information acquisition operation among control methods of a display device.
  • step S410 illustrated in FIG. 4 may include steps S810 to S850. Also, the operations illustrated in FIG. 8 may be performed under the control of the control unit 240 of the display device 200.
  • the detection unit 230 acquires a detection result from a sensor, for example, a TOF camera (S810).
  • Specifically, the TOF camera 230 may transmit an infrared signal toward a subject, for example, a user present in front of the display 210, and detect the time at which the transmitted infrared signal is reflected by the user and returns.
  • The control unit 240 may obtain the depth value of the region corresponding to the user's neck and the depth value of the region corresponding to the user's face, based on the detection result of step S810 (S820). Specifically, the controller 240 may generate a depth image of the photographed subject based on the time detected in step S810. The TOF camera 230 may acquire the depth value d1 to the front of the user's face 502 and the depth value d2 to the area 503 below the user's neck (S820).
• The control unit 240 may respectively obtain the distance between the front surface of the display 210 and the face of the user 501 and the distance between the front surface of the display 210 and the neck of the user 501, based on the depth values obtained in step S820 and taking into account the positional difference between the TOF camera 230 and the front surface of the display 210 (S830). Since the positions of the TOF camera 230 and the front of the display 210 are known values, these two distances may be obtained by performing coordinate transformation or distance conversion operations on the depth values d1 and d2.
  • the controller 240 may obtain the distance d3 between the neck and the face of the user 501 using the distance value obtained in step S830 (S840).
• The control unit 240 may generate a movement trajectory of the display 210 over time, corresponding to the user 501, based on the distance d3 between the neck and the face of the user 501 obtained in step S840 (S850).
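As an illustration of steps S810 to S850, the sketch below derives the neck-to-face distance d3 from two TOF depth values. The camera offset value and the helper names are assumptions for illustration, not values given in the disclosure.

```python
# Hedged sketch of steps S810-S850: derive the neck-to-face distance d3
# from two TOF depth values. The camera offset and helper names are
# illustrative assumptions, not part of the disclosure.

CAMERA_OFFSET_MM = 20.0  # assumed depth of the TOF camera behind the display front

def distance_from_display_front(depth_mm: float) -> float:
    """Convert a TOF depth value into a distance measured from the
    front surface of the display (steps S820-S830)."""
    return depth_mm - CAMERA_OFFSET_MM

def neck_to_face_distance(d1_mm: float, d2_mm: float) -> float:
    """Step S840: d3 is the horizontal gap between the neck and the face.
    d1 = depth to the front of the face, d2 = depth to the neck below it;
    a larger d3 indicates a more pronounced turtle-neck posture."""
    face = distance_from_display_front(d1_mm)
    neck = distance_from_display_front(d2_mm)
    return neck - face

d3 = neck_to_face_distance(d1_mm=520.0, d2_mm=600.0)
print(d3)  # 80.0 mm of forward head protrusion
```

In step S850, a value of d3 such as this could then parameterize how far the display is moved toward the user.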
  • the movement trajectory represents the movement path of the display 210 over time.
• The control unit 240 may generate a movement trajectory corresponding to the user's posture based on the detection result of the detection unit 230. For example, if the detection result of the sensing unit 230 indicates that the user's posture is good, a movement trajectory in which the display 210 does not move may be generated. As another example, if the detection result of the sensing unit 230 indicates that the user's posture is bad, a movement trajectory may be generated along which the display 210 moves so as to induce the user's neck movement.
• Specifically, when the distance between the user's face and neck is large (that is, when the user's head is pushed farther forward), a movement path of the display 210 may be generated that moves the user's face in a direction that reduces the distance between the face and the neck.
• In step S420, the control method 400 of the display apparatus controls the display 210 to move in the first direction 515 such that the horizontal distance between the front surface of the display 210 and the user 501 decreases while the vertical distance between the front surface of the display 210 and the user 501 increases, based on the first information obtained in step S410 (S420).
  • the operation of step S420 may be performed by the control unit 240.
  • the control unit 240 may control the driving unit 220 so that the display 210 moves in the first direction 515.
  • the first direction 515 may be a diagonal direction toward the user 501 as shown in FIG. 5.
• That is, the horizontal distance between the front of the display 210 and the user 501 decreases, and the height of the display 210 increases so that the vertical distance between the front of the display 210 and the user 501 increases.
  • control unit 240 may control the driving unit 220 to adjust the inclination of the display 210 such that the front of the display 210 is inclined toward the user 501 as the above-described horizontal distance decreases.
  • the slope of the display 210-1 may have a larger value than the slope of the display 210.
  • step S430 may be performed by the controller 240.
  • the control unit 240 may control the driving unit 220 so that the display 210 moves in the second direction 517.
  • the first limit value may be a value corresponding to the moving distance of the display 210 such that the distance between the surface of the display 210 and the user 501 becomes the proximity limit value.
  • the configuration related to the proximity limit value will be described in detail below with reference to FIG. 12.
• FIGS. 6 and 7 are diagrams for explaining the structure of a display device according to an embodiment of the present disclosure. In FIGS. 6 and 7, the same components as in FIGS. 2 and 5 are illustrated using the same reference numerals.
• FIG. 6 shows the display device 200 shown in FIG. 5 viewed from the front direction 571.
• FIG. 7 shows the display device 200 shown in FIG. 6 viewed from the rear direction 572.
  • the driving unit 220 of the display device 200 may include a tilt adjusting unit 221 and a distance adjusting unit 222.
  • the tilt adjusting unit 221 may adjust the tilt of the display 210.
  • the tilt adjusted by the tilt adjusting unit 221 may mean a tilt angle indicating the angle at which the display 210 is tilted forward.
• The distance adjusting unit 222 includes a plurality of columns 610, 611, and 612, and is driven linearly in both directions to move the position of the display 210.
• The plurality of columns 610, 611, and 612 included in the distance adjusting unit 222 have a telescopic column shape and may move linearly in both directions, including the first direction 515 and the second direction 517 as shown. That is, among the plurality of columns 610, 611, and 612, the top column 610 may be inserted into the next lower column 611, and the column 611 may be inserted into the next lower column 612.
• The column 612 may be inserted into the case of the driving unit 220. That is, among the plurality of columns 610, 611, and 612, the top column 610 is the column that moves the farthest, and the lowest column 612 is the column that moves the shortest.
• The tilt adjustment unit 221 may include a plurality of gears rotating at a first gear ratio having a fixed value, so that the tilt of the display 210 increases in proportion to the distance that the plurality of columns 610, 611, and 612 move. Here, the plurality of gears included in the tilt adjusting unit 221 may be referred to as a gear set.
  • the tilt adjusting unit 221 may include a connection element connected to a bracket of the display 210.
  • the bracket of the display 210 may satisfy a common standard.
  • the connection element included in the tilt adjustment unit 221 may have a form that can be connected to or disconnected from the bracket of the display 210 having a common standard.
  • the gear set included in the tilt adjustment unit 221 may be connected to the top column 610 of the distance adjustment unit 222.
• When the plurality of columns 610, 611, and 612 are all extended, the display 210 moves farthest in the first direction 515; when the plurality of columns 610, 611, and 612 are all inserted into the case of the driving unit 220, the display 210 moves farthest in the second direction 517.
  • FIG. 7 shows the rear surface of the display device 200 illustrated in FIG. 6, and the same configuration as in FIG. 6 is illustrated using the same reference numerals.
• In FIG. 7, a case is illustrated in which the plurality of columns 610, 611, and 612 included in the distance adjusting unit 222 are inclined at a fixed angle.
  • the degree of inclination of the plurality of columns 610, 211, and 612 included in the distance adjusting unit 222 may vary according to design specifications of the display device 200.
• The driving unit 220 may further include a separate angle adjusting element (not shown) so that the degree of inclination of the plurality of columns 610, 611, and 612 included in the distance adjusting unit 222 can be varied.
• Since the inclination adjusting unit 221 operates so that the inclination of the display 210 increases in proportion to the distance that the plurality of columns 610, 611, and 612 move, the display 210 may move along a path such as the first path 510 or the second path 511 shown in FIG. 5.
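The fixed-ratio coupling between column travel and display tilt described above can be sketched as a linear mapping. The ratio below is an assumed value, chosen only so that the maximum column travel (330 mm, per the later description of FIG. 14) yields the 28-degree tilt mentioned in the description of FIG. 12.

```python
# Illustrative sketch of the fixed-ratio coupling between column travel
# and display tilt. TILT_PER_MM is an assumed constant, not a value
# prescribed by the disclosure.

MAX_TRAVEL_MM = 330.0   # maximum column travel D3 (description of FIG. 14)
MAX_TILT_DEG = 28.0     # tilt value mentioned in the description of FIG. 12
TILT_PER_MM = MAX_TILT_DEG / MAX_TRAVEL_MM  # fixed "gear ratio"

def tilt_for_travel(travel_mm: float) -> float:
    """Tilt increases linearly in proportion to the distance the columns move."""
    return TILT_PER_MM * min(max(travel_mm, 0.0), MAX_TRAVEL_MM)

print(round(tilt_for_travel(0.0), 2))    # 0.0 degrees at the original position
print(round(tilt_for_travel(330.0), 2))  # 28.0 degrees at maximum extension
```

Because the ratio is fixed, the display follows a single repeatable path such as 510 or 511 regardless of how far the columns extend.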
  • FIG. 10 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 10 may also be a flowchart for describing operations performed in the display device 200 or 300 according to an embodiment of the present disclosure described with reference to FIGS. 2 to 7.
  • the control method 1000 of the display device according to the exemplary embodiment of the present disclosure may be performed through the display device 200 or 300 described with reference to FIGS. 2 to 3. Therefore, the control method 1000 of the display device may include features identical to those of the display device 200 or 300 described above.
• In FIG. 10, the same components as in FIG. 4 are illustrated using the same reference numerals, and a description overlapping with that of FIG. 4 is omitted.
  • control method 1000 of the display device may further include a step S1005 before performing the operation of step S410.
  • the control method 1000 of the display device may determine whether a user exists on the front surface of the display device 200 (S1005).
  • the operation of step S1005 may be performed through the control unit 240 and the detection unit 230.
• Specifically, a TOF camera (not shown) included in the sensing unit 230 may emit light, for example, an infrared signal, toward the front of the display 210 under the control of the control unit 240, and measure the time it takes for the emitted infrared signal to be reflected by a subject and return.
  • the control unit 240 may determine whether a user exists on the front surface of the display 210 based on the detection result of the TOF camera (not shown).
  • step S1005 when it is determined that a user exists on the front surface of the display device 200, specifically, the front surface of the display 210, the operation of step S410 may be performed.
• If it is determined that no user exists, the control method 1000 of the display device may be terminated without performing the operations of steps S410 to S430.
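The presence check of step S1005 can be sketched as a simple gate in front of the posture-correction routine. The depth threshold and function names below are assumptions for illustration only.

```python
# Minimal sketch of step S1005: gate the posture-correction routine on
# whether a user is detected in front of the display. The presence range
# and names are illustrative assumptions.

PRESENCE_RANGE_MM = 1200.0  # assumed: a subject within this range counts as present

def user_present(depth_samples_mm: list[float]) -> bool:
    """Return True if any TOF depth sample falls within the presence range."""
    return any(0.0 < d <= PRESENCE_RANGE_MM for d in depth_samples_mm)

def control_method_1000(depth_samples_mm: list[float]) -> str:
    if not user_present(depth_samples_mm):
        return "terminate"           # skip steps S410-S430
    return "run S410-S430"           # proceed with posture correction

print(control_method_1000([800.0, 2400.0]))  # run S410-S430
print(control_method_1000([2400.0]))         # terminate
```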
  • FIG. 11 is a view for explaining a user's viewing angle.
  • the same configuration as in FIG. 5 is illustrated using the same reference numerals.
  • the user 501 can stare at the screen 1100 output from the display 210 and process a predetermined task.
• The general field of view (FOV) 1110 that the user 501 can see without turning the neck from side to side may be approximately 35 degrees to each of the left and right.
• The preferred viewing area 1120, which is an area suitable for the user to view the screen, may be approximately 15 degrees to each of the left and right.
• In general, when the user's gaze moves, the user unconsciously moves the head to view the object. Specifically, when the object being gazed at moves, the head moves at a speed corresponding to, or slightly higher than, the movement speed of the object in order to pursue it with the gaze.
• In an embodiment of the present disclosure, focusing on the fact that movement of the user's head or neck occurs in order to follow the gaze when the object the user is staring at moves, the movement of the display 210 is controlled to correct the user's posture.
• Through experiments conducted for the present disclosure, it was found that the user's movement occurs as follows.
• First, the user moves the neck or head so that the screen of the display 210 is placed within the suitable viewing area 1120. When the display 210, such as a monitor, moves in a direction closer to the user and the viewing distance approaches the proximity limit value or less, the user 501 moves the neck or head in a direction away from the screen of the display 210 so as to keep the screen of the display 210 within the suitable viewing area 1120.
• The proximity limit value was obtained experimentally and was measured to be approximately 450 mm.
  • the proximity limit value may vary depending on the size of a screen output by the display 210, a physical condition such as a user's height, and an installation height of the display 210. For example, the larger the size of the screen output by the display 210, the larger the proximity limit value may be.
  • the first limit value may be a value corresponding to the moving distance of the display 210 such that the distance between the front surface of the display 210 and the user 501 becomes the proximity limit value.
  • the proximity limit value may be set to a value of 450 mm.
  • the first limit value may be a distance that the display 210 moves to a point at which the distance 1212 between the front surface of the display 210 and the user 501 becomes 450 mm.
• That is, the first limit value may be the distance that the plurality of columns 610, 611, and 612 move from the original position P1 to the point at which the distance 1212 between the front of the display 210 and the user 501 becomes 450 mm.
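One simple way to express the relation between the proximity limit and the first limit value is sketched below. For simplicity the sketch treats the travel as directly reducing the display-to-user distance, ignoring the diagonal decomposition of the path; the function name and the initial distance are illustrative assumptions.

```python
# Sketch of deriving the first limit value from the 450 mm proximity limit:
# the display may advance only until the display-front-to-user distance
# (1212) reaches the proximity limit. Diagonal path geometry is ignored
# here for simplicity; names and sample values are assumptions.

PROXIMITY_LIMIT_MM = 450.0

def first_limit(initial_user_distance_mm: float) -> float:
    """Distance the display may travel toward the user before the
    distance 1212 reaches the proximity limit."""
    return max(initial_user_distance_mm - PROXIMITY_LIMIT_MM, 0.0)

print(first_limit(700.0))  # 250.0 mm of allowed travel toward the user
```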
  • FIG. 12 is a diagram for describing a movement of a display according to an embodiment of the present disclosure.
  • the display 210 may move according to the movement trajectory 1240 to correct the posture of the user.
  • the original position of the display 210 may be the P1 position
  • the slope 1231 of the display 210 at the P1 position may be referred to as 0 degrees.
• Based on the first information, the controller 240 controls the display 210 to move in the first direction 515 such that the vertical distance between the user and the front surface of the display 210 increases while the horizontal distance between the user 501 and the front surface of the display 210 decreases. Specifically, based on the detection result of the sensing unit 230, when the distance between the user's face and neck is large (that is, when the user's head is pushed farther forward), a movement path of the display 210 may be generated that moves the user's face in a direction that reduces the distance between the face and the neck.
• That is, the display 210 may induce the user to move the face backward so that the distance d3 between the user's face and neck decreases to that of a correct posture.
• Here, the distance that the display 210 moves in the first direction may be calculated based on the position of the display 210 and on the viewing angle 1110 and suitable viewing area 1120 described in FIG. 11, so that the user moves the face in a direction away from the display 210.
• The control unit 240 of the display device 200 may cause the display 210 to move from the P1 position to the P5 position along the path 1240, based on the first information indicating the user's posture.
• When the position of the front side of the display 210 becomes P5, the distance from the eyes of the user 501_1 to the front of the display 210 may be 43 cm, and the user 501_1 may view the screen of the display 210 by moving the face backward in accordance with the gaze behavior described in FIG. 11.
• In this case, the distance between the neck and the face of the user 501_2 may be a value d32.
• The value d32 is smaller than the value d31 and may be a value corresponding to a correct posture.
• In this way, the movement of the display 210 may be controlled to guide the user to naturally move the neck or face into the correct posture. Accordingly, the inducement of the user's neck movement can be optimized for each individual user.
• The controller 240 may control the driving unit 220 (specifically, the tilt adjusting unit 221) so that the tilt of the display increases in proportion to the movement distance of the display 210 (specifically, the distance that the plurality of columns 610, 611, and 612 move).
• For example, the control unit 240 may control the driving unit 220 (specifically, the tilt adjusting unit 221) such that the tilt angle of the display increases linearly in proportion to the distance that the display 210 (specifically, the plurality of columns 610, 611, and 612) moves.
• For example, the controller 240 may control the driving unit 220 (specifically, the inclination adjustment unit 221) so that the inclination of the display 210 increases between the P2 position and the P3 position, reaching, for example, 28 degrees. That is, the controller 240 adjusts the tilt angle of the display 210 as the display 210 moves upward, so that the display 210 tilts to face downward as it moves upward; in this way, the movement of the display 210 can be controlled.
  • the user can continue to stare at the screen of the display 210 without feeling discomfort.
  • the movement trajectory of the display 210 may be set to an experimentally optimized value.
• For example, the movement trajectory of the display 210 may be set such that the angle 1250 formed by the path along which the display 210 moves is 30 to 60 degrees relative to the horizontal direction 1211. More specifically, the movement trajectory of the display 210 may be set such that the angle 1250 formed by the path along which the display 210 moves is 45 to 49 degrees relative to the horizontal direction 1211.
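For a trajectory angle 1250 in the preferred band, the diagonal motion decomposes into a horizontal advance toward the user and a vertical rise. The following sketch shows that decomposition; the function name and the sample travel distance are assumptions for illustration.

```python
# Sketch of decomposing the diagonal motion into horizontal and vertical
# components for a trajectory angle 1250 in the 45-49 degree band.
import math

def motion_components(travel_mm: float, angle_deg: float) -> tuple[float, float]:
    """Split a move of `travel_mm` along a path inclined `angle_deg`
    above the horizontal direction 1211 into (horizontal, vertical) parts."""
    a = math.radians(angle_deg)
    return travel_mm * math.cos(a), travel_mm * math.sin(a)

h, v = motion_components(220.0, 45.0)  # initial motion width at 45 degrees
print(round(h, 1), round(v, 1))  # 155.6 155.6 - equal advance and rise at 45 degrees
```

At 45 degrees the horizontal approach and the vertical rise are equal, which matches the description of simultaneously decreasing horizontal distance and increasing vertical distance.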
  • control unit 240 may control the driving unit 220 such that the display 210 moves at a first speed having a fixed value.
• The speed of movement of the display 210 (specifically, the speed at which the plurality of columns 610, 611, and 612 move) may be set to satisfy the condition that the user is not aware of the movement of the display 210.
• Experimentally, the above-described speed condition was obtained as 2 mm/s or less, and the acceleration condition as 10 mm/s² or less. However, these conditions may vary depending on the screen size of the display 210, the user's physical condition, and the like, and may be updated through experimental optimization.
  • control unit 240 may control the driving unit 220 such that the plurality of columns 610, 611, and 612 move at a speed of 2 mm/s or less.
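A motion profile satisfying the imperceptibility conditions above can be sketched as a speed ramp with the acceleration capped at 10 mm/s² and the cruise speed capped at 2 mm/s. The control-loop period is an assumed value.

```python
# Sketch of imperceptible motion: ramp the column speed with acceleration
# capped at 10 mm/s^2 and cruise speed capped at 2 mm/s, per the
# experimentally obtained conditions above. DT is an assumption.

MAX_SPEED = 2.0    # mm/s
MAX_ACCEL = 10.0   # mm/s^2
DT = 0.05          # s, assumed control-loop period

def step(position_mm: float, speed_mms: float) -> tuple[float, float]:
    """Advance one control tick, accelerating toward the speed cap."""
    speed_mms = min(speed_mms + MAX_ACCEL * DT, MAX_SPEED)
    return position_mm + speed_mms * DT, speed_mms

pos, v = 0.0, 0.0
for _ in range(100):  # 5 seconds of motion
    pos, v = step(pos, v)
print(v)  # 2.0 - speed settles at the cap
```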
• Experimentally, it was observed that, when the display 210 of the display device 200 reached a point in front of the user's face, specifically about 45 cm from the user's eyes, the user moved the head backward by moving the neck. Accordingly, the user corrected the turtle neck posture in response to the movement of the display 210 and assumed a correct posture. Therefore, in an embodiment of the present disclosure, the display 210 may be moved in a diagonal direction toward the user's position (e.g., 515) to induce correction of the user's turtle neck posture.
  • FIG. 13 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 13 may also be a flowchart for describing operations performed in the display device 200 or 300 according to the exemplary embodiment of the present disclosure described with reference to FIGS. 2 to 7.
  • the control method 1300 of the display device according to the exemplary embodiment of the present disclosure may be performed through the display device 200 or 300 described with reference to FIGS. 2 to 3. Therefore, the control method 1300 of the display device may include features identical to those of the display device 200 or 300 described above.
  • FIG. 13 the same configuration as in FIG. 4 is illustrated by using the same reference numerals, and thus a description overlapping with that in FIG. 4 is omitted.
• After step S420, the control method 1300 of the display apparatus may proceed to a subsequent step according to the result of determining whether the distance the display 210 has moved in the first direction 515 has reached the first limit value (S1325).
  • the operation of step S1325 may be performed by the control unit 240.
• The controller 240 may control the display 210 to continue moving in the first direction 515 until the distance the display 210 has moved in the first direction 515 reaches the first limit value.
• When the distance the display 210 has moved in the first direction 515 reaches the first limit value, the control method 1300 of the display device may control the display 210 to move in the second direction, which is the opposite of the first direction 515 (S430).
• Otherwise, the control method 1300 of the display device may control the display 210 to continue moving in the first direction 515.
• For example, when the display 210 moves in the first direction 515 and the distance 1212 between the front surface of the display 210 and the user 501 becomes 450 mm, the display 210 may be controlled to move in the second direction, which is the opposite of the first direction 515.
• After step S430, the control method 1300 of the display device may proceed to a subsequent step according to the result of determining whether the distance the display 210 has moved in the second direction has reached the second limit value (S1335).
  • the operation of step S1335 may be performed by the control unit 240.
  • the second limit value may be set based on a result of sensing the posture of the user 501.
  • the second limit value may be set to a value corresponding to the first limit value.
  • the second limit value may be set to the same value as the first limit value or a value below the first limit value.
  • the controller 240 may control the movement of the display 210 to end when the distance the display 210 moves in the second direction becomes the second limit value. In addition, the controller 240 may control the display 210 to continuously move in the second direction until the distance the display 210 moves in the second direction becomes the second limit value.
• That is, the control method 1300 of the display device may move the display 210 in the second direction until the display 210 reaches a distance at which the user can gaze at it comfortably.
  • the controller 240 may control the movement of the display 210 such that the display 210 repeatedly moves in the first direction and the second direction.
  • the control method 1300 of the display device may repeatedly perform the reciprocating movement of the display 210 performed in steps S420 and S430.
  • the reciprocating motion may refer to the movement of the display 210 moving in the first direction and then moving again in the second direction.
  • FIG. 14 is another diagram for describing movement of a display according to an embodiment of the present disclosure.
  • control unit 240 may generate a movement trajectory of the display according to the time corresponding to the user, based on the first information.
  • the movement of the display 210 may be controlled according to the generated movement trajectory.
  • the movement trajectory may be set for each user based on the first information indicating the user's posture.
  • the movement trajectory may have a shape of a graph 1401.
• In the graph 1401, the x-axis represents time, and the y-axis represents the moving distance of the display 210. That is, the y-axis may represent the total travel distance of the plurality of columns 610, 611, and 612 included in the driving unit 220.
• At time 0, the plurality of columns 610, 611, and 612 included in the driving unit 220 of the display 210 may all be inserted into the case in an overlapping manner. That is, at the origin, time 0, the total movement distance of the plurality of columns 610, 611, and 612 may be zero. Then, the display 210 may move in the first direction until the time t1. Referring to block 1450, the moving distance of the display 210 up to the time t1 may increase to D2 (e.g., 220 mm). For example, the plurality of columns 610, 611, and 612 may move the predetermined distance D2 1410 from the point P0.
• Then, based on the time t1, the display 210 may change its movement direction from the first direction to the second direction, and move in the second direction until the time t2. Accordingly, the moving distance of the display 210 up to the time t2 may decrease to D1. Subsequently, based on the time t2, the display 210 may change its movement direction from the second direction to the first direction, and move in the first direction until the time t4. Accordingly, the moving distance of the display 210 may increase to D3 (e.g., 330 mm) by the time t4. Then, based on the time t4, the display 210 may change its movement direction from the first direction to the second direction, and move in the second direction until the time t6. Accordingly, the moving distance of the display 210 up to the time t6 may decrease to D1.
  • the display apparatus 200 may repeatedly perform movement in the first direction and movement in the second direction.
  • the D2 value may correspond to the initial motion width 1410
  • the D3 value may correspond to the maximum motion width 1420.
• The maximum movement width 1420 is the maximum distance that the display 210 can move, and may be the movement distance corresponding to the case where each of the plurality of columns 610, 611, and 612 protrudes as far as possible.
  • the movement width of the display 210 may be gradually increased so that the user can perform neck movement naturally without feeling a burden.
  • the display 210 may be initially moved to the initial motion width 1410, and then the display 210 may be moved to the maximum motion width 1420.
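The gradually widening motion of graph 1401 can be sketched as a simple per-cycle width schedule: the first reciprocation uses the initial motion width D2 and later ones the maximum width D3. The schedule function is an illustrative assumption; the disclosure only requires that the width increase gradually.

```python
# Sketch of the widening motion of graph 1401: an initial motion width D2
# (220 mm) followed by reciprocations at the maximum width D3 (330 mm),
# so the user's neck movement grows without feeling burdensome.

D2_MM = 220.0  # initial motion width 1410
D3_MM = 330.0  # maximum motion width 1420

def motion_width(cycle_index: int) -> float:
    """First reciprocation uses the initial width; later ones the maximum."""
    return D2_MM if cycle_index == 0 else D3_MM

print([motion_width(i) for i in range(3)])  # [220.0, 330.0, 330.0]
```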
  • FIG. 15 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 15 may also be a flowchart for describing operations performed in the display device 200 or 300 according to the exemplary embodiment of the present disclosure described with reference to FIGS. 2 to 7.
  • the control method 1500 of the display device according to the exemplary embodiment of the present disclosure may be performed through the display device 200 or 300 described with reference to FIGS. 2 to 3. Therefore, the control method 1500 of the display device may include the same configuration features as the display device 200 or 300 described above.
  • FIG. 15 the same configuration as in FIGS. 4 and 13 is illustrated using the same reference numerals, and thus, overlapping descriptions with those in FIGS. 4 and 13 will be omitted.
• The control unit 240 may determine whether an obstacle is present in front of the display 210 based on the detection result of the detection unit 230, and control the movement of the display 210 according to whether the obstacle is present.
  • step S1525 may be performed through the sensing unit 230 and the control unit 240.
• When the sensing unit 230 detects an object in front of the display 210, the control unit 240 may determine that an obstacle exists. For example, if an object is detected within 100 mm from the front of the display 210, the control unit 240 may determine that an obstacle exists.
• Subsequently, the operation of step S1525 may be performed.
• If it is determined that no obstacle exists, the control method 1500 of the display device may move the display 210 in the first direction until the distance moved in the first direction reaches the first limit value. Then, step S430 may be performed to move the display 210 in the second direction, which is the opposite of the first direction.
• In the control method 1500 of the display device, since the remaining operations are the same as in FIG. 13, detailed descriptions thereof are omitted.
  • FIG. 16 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 16 may also be a flowchart for describing operations performed in the display device 200 or 300 according to the exemplary embodiment of the present disclosure described with reference to FIGS. 2 to 7.
• The control method 1600 of the display device according to the exemplary embodiment of the present disclosure may be performed through the display device 200 or 300 described with reference to FIGS. 2 to 3. Therefore, the control method 1600 of the display device may include features identical to those of the display device 200 or 300 described above.
• In FIG. 16, the same components as in FIGS. 4, 13, and 15 are illustrated using the same reference numerals, and descriptions overlapping with those of FIGS. 4, 13, and 15 are omitted.
• When it is determined that an obstacle exists, the control method 1600 of the display device may output a notification signal informing the user of the presence of the obstacle (S1625).
  • the controller 240 may control the notification unit 280 to output a notification signal informing of the presence of the obstacle.
  • the notification unit 280 may include at least one LED element as described above.
• For example, the controller 240 may control red light to be output through the LED element.
• When no obstacle exists, the LED element may maintain a state in which no light is output or light of another color (e.g., green light) is output.
• Alternatively, the control unit 240 may control the red light to be output in a blinking state.
  • control unit 240 may control a user interface screen including a message indicating that an obstacle exists to be output through the display 210.
  • control unit 240 may control to output a voice message indicating that an obstacle exists through a speaker (not shown) provided inside or outside the display device 200.
• Through the notification signal output in step S1625, the user may recognize that the obstacle, for example, a drink cup, should be removed, and may move the drink cup to a position away from the front of the display 210.
  • control method 1600 of the display device may determine whether there is an obstacle again (S1630). Since the operation of step S1630 corresponds to the operation of step S1525 described with reference to FIG. 15, detailed description is omitted.
• That is, if it is determined in step S1525 that an obstacle exists, the movement of the display 210 may be temporarily stopped. Then, after the notification signal indicating that an obstacle exists is output and the obstacle is no longer detected, the movement of the display 210 may be resumed.
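The obstacle-handling flow of FIG. 16 (pause on detection, notify in step S1625, re-check in step S1630, resume once clear) can be sketched as follows. The 100 mm threshold comes from the example above; the other names and the log strings are assumptions for illustration.

```python
# Sketch of the obstacle flow of FIG. 16: pause on detection, notify
# (S1625), re-check (S1630), and resume once clear. The 100 mm threshold
# is from the example above; other names are illustrative assumptions.

OBSTACLE_RANGE_MM = 100.0

def obstacle_present(nearest_object_mm: float) -> bool:
    return nearest_object_mm <= OBSTACLE_RANGE_MM

def handle_motion(readings_mm: list[float]) -> list[str]:
    """Return the log of actions for a sequence of sensor readings."""
    log = []
    for r in readings_mm:
        if obstacle_present(r):
            log += ["pause", "notify (red LED / message)"]  # S1625
        else:
            log.append("move")  # resume movement once the path is clear
    return log

print(handle_motion([500.0, 80.0, 500.0]))
# ['move', 'pause', 'notify (red LED / message)', 'move']
```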
  • FIG. 17 is another flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure.
  • FIG. 17 may also be a flowchart for describing operations performed in the display device 200 or 300 according to the exemplary embodiment of the present disclosure described with reference to FIGS. 2 to 7.
  • the control method 1700 of the display device according to the exemplary embodiment of the present disclosure may be performed through the display device 200 or 300 described with reference to FIGS. 2 to 3. Therefore, the control method 1700 of the display device may include the same configuration features as the display device 200 or 300 described above.
  • FIG. 17 the same configuration as in FIG. 4 is illustrated by using the same reference numerals, and thus, overlapping description with that in FIG. 4 is omitted.
  • the controller 240 may control the movement of the display 210 such that the display 210 repeatedly moves in the first direction and the second direction.
• When the operation of step S430 has been performed and the distance moved in the second direction corresponds to the second limit value, the reciprocating motion in the first direction and the second direction may be performed repeatedly (S1740). That is, the control method 1700 of the display device may repeatedly perform the operations of steps S410, S420, and S430 until the reciprocating motion has been repeated N times.
• The control unit 240 may control the first information to be obtained periodically, periodically determine whether the user's posture requires correction based on the first information, and control the movement of the display 210 based on the determination result. Specifically, the control unit 240 may control the first information to be obtained each time the reciprocating motion is performed once, or to be obtained continuously.
  • control method 1700 of the display device may repeatedly perform a reciprocating motion for a predetermined time (S1740).
  • the control method 1700 of the display device may repeatedly perform the operations of steps S410, S420, and S430 for a set time (eg, 30 minutes or 50 minutes).
  • the control method 1700 of the display device may repeatedly perform the reciprocating movement of the display 210 performed in steps S420 and S430.
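The repetition policy of FIG. 17 (repeat the S410-S430 reciprocation either N times or for a set session length, re-reading the first information each cycle) can be sketched as below. The parameter values and the per-cycle duration are assumptions for illustration.

```python
# Sketch of the repetition policy of FIG. 17: repeat the S410-S430
# reciprocation either N times or for a set session length. Parameter
# values and the per-cycle duration are illustrative assumptions.

def run_session(n_max: int, cycle_minutes: float, session_minutes: float) -> int:
    """Count how many reciprocations fit before either bound is hit."""
    cycles, elapsed = 0, 0.0
    while cycles < n_max and elapsed + cycle_minutes <= session_minutes:
        # S410: re-acquire first information; S420/S430: one reciprocation
        cycles += 1
        elapsed += cycle_minutes
    return cycles

print(run_session(n_max=10, cycle_minutes=6.0, session_minutes=30.0))  # 5
```

Here the 30-minute session bound stops the loop before the N-times bound, matching the alternative described for a set time (e.g., 30 or 50 minutes).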
  • FIG. 18 is a diagram illustrating a specific configuration of a driving unit in a display device according to an exemplary embodiment of the present disclosure.
  • the tilt adjusting unit 221 may include a plurality of gears 1811, 1822, 1823, and 1824 rotating at a first gear ratio having a fixed value, such that the tilt of the display 210 increases in proportion to the distance that the plurality of columns 610, 211, and 612 move.
  • the plurality of gears 1811, 1822, 1823, and 1824 included in the tilt adjusting unit 221 may be referred to as a gear set.
  • the plurality of gears 1811, 1822, 1823, and 1824 may be engaged with each other to rotate clockwise or counterclockwise.
  • block 1860 is a view of the display 210 from the right side, and block 1870 is a view of the display 210 from the left side.
  • the tilt adjusting unit 221 may include a connecting element 1815 connected to the bracket 1810 of the display 210.
  • the gear set included in the tilt adjustment unit 221 may be connected to the top column 610 of the distance adjustment unit 222.
  • the first gear (Gear1) 1811 of the gear set included in the tilt adjusting unit 221 is coupled with a pulley 1835 connected to the top column 610 among the plurality of columns included in the distance adjusting unit 222, and the first gear 1811 may rotate in the direction 1842 as the top column 610 moves in the first direction (specifically, the first direction 515 in FIG. 5).
  • the second gear 1822 coupled with the first gear 1811 rotates in the direction 1841.
  • the third gear 1823 coupled with the second gear 1822 rotates in the direction 1841.
  • the fourth gear 1824 coupled with the third gear 1823 rotates in the direction 1843.
  • accordingly, the display 210 may be tilted downward. That is, as the top column 610 moves in the first direction (specifically, the first direction 515 in FIG. 5), the plurality of gears 1811, 1822, 1823, and 1824 included in the gear set rotate continuously, so that the display 210 is tilted downward.
  • the plurality of gears 1811, 1822, 1823, and 1824 may rotate at the first gear ratio, which has a fixed value, so that the tilt of the display 210 increases in proportion to the distance that the plurality of columns 610, 211, and 612 move.
  • the first gear ratio may correspond to a reduction ratio of 1/25.
  • the tilt adjusting unit 221 may be any unit that operates to increase the tilt of the display 210 in proportion to the distance that the plurality of columns 610, 211, and 612 included in the distance adjusting unit 222 move. Accordingly, the tilt adjusting unit 221 may be implemented in a wide variety of forms, and the configuration of the tilt adjusting unit 221 illustrated in FIG. 18 is only exemplary.
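  • the fixed-ratio relationship above (tilt proportional to column travel) can be sketched numerically. Only the 1/25 reduction ratio comes from the description; the pulley radius and the function name `display_tilt_deg` are illustrative assumptions.

```python
import math

PULLEY_RADIUS_MM = 10.0        # assumed radius of pulley 1835
REDUCTION_RATIO = 1.0 / 25.0   # first gear ratio (fixed value, from the text)

def display_tilt_deg(column_travel_mm):
    """Tilt angle produced by a given linear column travel.

    The pulley converts linear travel into rotation (angle = travel /
    radius), and the gear set scales that rotation by the fixed
    reduction ratio, so the tilt grows linearly with travel."""
    pulley_angle_rad = column_travel_mm / PULLEY_RADIUS_MM
    return math.degrees(pulley_angle_rad * REDUCTION_RATIO)
```

Because the ratio is fixed, doubling the column travel doubles the tilt, which is the proportionality the tilt adjusting unit is designed to provide.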
  • FIG. 19 is another diagram illustrating a specific configuration of a driving unit in a display device according to an exemplary embodiment of the present disclosure.
  • the same configuration as in FIG. 18 is illustrated using the same reference numerals. Therefore, descriptions overlapping with those in FIG. 18 are omitted.
  • the distance adjusting unit 222 includes a plurality of columns 610, 211, and 612, and is linearly driven in both directions to move the position of the display 210.
  • the top column 610 is the column that moves the farthest, and the bottom column 612 may be the column that moves the shortest.
  • through the open-end belt 1910, which connects the lower end of the lowermost column 612 and the uppermost column 610, the plurality of columns can move in the first direction or the second direction.
  • the distance adjusting unit 222 may be connected to the tilt adjusting unit 221 through the first gear 1811 coupled with the pulley 1835.
  • the open-end belt 1910 may be implemented as a timing belt, a V-belt, or a round belt.
  • as the pulley 1835 connected to the open-end belt 1910 rotates, each of the plurality of gears 1811, 1822, 1823, and 1824 included in the tilt adjusting unit 221 rotates, and thereby the tilt of the display 210 may be adjusted.
  • one end of the open-end belt 1910 is connected to the belt fixing block 1920 at the lower end of the lowest column 612, so that this end of the belt may have a fixed position.
  • the other end of the open end belt 1910 may be connected to a spring 1940 connected to a lower end of the top column 610.
  • the spring 1940 may be implemented as various types of spring, such as a tension, compression, or static-load spring.
  • each of the plurality of columns may linearly move in a direction induced by the guide rail 1930, for example, a first direction or a second direction.
  • the plurality of columns 610, 211, and 612 included in the distance adjusting unit 222 may be any columns that have a telescopic shape and can move linearly in both directions. Accordingly, the distance adjusting unit 222 may be implemented in a wide variety of forms, and the configuration of the distance adjusting unit 222 illustrated in FIG. 19 is only exemplary.
  • FIG. 20 is another block diagram illustrating a display device according to an exemplary embodiment of the present disclosure.
  • the display apparatus 2000 may correspond to the display apparatus 200 or 300 described with reference to FIGS. 1 to 19.
  • the display 2015, the sensing unit 2060, the control unit 2080, the memory 2090, and the input/output unit 2070 of the display apparatus 2000 may correspond, respectively, to the display 210, the sensing unit 230, the control unit 240, the memory 250, and the input/output unit 270 of the display devices 200 and 300 illustrated in FIGS. 2 and 3.
  • the display apparatus 2000 may further include a configuration corresponding to the user interface 260 illustrated in FIG. 3 in addition to the configurations illustrated in FIG. 20.
  • the display apparatus 2000 includes a video processing unit 2010, a display 2015, an audio processing unit 2020, an audio output unit 2025, a power supply unit 2030, a communication unit 2050, a sensing unit 2060, an input/output unit 2070, a control unit 2080, and a memory 2090. Also, the display apparatus 2000 may further include a tuner unit 2040.
  • the video processing unit 2010 performs processing on the video data received by the display apparatus 2000.
  • the video processing unit 2010 may perform various image processing such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion for video data.
  • when a record request for the video data processed by the video processing unit 2010 is received, the control unit 2080 may encrypt the video data and control it to be written to a memory device (not shown) included in the control unit 2080 or the memory 2090, for example, a RAM (not shown).
  • the display 2015 displays a video included in the broadcast signal received through the tuner 2040 on the screen under the control of the controller 2080. Also, the display 2015 may display content (eg, an image) input through the communication unit 2050 or the input/output unit 2070.
  • the display 2015 may output an image stored in the memory 2090 under the control of the controller 2080.
  • the display 2015 may display a voice UI (e.g., including a voice command guide) for performing a voice recognition task corresponding to voice recognition, or a motion UI (e.g., including a user motion guide for motion recognition) for performing a motion recognition task corresponding to motion recognition.
  • the audio processing unit 2020 performs processing on audio data.
  • the audio processing unit 2020 may perform various processes such as decoding or amplification of audio data, noise filtering, and the like. Meanwhile, the audio processing unit 2020 may include a plurality of audio processing modules to process audio corresponding to a plurality of contents.
  • the audio output unit 2025 outputs audio included in the broadcast signal received through the tuner unit 2040 under the control of the control unit 2080.
  • the audio output unit 2025 may output audio (eg, voice, sound) input through the communication unit 2050 or the input/output unit 2070.
  • the audio output unit 2025 may output audio stored in the memory 2090 under the control of the control unit 2080.
  • the audio output unit 2025 may include at least one of a speaker 2026, a headphone output terminal 2027, or an S/PDIF (Sony/Philips Digital Interface) output terminal 2028. Also, the audio output unit 2025 may include a combination of the speaker 2026, the headphone output terminal 2027, and the S/PDIF output terminal 2028.
  • the power supply unit 2030 supplies power input from an external power source to the components 2010 to 2090 inside the display apparatus 2000 under the control of the control unit 2080.
  • alternatively, the power supply unit 2030 may supply power output from one or more batteries (not shown) located inside the display apparatus 2000 to the internal components 2010 to 2090 under the control of the control unit 2080.
  • the tuner unit 2040 may select, from among many radio wave components, only the frequency of the channel to be received by the display apparatus 2000 by tuning it through amplification, mixing, and resonance of a broadcast signal received via wired or wireless communication.
  • the broadcast signal includes audio, video, and additional data (e.g., an Electronic Program Guide (EPG)).
  • the tuner unit 2040 may selectively receive a broadcast signal and/or a video signal received corresponding to a predetermined channel.
  • the tuner unit 2040 may receive a broadcast signal in a frequency band corresponding to a channel number (eg, cable broadcast number 506) according to a user input.
  • the user input may be a control signal received from an external control device (not shown), for example, a remote controller (not shown); it may be a channel number input, a channel up-down input, or a channel input on an EPG screen.
  • the user input may be an input for generating a predetermined event.
  • the tuner unit 2040 may receive broadcast signals from various sources such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, and Internet broadcasting.
  • the tuner unit 2040 may receive a broadcast signal from a source such as analog broadcast or digital broadcast.
  • the broadcast signal received through the tuner unit 2040 is decoded (e.g., audio decoding, video decoding, or additional data decoding) and separated into audio, video, and/or additional data.
  • the separated audio, video and/or additional information may be stored in the memory 2090 under the control of the control unit 2080.
  • the display apparatus 2000 may include one tuner unit 2040 or a plurality of tuner units. According to an embodiment, when a plurality of tuner units 2040 are provided, a plurality of broadcast signals may be output to a plurality of windows constituting a multi-window screen provided on the display 2015.
  • the tuner unit 2040 may be implemented as all-in-one with the display device 2000, as a separate device having a tuner unit electrically connected to the display device 2000 (e.g., a set-top box, not shown), or as a tuner unit (not shown) connected to the input/output unit 2070.
  • the communication unit 2050 may connect the display device 2000 with an external device (for example, an audio device) under the control of the control unit 2080.
  • the control unit 2080 may transmit/receive content to/from an external device connected through the communication unit 2050, download an application from the external device, or perform web browsing.
  • the communication unit 2050 may access the network and receive content from an external device (not shown).
  • the communication unit 2050 may include at least one of a short-range communication module (not shown), a wired communication module (not shown), and a mobile communication module (not shown).
  • in FIG. 20, the communication unit 2050 including one of a wireless LAN 2051, a Bluetooth 2052, and a wired Ethernet 2053 is illustrated as an example.
  • the communication unit 2050 may include a combination of the wireless LAN 2051, the Bluetooth 2052, and the wired Ethernet 2053. Also, the communication unit 2050 may receive a control signal of a control device (not shown) under the control of the control unit 2080.
  • the control signal may be implemented as a Bluetooth type, an RF signal type or a Wi-Fi type.
  • the communication unit 2050 may further include short-range communication modules other than Bluetooth, for example, an NFC (near field communication) module (not shown) and a separate BLE (Bluetooth Low Energy) module (not shown).
  • the sensing unit 2060 detects the user's voice, the user's image, or the user's interaction.
  • the microphone 2061 receives the user's uttered voice.
  • the microphone 2061 converts the received voice into an electrical signal and outputs it to the controller 2080.
  • the user voice may include, for example, a voice corresponding to a menu or function of the display device 2000.
  • the recognition range of the microphone 2061 is recommended to be within 4 m from the microphone 2061 to the user's position, and may vary according to the loudness of the user's voice and the surrounding environment (e.g., speaker sound, ambient noise).
  • the microphone 2061 may be implemented integrally with, or separately from, the display device 2000.
  • the separated microphone 2061 may be electrically connected to the display device 2000 through the communication unit 2050 or the input/output unit 2070.
  • the microphone 2061 may be excluded depending on the performance and structure of the display device 2000.
  • the camera unit 2062 receives an image (eg, a continuous frame) corresponding to a motion of a user including a gesture in the camera recognition range.
  • the recognition range of the camera unit 2062 may be a distance of 0.1 to 5 m from the camera unit 2062 to the user.
  • the user motion may include, for example, a motion of a body part of the user, such as the user's face, facial expression, hand, fist, or finger.
  • the camera unit 2062 may convert the received image into an electrical signal under the control of the controller 2080 and output the converted image to the controller 2080.
  • the control unit 2080 may select a menu displayed on the display apparatus 2000 using the recognition result of the received motion or perform control corresponding to the motion recognition result. For example, it may include channel adjustment, volume adjustment, and indicator movement.
  • the camera unit 2062 may include a lens (not shown) and an image sensor (not shown).
  • the camera unit 2062 may support optical zoom or digital zoom using a plurality of lenses and image processing.
  • the recognition range of the camera unit 2062 may be variously set according to the angle of the camera and environmental conditions.
  • a 3D still image or a 3D motion may be received using a plurality of cameras.
  • the camera unit 2062 may be implemented integrally with, or separately from, the display device 2000.
  • a separate device (not shown) including the separated camera unit 2062 may be electrically connected to the display device 2000 through the communication unit 2050 or the input/output unit 2070.
  • the camera unit 2062 may be excluded depending on the performance and structure of the display device 2000.
  • the light receiving unit 2063 receives an optical signal (including a control signal) received from an external control device (not shown) through a light window (not shown) of the bezel of the display 2015.
  • the light receiving unit 2063 may receive an optical signal corresponding to a user input (eg, touch, pressing, touch gesture, voice, or motion) from a control device (not shown).
  • the control signal may be extracted from the received optical signal under the control of the control unit 2080.
  • the light receiving unit 2063 may receive a signal corresponding to a pointing position of a control device (not shown) and transmit it to the control unit 2080.
  • for example, when a user interface screen for receiving data or commands from a user is output through the display 2015 and the user wants to input data or commands to the display device 2000 through a control device (not shown), if the user moves the control device (not shown) while touching a finger to a touch pad (not shown) provided on the control device, the light receiving unit 2063 may receive a signal corresponding to the movement of the control device and transmit it to the control unit 2080.
  • the light receiving unit 2063 may receive a signal indicating that a specific button provided on a control device (not shown) is pressed and transmit it to the control unit 2080.
  • when the user presses a button-type touch pad (not shown) provided on a control device (not shown) with a finger, the light receiving unit 2063 may receive a signal indicating that the button-type touch pad (not shown) has been pressed and transmit it to the control unit 2080.
  • a signal indicating that the button-type touch pad (not shown) has been pressed may be used as a signal for selecting one of the displayed items.
  • under the control of the control unit 2080, the input/output unit 2070 receives video (e.g., a moving image), audio (e.g., voice, music), and additional data (e.g., an EPG) from outside the display device 2000.
  • the input/output unit 2070 may include one of an HDMI (High-Definition Multimedia Interface) port 2071, a component jack 2072, a PC port 2073, and a USB port 2074.
  • the input/output unit 2070 may include a combination of an HDMI port 2071, a component jack 2072, a PC port 2073, and a USB port 2074.
  • the control unit 2080 controls the overall operation of the display device 2000 and the signal flow between internal components (not shown) of the display device 2000 and processes data.
  • the controller 2080 may execute an operating system (OS) and various applications stored in the memory 2090 when a user input or a preset condition is satisfied.
  • the control unit 2080 may include a RAM (not shown) used to store signals or data input from outside the display device 2000 or as a storage area corresponding to various operations performed in the display device 2000, a ROM (not shown) in which a control program for controlling the display device 2000 is stored, and a processor (not shown).
  • the processor may include a graphic processor (not shown) for graphic processing corresponding to video.
  • the processor may be implemented as a system on chip (SoC) that integrates a core (not shown) and a GPU (not shown).
  • the processor (not shown) may include a single core, dual cores, triple cores, quad cores, or a multiple thereof.
  • a processor may include a plurality of processors.
  • the processor may be implemented as a main processor (not shown) and a subprocessor (not shown) operating in a sleep mode.
  • the graphic processing unit (not shown) generates a screen including various objects such as icons, images, and text using an operation unit (not shown) and a rendering unit (not shown).
  • the calculation unit (not shown) calculates attribute values, such as coordinate values, shape, size, and color, with which each object is to be displayed according to the layout of the screen, using the user interaction detected through the sensing unit 2060.
  • the rendering unit generates screens of various layouts including objects based on attribute values calculated by the calculation unit.
  • the screen generated by the rendering unit is displayed in the display area of the display 2015.
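  • the two-stage pipeline described above (a calculation unit that computes per-object attribute values from the detected interaction, followed by a rendering unit that composes the screen) can be sketched as follows. The object names, attribute fields, and the text-based "screen" are illustrative assumptions, not the actual graphic processor implementation.

```python
def calculate_attributes(objects, interaction):
    """Calculation-unit stage: derive attribute values (here, just
    coordinates) for each object from the layout and the detected
    user interaction."""
    attrs = []
    for i, name in enumerate(objects):
        # Assumed layout: objects spaced 10 units apart, shifted by the
        # interaction's horizontal offset.
        attrs.append({"name": name, "x": 10 * i + interaction.get("dx", 0), "y": 0})
    return attrs

def render(attrs):
    """Rendering-unit stage: compose a screen (here, a string) from the
    calculated attribute values."""
    return " | ".join(f'{a["name"]}@({a["x"]},{a["y"]})' for a in attrs)

screen = render(calculate_attributes(["icon", "text"], {"dx": 5}))
```

The separation mirrors the text: attribute calculation depends on the sensed interaction, while rendering depends only on the calculated attribute values.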
  • a control method of a display device according to an embodiment of the present disclosure may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. Also, an embodiment of the present disclosure may be a computer-readable recording medium on which one or more programs including instructions for executing a control method of a display device are recorded.
  • the computer-readable medium may include program instructions, data files, data structures, or the like alone or in combination.
  • the program instructions recorded on the medium may be specially designed and configured for the present invention, or may be known and usable by those skilled in computer software.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine code produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter.
  • also, a control method of a display device according to an embodiment of the disclosure may be implemented as a computer program product including a recording medium storing a program for performing operations of: obtaining a sentence composed of multiple languages; and, using a multi-language translation model, obtaining vector values corresponding to each of the words included in the multi-language sentence, converting the obtained vector values into vector values corresponding to a target language, and obtaining a sentence composed of the target language based on the converted vector values.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An embodiment of the present disclosure relates to a display device and a control method thereof, the display device correcting a user's posture and thereby preventing the development of forward head posture. A display device and a control method thereof according to an embodiment of the present disclosure allow a user to perform a neck exercise unawares while using the display device, and can thus induce good posture in the user.
PCT/KR2019/018076 2018-12-26 2019-12-19 Dispositif d'affichage pour correction de posture et procédé de commande correspondant WO2020138840A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0169918 2018-12-26
KR1020180169918A KR102594359B1 (ko) 2018-12-26 2018-12-26 자세 교정을 위한 디스플레이 장치 및 그의 제어 방법

Publications (1)

Publication Number Publication Date
WO2020138840A1 true WO2020138840A1 (fr) 2020-07-02

Family

ID=71129161

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/018076 WO2020138840A1 (fr) 2018-12-26 2019-12-19 Dispositif d'affichage pour correction de posture et procédé de commande correspondant

Country Status (2)

Country Link
KR (1) KR102594359B1 (fr)
WO (1) WO2020138840A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113568595A (zh) * 2021-07-14 2021-10-29 上海炬佑智能科技有限公司 基于ToF相机的显示器组件的控制方法、装置、设备和介质
US11822381B2 (en) 2019-11-27 2023-11-21 Dotheal Co., Ltd. Method of adjusting location and tilting of monitor

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020135088A1 (de) 2020-03-27 2021-09-30 Samsung Electronics Co., Ltd. Halbleitervorrichtung
KR102510288B1 (ko) * 2021-07-15 2023-03-16 가천대학교 산학협력단 거북목 예방을 위한 모니터 장치 및 제어 방법
KR102535143B1 (ko) * 2021-08-10 2023-05-26 이상준 디스플레이 지지대 및 이를 제어하는 제어시스템
KR102655829B1 (ko) 2021-11-08 2024-04-08 엄태경 자세 교정용 모니터 및 이의 조절 방법
WO2024043382A1 (fr) * 2022-08-26 2024-02-29 엘지전자 주식회사 Dispositif d'affichage et procédé de commande associé
WO2024043384A1 (fr) * 2022-08-26 2024-02-29 엘지전자 주식회사 Dispositif d'affichage et son procédé de commande
WO2024043383A1 (fr) * 2022-08-26 2024-02-29 엘지전자 주식회사 Dispositif d'affichage et son procédé de commande

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090079474A (ko) * 2008-01-17 2009-07-22 임차성 사용자 맞춤형 자동 디스플레이 위치 조절장치
KR20090123394A (ko) * 2008-05-28 2009-12-02 장진혁 플렉스 센서를 이용한 자동 디스플레이어 제어기
KR101150428B1 (ko) * 2010-08-17 2012-06-01 장철섭 모니터 아암의 목운동 기구
KR20120084132A (ko) * 2011-01-19 2012-07-27 김기홍 목디스크 예방을 위한 모니터 제어 로봇 및 제어 방법
KR101651099B1 (ko) * 2016-04-27 2016-08-25 김형석 거북목 방지용 모니터

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101954193B1 (ko) * 2012-11-26 2019-03-05 엘지전자 주식회사 어레이 카메라, 전기 기기 및 그 동작방법


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11822381B2 (en) 2019-11-27 2023-11-21 Dotheal Co., Ltd. Method of adjusting location and tilting of monitor
US11822379B2 (en) 2019-11-27 2023-11-21 Dotheal Co., Ltd. Monitor mounting device and information display system
US11822380B2 (en) 2019-11-27 2023-11-21 Dotheal Co., Ltd. Information display system including monitor mount
CN113568595A (zh) * 2021-07-14 2021-10-29 上海炬佑智能科技有限公司 基于ToF相机的显示器组件的控制方法、装置、设备和介质
CN113568595B (zh) * 2021-07-14 2024-05-17 上海炬佑智能科技有限公司 基于ToF相机的显示器组件的控制方法、装置、设备和介质

Also Published As

Publication number Publication date
KR102594359B1 (ko) 2023-10-27
KR20200080050A (ko) 2020-07-06

Similar Documents

Publication Publication Date Title
WO2020138840A1 (fr) Dispositif d'affichage pour correction de posture et procédé de commande correspondant
WO2018034462A1 (fr) Appareil d'affichage d'image, et procédé de commande correspondant
WO2020101453A1 (fr) Dispositif électronique et procédé de reconnaissance d'une scène audio
WO2016129784A1 (fr) Appareil et procédé d'affichage d'image
WO2017191978A1 (fr) Procédé, appareil et support d'enregistrement pour traiter une image
WO2014077541A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2017105015A1 (fr) Dispositif électronique et son procédé de fonctionnement
WO2012102592A2 (fr) Dispositif d'affichage d'image et son procédé d'utilisation
WO2016111464A1 (fr) Appareil et procédé d'affichage d'images
WO2016076570A1 (fr) Appareil et procédé d'affichage
WO2017086559A1 (fr) Dispositif d'affichage d'images et son procédé de fonctionnement
WO2018034436A1 (fr) Appareil électronique, et procédé de commande associé
WO2016104932A1 (fr) Appareil d'affichage d'images et procédé d'affichage d'images
WO2017119708A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2017171412A2 (fr) Appareil de traitement d'image, et terminal mobile
WO2016080700A1 (fr) Appareil d'affichage et procédé d'affichage
WO2016111455A1 (fr) Appareil et procédé d'affichage d'image
WO2014077509A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2016111487A1 (fr) Appareil d'affichage et procédé d'affichage
WO2021118225A1 (fr) Dispositif d'affichage et son procédé de fonctionnement
WO2019156408A1 (fr) Dispositif électronique et procédé de fonctionnement associé
WO2022191542A1 (fr) Procédé de fourniture de service d'entraînement à domicile et dispositif d'affichage mettant en œuvre ledit procédé de fourniture de service d'entraînement à domicile
WO2017159931A1 (fr) Dispositif électronique comprenant un écran tactile et procédé de commande du dispositif électronique
WO2017010602A1 (fr) Terminal et système le comprenant
WO2012124837A1 (fr) Appareil et procédé de reconnaissance de geste

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19901688

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19901688

Country of ref document: EP

Kind code of ref document: A1