WO2012043910A1 - Image display device and corresponding image display method - Google Patents

Image display device and corresponding image display method

Info

Publication number
WO2012043910A1
WO2012043910A1 (PCT/KR2010/006733)
Authority
WO
WIPO (PCT)
Prior art keywords
image
user
feature data
information
display
Prior art date
Application number
PCT/KR2010/006733
Other languages
English (en)
Korean (ko)
Inventor
최지호
이종수
홍승범
박은영
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to PCT/KR2010/006733
Publication of WO2012043910A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223 Cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts

Definitions

  • the present specification relates to an electronic device and a method for controlling the operation of the electronic device, and more particularly, to an image display device having an image display function and an image display method in such an image display device.
  • An image display device is a device having a function of displaying an image that a user can watch. The user can watch the broadcast through the image display device.
  • the video display device displays, on a display, a broadcast selected by the user from among the broadcast signals transmitted from broadcasting stations.
  • broadcasting is shifting from analog broadcasting to digital broadcasting worldwide.
  • Digital broadcasting refers to broadcasting for transmitting digital video and audio signals. Digital broadcasting is more resistant to external noise than analog broadcasting, so it has less data loss, is advantageous for error correction, has a higher resolution, and provides a clearer picture. In addition, unlike analog broadcasting, digital broadcasting is capable of bidirectional services.
  • It is a technical object of the present specification to provide an image display device and an image display method that compare feature data included in a reference image with feature data extracted from a captured image of the user, determine whether the feature data match, and provide the determination result to the user, thereby monitoring how faithfully the user follows the reference image and allowing the user to perform the motions in the reference image accurately, which enables systematic exercise management.
  • To solve the technical problem, the image display device includes a photographing unit that obtains a captured image of the user; a communication unit that receives a reference image including first feature data; an output unit that outputs the captured image and the reference image; and a control unit that extracts second feature data from the captured image, compares the first feature data with the second feature data, and controls the output unit to display an indicator representing the comparison result superimposed on the captured image or the reference image.
  • the output unit may display the captured image and the reference image at the same time.
  • the output unit may display the captured image and the reference image in a superimposed manner.
  • the output unit may display the captured image and the reference image in a picture in picture (PIP) manner.
  • the output unit may display the captured image and the reference image on each of divided regions of the screen.
  • the control unit may control the communication unit to transmit the comparison result to an external device.
  • the control unit may generate guide information based on the comparison result and control the output unit to output the guide information.
  • the guide information may include posture correction information.
  • the guide information may include information about the moving direction and/or the moving distance of a body part of the user.
  • the controller may compare the first feature data and the second feature data to determine a match, and control the output unit to output a message indicating the match.
  • the controller may measure the progressed exercise amount of the user based on the first feature data or the second feature data, and control the output unit to output the progressed exercise amount or control the communication unit to transmit the progressed exercise amount to an external device.
  • the image display apparatus may further include a storage unit configured to store a target exercise amount of the user, and the controller may control the output unit to output the target exercise amount.
  • when the progressed exercise amount reaches the target exercise amount, the controller may control the output unit to output a notification message or control the communication unit to transmit the notification message to an external device.
  • the communication unit may receive a broadcast image, and the controller may acquire health information related to content included in the broadcast image and control the output unit to output the health information or control the communication unit to transmit the health information to an external device.
  • the communication unit may monitor the connection between the video display device and an external device, and when the video display device is connected to the external device, the controller may extract identification information of the external device, obtain user information based on the identification information, and obtain health information of the user based on the user information.
  • the controller may detect a face region of the user from the captured image, obtain user information based on the detected face region, and obtain health information of the user based on the user information.
  • the communication unit may receive the reference image based on the health information.
  • the communication unit may receive a broadcast video and receive the reference video based on a type of content included in the broadcast video.
  • the communication unit may receive the reference image when the content included in the broadcast image is an advertisement content.
  • the first feature data may be feature data reflecting exercise prescription information for the user.
  • the exercise prescription information may be exercise prescription information for any one of aerobic, strength, and stretching exercises.
  • the comparison result may include information about a body part of the user for which the first feature data and the second feature data do not match.
  • To solve the technical problem, the image display method includes obtaining a captured image of the user; receiving a reference image including first feature data; outputting the captured image and the reference image; extracting second feature data from the captured image; comparing the first feature data and the second feature data; and displaying an indicator representing the comparison result superimposed on the captured image or the reference image.
  • the image display device compares the feature data of the reference image and the captured image to determine the accuracy of the user's motion and provides the user with information on that accuracy, enabling the user to perform the motion correctly. Accordingly, the user can easily and conveniently learn the motions included in the reference image and accurately monitor the result of performing them.
  • FIG. 1 is a view schematically showing a video display device system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating an image display device according to an exemplary embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a device capable of transmitting and receiving data with the image display device of FIG. 2.
  • FIG. 4 is a diagram illustrating an output screen of a captured image and a reference image according to embodiments of the present invention.
  • FIG. 5 is a diagram illustrating an output screen of a captured image and a reference image according to embodiments of the present invention.
  • FIG. 6 is a diagram illustrating a display screen of an indicator according to a first embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating an image display process according to the first embodiment of the present invention.
  • FIG. 8 is a diagram illustrating an output screen of guide information according to the second embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating an image display process according to a second embodiment of the present invention.
  • FIG. 10 is a diagram illustrating a display screen of a message according to a third embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating an image display process according to a third embodiment of the present invention.
  • FIG. 12 is a view showing a display screen of the exercise amount according to the fourth embodiment of the present invention.
  • FIG. 13 is a flowchart illustrating an image display process according to a fourth embodiment of the present invention.
  • FIG. 14 is a flowchart illustrating an image display process according to a fourth embodiment of the present invention.
  • FIG. 15 is a diagram illustrating an output screen of a notification message according to a fifth embodiment of the present invention.
  • FIG. 16 is a flowchart illustrating an image display process according to a fifth embodiment of the present invention.
  • FIG. 17 is a diagram illustrating an output screen of health information according to a sixth embodiment of the present invention.
  • FIG. 18 is a flowchart illustrating an image display process according to a sixth embodiment of the present invention.
  • FIG. 19 is a view showing a user authentication screen according to the seventh and eighth embodiments of the present invention.
  • FIG. 20 is a flowchart illustrating an image display process according to a seventh embodiment of the present invention.
  • FIG. 21 is a flowchart illustrating an image display process according to an eighth embodiment of the present invention.
  • FIG. 22 is a diagram illustrating a screen for checking whether an image is displayed according to the ninth embodiment of the present invention.
  • FIG. 23 is a flowchart illustrating an image display process according to a ninth embodiment of the present invention.
  • a professional such as a fitness trainer provides a schedule for exercise information, and the user manages the exercise by performing the provided schedule.
  • it is important to accurately determine whether the user fulfills the predetermined schedule.
  • it is important to accurately perform an operation included in an exercise according to a predetermined schedule.
  • the video display device described in this specification is an intelligent video display device in which computer support functions are added to the broadcast reception function; while remaining faithful to the broadcast reception function, it adds Internet functions and can be equipped with more convenient interfaces such as a handwriting input device, a touch screen, or a spatial remote control.
  • By being connected to the Internet and a computer with support for wired or wireless Internet functions, it can perform functions such as e-mail, web browsing, banking, or gaming. A standardized general-purpose operating system can be used for these various functions.
  • various applications can be freely added or deleted on the general-purpose OS kernel, and thus various user-friendly functions can be performed.
  • it may be a smart TV.
  • feature data described herein is data that reflects a feature of a user's motion or posture included in each image, and may mean data that is preset in the image or extracted from the image.
  • the feature data may include relative position information between one or more feature objects preset in order to compare a user's motion observed in the plurality of images.
  • the feature object may correspond to a body part of the user.
  • reference image described herein may refer to an image representing feature data reflecting an operation performed by a user.
  • the reference image may include a real image, a virtual image, or an image obtained by combining the real image and the virtual image.
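  • For illustration only (not part of the patent disclosure), the feature data described above could be represented as a set of named feature objects (body parts) together with the relative positions between them. A minimal Python sketch, with all names and coordinate conventions assumed:

```python
# Illustrative sketch only: "feature data" as positions of body-part feature
# objects plus relative positions between them (all names are assumptions).
from dataclasses import dataclass
from typing import Dict, Tuple

Vector = Tuple[float, float]  # (x, y) in normalized image coordinates

@dataclass
class FeatureData:
    # position of each feature object, e.g. "left_wrist", in one frame
    positions: Dict[str, Vector]

    def relative(self, a: str, b: str) -> Vector:
        """Relative position of feature object b with respect to feature object a."""
        ax, ay = self.positions[a]
        bx, by = self.positions[b]
        return (bx - ax, by - ay)
```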
  • FIG. 1 is a view schematically showing a video display device system according to an embodiment of the present invention.
  • When viewed in terms of providing a content service, the video display device system may be divided into a content provider (CP) 10, a service provider (SP) 20, a network provider (NP) 30, and a user 40.
  • the content provider 10 produces and provides various contents. As shown in FIG. 1, examples of the content provider 10 include a terrestrial broadcaster, a cable system operator or multiple system operator, a satellite broadcaster, and an Internet broadcaster. In addition, the content provider 10 may provide various applications and the like in addition to broadcast content.
  • the service provider 20 may provide a service package of contents provided by the content provider 10.
  • the service provider 20 of FIG. 1 may package and provide a first terrestrial broadcast, a second terrestrial broadcast, a cable MSO, satellite broadcast, various internet broadcasts, applications, and the like to a user.
  • the service provider 20 may provide a service to the user 40 using a unicast or multicast scheme.
  • the unicast method is a method of transmitting data 1:1 between one sender and one receiver.
  • For example, when a receiver requests data from a server, the server may transmit the data to that receiver according to the request.
  • Multicasting is a method of transmitting data to a plurality of recipients of a specific group.
  • the server can send data to multiple pre-registered receivers at once.
  • For the multicast scheme, the Internet Group Management Protocol (IGMP) may be used, as sketched below.
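  • As a hedged illustration of the multicast delivery mentioned above (not taken from the patent), a receiver can join a multicast group so that the operating system issues the IGMP membership report; the group address and port below are placeholders:

```python
# Minimal multicast-receiver sketch; 239.0.0.1 and 5004 are illustrative only.
import socket
import struct

GROUP, PORT = "239.0.0.1", 5004  # hypothetical group address and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Joining the group makes the OS send an IGMP membership report upstream.
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, sender = sock.recvfrom(2048)  # receive one multicast datagram
```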
  • the network provider 30 may provide a network for providing a service to the user 40.
  • the user 40 may build a home network and, as a home network end user (HNED), be provided with the service.
  • In providing the service to the user 40, conditional access or content protection may be used.
  • a scheme such as a cable card or a downloadable conditional access system (DCAS) may be used.
  • the user 40 can also provide content through the network.
  • the user 40 may be a content provider, and the content provider 10 may receive content from the user 40. Accordingly, an interactive content service or a data service may be possible.
  • FIG. 2 is a block diagram illustrating an image display device according to an exemplary embodiment of the present invention.
  • the video display device 100 may include a broadcast receiver 105, an external device interface 135, a storage 140, a user input interface 150, a controller 170, a display 180, an audio output unit 185, and a power supply unit 190.
  • the broadcast receiving unit 105 may include a tuner 110, a demodulator 120, and a network interface unit 130. Among them, the tuner 110 and the demodulator 120 may be provided selectively with respect to the network interface unit 130.
  • the tuner 110 selects an RF broadcast signal corresponding to a channel selected by a user or all pre-stored channels among RF (Radio Frequency) broadcast signals received through an antenna.
  • the selected RF broadcast signal is converted into an intermediate frequency signal or a baseband video or audio signal.
  • If the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF); if it is an analog broadcast signal, it is converted into an analog baseband video or audio signal (CVBS/SIF). That is, the tuner 110 may process a digital broadcast signal or an analog broadcast signal.
  • the analog baseband video or audio signal CVBS / SIF output from the tuner 110 may be directly input to the controller 170.
  • the tuner 110 may receive an RF broadcast signal of a single carrier according to an Advanced Television System Committee (ATSC) scheme or an RF broadcast signal of multiple carriers according to a digital video broadcasting (DVB) scheme.
  • Meanwhile, the tuner 110 may sequentially select the RF broadcast signals of all broadcast channels stored through the channel memory function from among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband video or audio signals.
  • the demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.
  • For example, when the digital IF signal output from the tuner 110 is of the ATSC scheme, the demodulator 120 performs 8-VSB (8-Vestigial Side Band) demodulation. In addition, the demodulator 120 may perform channel decoding. To this end, the demodulator 120 may include a trellis decoder, a de-interleaver, a Reed-Solomon decoder, and the like, to perform trellis decoding, de-interleaving, and Reed-Solomon decoding.
  • For example, when the digital IF signal output from the tuner 110 is of the DVB scheme, the demodulator 120 performs COFDM (Coded Orthogonal Frequency Division Multiplexing) demodulation. In addition, the demodulator 120 may perform channel decoding. To this end, the demodulator 120 may include a convolution decoder, a de-interleaver, a Reed-Solomon decoder, and the like, to perform convolutional decoding, de-interleaving, and Reed-Solomon decoding.
  • the demodulator 120 may output a stream signal TS after performing demodulation and channel decoding.
  • the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.
  • the stream signal may be an MPEG-2 Transport Stream (TS) multiplexed with an MPEG-2 standard video signal, a Dolby AC-3 standard audio signal, and the like.
  • the MPEG-2 TS may include a header of 4 bytes and a payload of 184 bytes.
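  • To make the 4-byte header / 184-byte payload structure concrete (an illustration, not part of the patent text), a minimal Python sketch that parses one 188-byte MPEG-2 TS packet:

```python
# Illustrative parser for the 4-byte header of a single MPEG-2 TS packet.
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def parse_ts_header(packet: bytes) -> dict:
    if len(packet) != TS_PACKET_SIZE or packet[0] != SYNC_BYTE:
        raise ValueError("not a valid TS packet")
    return {
        "transport_error": bool(packet[1] & 0x80),
        "payload_unit_start": bool(packet[1] & 0x40),
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],   # 13-bit packet identifier
        "scrambling_control": (packet[3] >> 6) & 0x03,
        "adaptation_field_control": (packet[3] >> 4) & 0x03,
        "continuity_counter": packet[3] & 0x0F,
        "payload": packet[4:],                          # 184-byte payload
    }
```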
  • the demodulator 120 described above can be provided separately according to the ATSC system and the DVB system. That is, it can be provided with an ATSC demodulation part and a DVB demodulation part.
  • the stream signal output from the demodulator 120 may be input to the controller 170.
  • After performing demultiplexing, video/audio signal processing, and the like, the controller 170 outputs an image to the display 180 and outputs audio to the audio output unit 185.
  • the external device interface unit 135 may connect the external device to the image display device 100.
  • the external device interface unit 135 may be connected to an external device such as a digital versatile disk (DVD), a Blu-ray, a game device, a camera, a camcorder, a computer (laptop), or the like by wire or wireless.
  • the external device interface unit 135 transmits a video, audio, or data signal input from the outside through a connected external device to the controller 170 of the image display device 100.
  • In addition, the external device interface unit 135 may output a video, audio, or data signal processed by the controller 170 to the connected external device.
  • the external device interface unit 135 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).
  • the A/V input/output unit may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like, so that video and audio signals of an external device can be input to the video display device 100.
  • the wireless communication unit may perform near field communication with another electronic device.
  • For example, the image display device 100 may be networked with other electronic devices according to communication specifications such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Digital Living Network Alliance (DLNA).
  • the external device interface unit 135 may be connected through at least one of the various set top boxes and the various terminals described above, and perform input / output operations with the set top box.
  • the external device interface unit 135 may receive an application or a list of applications in a neighboring external device and transmit the received application or list to the controller 170 or the storage 140.
  • the network interface unit 130 provides an interface for connecting the video display device 100 to a wired / wireless network including an internet network.
  • the network interface unit 130 may include an Ethernet terminal for connecting to a wired network, and for connecting to a wireless network, communication standards such as WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) may be used.
  • the network interface unit 130 may access a predetermined web page through a network. That is, by accessing a predetermined web page through the network, it is possible to send or receive data with the server.
  • content or data provided by a content provider or a network operator may be received. That is, content such as a movie, an advertisement, a game, a VOD, a broadcast signal, and related information provided from a content provider or a network provider may be received through a network.
  • the network interface unit 130 may select and receive a desired application from among applications that are open to the public through the network.
  • the storage 140 may store a program for processing and controlling each signal in the controller 170, or may store a signal processed video, audio, or data signal.
  • the storage 140 may perform a function for temporarily storing an image, audio, or data signal input from the external device interface 135 or the network interface 130.
  • the storage 140 may store information on a predetermined broadcast channel through a channel storage function.
  • the storage 140 may store an application or a list of applications input from the external device interface 135 or the network interface 130.
  • the storage unit 140 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM (EEPROM, etc.).
  • the video display device 100 may provide a user with a content file (video file, still image file, music file, document file, application file, etc.) stored in the storage 140.
  • FIG 2 illustrates an embodiment in which the storage 140 is provided separately from the controller 170, but the scope of the present invention is not limited thereto.
  • the storage unit 140 may be included in the control unit 170.
  • the user input interface unit 150 transmits a signal input by the user to the controller 170 or transmits a signal from the controller 170 to the user.
  • For example, the user input interface unit 150 may receive and process a user input signal or control signal for power on/off, channel selection, screen setting, and the like from the remote controller 200 according to various communication methods such as radio frequency (RF) communication and infrared (IR) communication, or may transmit a control signal from the controller 170 to the remote controller 200.
  • In addition, the user input interface unit 150 may transmit, to the controller 170, a user input signal or control signal input from a local key (not shown) such as a power key, a channel key, a volume key, or a setting key.
  • In addition, the user input interface unit 150 may transmit a user input signal or control signal input from a sensing unit (not shown) that senses the user's gesture to the controller 170, or may transmit a signal from the controller 170 to the sensing unit (not shown).
  • the sensing unit may include a touch sensor, a voice sensor, a position sensor, an operation sensor, and the like.
  • the controller 170 may demultiplex a stream input through the tuner 110, the demodulator 120, or the external device interface unit 135, process the demultiplexed signals, and generate and output signals for video or audio output.
  • the image signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the image signal.
  • the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 135.
  • the audio signal processed by the controller 170 may be audio output to the audio output unit 185.
  • the voice signal processed by the controller 170 may be input to the external output device through the external device interface unit 135.
  • the controller 170 may include a demultiplexer for demultiplexing an input stream and an image processor for performing image processing of the demultiplexed video signal.
  • the image processor may include a video decoder for decoding the demultiplexed video signal, a scaler for scaling the resolution of the decoded video signal on the display 180, and the like.
  • controller 170 may control overall operations of the image display device 100.
  • For example, the controller 170 may control the tuner 110 to tune to an RF broadcast corresponding to a channel selected by the user or a pre-stored channel.
  • controller 170 may control the image display device 100 by a user command or an internal program input through the user input interface unit 150.
  • the user may access the network to download the desired application or application list into the video display device 100.
  • the controller 170 controls the tuner 110 to input a signal of a selected channel according to a predetermined channel selection command received through the user input interface unit 150. Then, the video, audio, or data signal of the selected channel is processed. The controller 170 may output the channel information selected by the user together with the processed video or audio signal through the display 180 or the audio output unit 185.
  • In addition, the controller 170 may, according to an external device image playback command received through the user input interface unit 150, output a video or audio signal input from an external device, for example a camera or a camcorder, through the external device interface unit 135 to the display 180 or the audio output unit 185.
  • the controller 170 may control the display 180 to display an image.
  • the display 180 displays a broadcast image input through the tuner 110, an external input image input through the external device interface unit 135, or an image input through the network interface unit, or an image stored in the storage unit 140.
  • the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.
  • the controller 170 may control to display a list of applications or applications that can be downloaded from the video display device 100 or from an external network.
  • the controller 170 may control to install and run an application downloaded from an external network along with various user interfaces. In addition, by selecting a user, an image related to an executed application may be controlled to be displayed on the display 180.
  • the channel browsing processor may receive the stream signal TS output from the demodulator 120 or the stream signal output from the external device interface 135, extract an image from the input stream signal, and generate a thumbnail image.
  • the generated thumbnail image may be input as it is or encoded to the controller 170.
  • the generated thumbnail image may be encoded in a stream form and input to the controller 170.
  • the controller 170 may display a thumbnail list including a plurality of thumbnail images on the display 180 by using the input thumbnail image. Meanwhile, the thumbnail images in the thumbnail list may be updated sequentially or simultaneously. Accordingly, the user can easily grasp the contents of the plurality of broadcast channels.
  • the display 180 converts the video signal, data signal, and OSD signal processed by the controller 170, or the video signal and data signal received from the external device interface unit 135, into R, G, and B signals to generate a drive signal.
  • the display 180 may be a PDP, an LCD, an OLED, a flexible display, a 3D display, or the like.
  • the display 180 may be configured as a touch screen and used as an input device in addition to the output device.
  • the audio output unit 185 receives a signal processed by the controller 170, for example, a stereo signal, a 3.1 channel signal, or a 5.1 channel signal, and outputs a voice signal.
  • the voice output unit 185 may be implemented by various types of speakers.
  • a photographing unit (not shown) for photographing the user may be further provided.
  • the photographing unit (not shown) may be implemented by one camera, but is not limited thereto and may be implemented by a plurality of cameras.
  • the image information photographed by the photographing unit (not shown) is input to the controller 170.
  • a sensing unit (not shown) including at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may be further provided in the image display apparatus 100.
  • the signal detected by the sensing unit may be transmitted to the controller 170 through the user input interface unit 150.
  • the controller 170 may detect a gesture of the user based on an image captured by the photographing unit (not shown) or a signal detected by the sensing unit (not shown), either individually or in combination.
  • the power supply unit 190 supplies the corresponding power throughout the image display device 100.
  • power may be supplied to the controller 170, which may be implemented in the form of a System On Chip (SOC), a display 180 for displaying an image, and an audio output unit 185 for audio output.
  • the power supply unit 190 may include a converter (not shown) for converting AC power into DC power.
  • In addition, an inverter capable of PWM operation may be further provided for variable-luminance or dimming driving.
  • the remote control apparatus 200 transmits the user input to the user input interface unit 150.
  • the remote control device 200 may use Bluetooth, RF (Radio Frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee (ZigBee) method and the like.
  • the remote control apparatus 200 may receive an image, an audio or a data signal output from the user input interface unit 150, display it on the remote control apparatus 200 or output an audio or vibration.
  • the video display device 100 described above may be a fixed-type digital broadcast receiver capable of receiving at least one of ATSC (8-VSB) digital broadcasting, DVB-T (COFDM) digital broadcasting, ISDB-T (BST-OFDM) digital broadcasting, and the like.
  • Meanwhile, the image display device described herein may be an image display device from which the display 180 and the audio output unit 185 shown in FIG. 2 are excluded, and which transmits and receives data to and from a wireless-type display 180 and audio output unit 185 through wireless communication.
  • a block diagram of the image display apparatus 100 shown in FIG. 2 is a block diagram for an embodiment of the present invention.
  • Each component of the block diagram may be integrated, added, or omitted according to the specification of the image display device 100 that is actually implemented. That is, two or more components may be combined into one component as needed, or one component may be divided into two or more components.
  • the function performed in each block is for explaining an embodiment of the present invention, and its specific operation or device does not limit the scope of the present invention.
  • Unlike what is shown in FIG. 2, the image display device 100 may not include the tuner 110 and the demodulator 120 shown in FIG. 2, and may instead receive and play back video content through the network interface unit 130 or the external device interface unit 135.
  • the photographing unit may obtain a photographed image photographing the user.
  • the photographing unit may acquire a captured image of the user by processing an image frame, such as a still image or a moving image of the user, acquired by an image sensor such as a CCD.
  • In one embodiment, the image sensor may be provided inside the image display apparatus 100; in another embodiment, the image sensor may be provided outside the image display apparatus 100 and output an image frame of a still image or video of the user to the photographing unit.
  • the photographing unit may store the photographed image photographing the user in the storage 140 or output it to the controller 170.
  • the broadcast receiving unit 105 may receive a reference image including the first feature data.
  • the broadcast receiving unit 105 may receive a reference image from a broadcasting station or a network server under the control of the controller 170, and store the received reference image in the storage 140 or output the received reference image to the controller 170.
  • the controller 170 may extract second feature data from the captured image.
  • the controller 170 may extract the user region by removing noise, which is not related to the user, through edge extraction or pattern extraction from the captured image stored in the storage 140 or output from the photographing unit.
  • the controller 170 may extract second feature data, which is relative position information between feature objects corresponding to a user's body part, from the user region extracted from the captured image.
  • controller 170 may compare the first feature data and the second feature data.
  • the controller 170 may compare the relative position information between the feature objects included in the first feature data and the relative position information between the feature objects included in the second feature data.
  • the controller 170 may control the display 180 to display the indicator indicating the comparison result on the captured image or the reference image.
  • For example, the controller 170 may generate an indicator indicating a feature object, among the feature objects included in the second feature data, whose relative position information differs from that of the corresponding feature object included in the first feature data, and control the display 180 to display the generated indicator superimposed on the captured image.
  • Alternatively, the controller 170 may generate an indicator indicating a feature object, among the feature objects included in the first feature data, whose relative position information differs from that of the corresponding feature object included in the second feature data, and control the display 180 to display the generated indicator superimposed on the reference image.
  • the first feature data may be feature data reflecting exercise prescription information for the user.
  • the exercise prescription information may be exercise prescription information regarding any one of aerobic, muscular strength and stretching exercise.
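  • As an illustration of the comparison just described (not the patent's actual algorithm), the controller could flag the feature objects whose position relative to a chosen anchor differs between the first and second feature data beyond a tolerance; the anchor name and tolerance are assumptions, and FeatureData refers to the sketch given earlier:

```python
# Illustrative mismatch detection between reference (first) and captured
# (second) feature data; mismatched body parts would receive an indicator.
import math

def mismatched_objects(first: "FeatureData", second: "FeatureData",
                       anchor: str = "torso", tolerance: float = 0.05):
    mismatches = []
    for name in second.positions:
        if name == anchor or name not in first.positions:
            continue
        rx, ry = first.relative(anchor, name)   # relative position in reference
        cx, cy = second.relative(anchor, name)  # relative position in capture
        if math.hypot(rx - cx, ry - cy) > tolerance:
            mismatches.append(name)             # overlay an indicator here
    return mismatches
```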
  • FIG. 3 is a diagram illustrating a device capable of transmitting and receiving data with the image display device of FIG. 2.
  • the image display device 100 may communicate with the broadcasting station 210, the network server 220, or the external device 230.
  • the video display device 100 may receive a broadcast signal including a video signal transmitted from the broadcast station 210.
  • the video display device 100 may process a video signal, an audio signal, or a data signal included in a broadcast signal so as to be suitable for outputting from the video display device 100.
  • the image display device 100 may output an image or audio based on the processed image signal.
  • the image display apparatus 100 may communicate with the network server 220.
  • the network server 220 is a device capable of transmitting and receiving a signal with the image display device 100 through an arbitrary network 310.
  • the network server 220 may be a mobile phone terminal that may be connected to the image display device 100 through a wired or wireless base station.
  • the network server 220 may be a device capable of providing content to the image display device 100 through an Internet network.
  • the content provider may provide content to the video display device 100 using a network server.
  • the image display device 100 may communicate with the external device 230.
  • the external device 230 is a device capable of directly transmitting and receiving a signal with the image display device 100 by wire or wireless 320.
  • the external device 230 may be a media storage device or a playback device used by a user. That is, the external device 230 may be a camera, a DVD or a Blu-ray player, a personal computer, or the like.
  • the broadcasting station 210, the network server 220, or the external device 230 may transmit a signal including a video signal to the video display device 100.
  • the image display device 100 may display an image based on an image signal included in an input signal.
  • the image display device 100 may transmit a signal transmitted from the broadcasting station 210 or the network server 220 to the image display device 100 to the external device 230.
  • the signal transmitted from the external device 230 to the image display device 100 may be transmitted to the broadcasting station 210 or the network server 220. That is, the video display device 100 may transmit content included in a signal transmitted from the broadcasting station 210, the network server 220, and the external device 230 in addition to directly playing the content on the video display device 100.
  • FIG. 4 is a diagram illustrating an output screen of a captured image and a reference image according to embodiments of the present invention.
  • the display 180 may simultaneously display the captured image and the reference image.
  • the display 180 may display the captured image and the reference image in each of the divided regions of the screen.
  • the display 180 may display the captured image and the reference image by overlapping them.
  • the display 180 may divide a screen into two areas and display the screen.
  • the display 180 may display the reference image 410 in the first area of the two divided areas.
  • the display 180 may display the captured image 420 in the second area of the divided two areas.
  • the display 180 may superimpose and display a captured image and a reference image 430 on a screen.
  • the controller 170 may extract the user region from the captured image, and adjust the size of the captured image so that the size of the user region corresponds to the size of the user region included in the reference image.
  • the display 180 may superimpose and display the reference image on the adjusted image.
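  • A minimal sketch of the size adjustment described above, assuming OpenCV as the image-processing tool (the patent does not name one); the captured frame is scaled so that its user region matches the height of the user region in the reference image before the two are blended:

```python
# Illustrative scaling and overlay; user-region heights are assumed to be known
# from the user-region extraction step.
import cv2

def scale_to_reference(captured, captured_user_h, reference_user_h):
    factor = reference_user_h / float(captured_user_h)
    return cv2.resize(captured, None, fx=factor, fy=factor,
                      interpolation=cv2.INTER_LINEAR)

# Usage (illustrative; frames must share a size before blending):
# blended = cv2.addWeighted(reference_frame, 0.5, adjusted_capture, 0.5, 0)
```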
  • FIG. 5 is a diagram illustrating an output screen of a captured image and a reference image according to embodiments of the present invention.
  • the display 180 may display the captured image and the reference image in a picture in picture (PIP) manner.
  • the PIP method may mean a method in which one program (image) is displayed in full screen and another program (image) is displayed in an internal window inserted in the full screen.
  • the display 180 may display the reference image 510 in full screen. At the same time, the display 180 may display the captured image 520 in a window inserted into a partial region of the full screen.
  • the display 180 may display the captured image 530 in full screen.
  • the display 180 may display the reference image 540 in a window inserted into a portion of the full screen.
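  • A minimal sketch of the PIP composition described above (illustrative only; window size and position are assumptions): one image fills the screen and the other is inserted as a small window:

```python
# Illustrative PIP composition with OpenCV/NumPy frames.
import cv2

def compose_pip(full_frame, inset_frame, scale=0.25, margin=20):
    """Insert a scaled-down inset_frame into the bottom-right corner of full_frame."""
    out = full_frame.copy()
    h, w = out.shape[:2]
    iw, ih = int(w * scale), int(h * scale)
    small = cv2.resize(inset_frame, (iw, ih))
    out[h - ih - margin:h - margin, w - iw - margin:w - margin] = small
    return out
```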
  • FIG. 6 is a diagram illustrating a display screen of an indicator according to a first embodiment of the present invention.
  • the feature data is data reflecting characteristics of a user's motion or posture included in each image, and may be preset in the image or extracted from the image.
  • the feature data may include relative position information between each feature object corresponding to a predetermined body part.
  • the feature data may further include relative position information between the same feature objects across two or more temporally consecutive frames.
  • the controller 170 may generate an indicator indicating a feature object, among the feature objects included in the second feature data, whose relative position information does not coincide with that of the corresponding feature object included in the first feature data, and control the display 180 to display the generated indicator superimposed on the captured image.
  • the display 180 may divide and display a screen into two regions.
  • the display 180 may display the reference images 610 and 630 in the first area among the two divided areas.
  • the display 180 may display the captured images 620 and 640 in the second area of the divided two areas.
  • the display 180 may superimpose the indicator 622 generated by the controller 170 on the captured image 620.
  • the display 180 may display the indicator 622 so as to correspond to the position of the feature object included in the second feature data whose position information relative to the feature object included in the first feature data is different.
  • the display 180 may display the indicator 632 generated by the controller 170 overlapping the reference image 630.
  • the display 180 may display the indicator 632 so as to correspond to the position of the feature object included in the first feature data having different positional information relative to the feature object included in the second feature data.
  • FIG. 7 is a flowchart illustrating an image display process according to the first embodiment of the present invention.
  • the photographing unit may acquire a photographed image photographing the user (S200).
  • the photographing unit may store the photographed image photographing the user in the storage 140 or output it to the controller 170.
  • the broadcast receiving unit 105 may receive a reference image including the first feature data (S300).
  • the broadcast receiving unit 105 may receive a reference image from the broadcasting station 210 or the network server 220 under the control of the controller 170, and store the received reference image in the storage unit 140 or output it to the controller 170.
  • the display 180 and the audio output unit 185 may output a captured image and a reference image.
  • the controller 170 may extract the second feature data from the captured image (S400).
  • the controller 170 may extract the user region by removing noise, which is not related to the user, through edge extraction or pattern extraction from the captured image stored in the storage 140 or output from the photographing unit.
  • the controller 170 may extract second feature data including relative position information between feature objects corresponding to a body part of the user, from the user region extracted from the captured image.
  • controller 170 may compare the first feature data and the second feature data (S500).
  • the controller 170 may compare the relative position information between the feature objects included in the first feature data and the relative position information between the feature objects included in the second feature data.
  • the controller 170 may control the display 180 to display an indicator representing the comparison result superimposed on the captured image or the reference image (S600). For example, the controller 170 may generate an indicator indicating a feature object, among the feature objects included in the second feature data, whose relative position information does not match that of the corresponding feature object included in the first feature data, and control the display 180 to display the generated indicator superimposed on the captured image.
  • the controller 170 may control the network interface 130 or the external device interface 135 to transmit the comparison result to the external device 230.
  • FIG. 8 is a diagram illustrating an output screen of guide information according to the second embodiment of the present invention.
  • the display 180 may divide a screen into two regions and display the screen.
  • the display 180 may display the reference image 710 in the first region of the divided two regions.
  • the display 180 may display the captured image 720 in the second area of the divided two areas.
  • the display 180 may superimpose and display the guide information 724 and 726 generated by the controller 170 on the captured image 720.
  • the display 180 may display the guide information 724 and 726 at the position of a feature object included in the second feature data whose relative position information differs from that of the corresponding feature object included in the first feature data.
  • the guide information 724 may reflect the moving direction from the feature object included in the second feature data toward the position of the corresponding feature object included in the first feature data; the moving direction may be indicated by, for example, the direction of an arrow.
  • the guide information 724 may also reflect the moving distance between the feature object included in the second feature data and the position of the corresponding feature object included in the first feature data; the moving distance may be indicated by, for example, the length of the arrow.
  • the guide information 726 may include an image of a feature object included in the first feature data.
  • the guide information 730 may further include text information regarding a name and a moving direction (and / or a moving distance) of a body part representing a feature object included in the second feature data.
  • the display 180 may display the guide information 730 on the screen, and the audio output unit 185 may output the guide information 730 as a voice.
  • FIG. 9 is a flowchart illustrating an image display process according to a second embodiment of the present invention.
  • the controller 170 may generate guide information based on a comparison result of the first feature data and the second feature data (S710).
  • For example, the controller 170 may identify, among the feature objects included in the second feature data, a feature object whose relative position information does not match that of the corresponding feature object included in the first feature data. The controller 170 may then generate guide information for moving the identified feature object toward the corresponding feature object, based on the difference (including direction or distance) between the position of the identified feature object and that of the corresponding feature object in the first feature data.
  • the guide information may include information about a direction or distance for moving the body part of the user corresponding to the feature object to the position of the corresponding body part included in the feature data of the reference image.
  • the guide information may include information for correcting the posture of the user included in the captured image to the posture of the user included in the reference image.
  • the display 180 or the audio output unit 185 may output the guide information generated by the controller 170 (S720).
  • the display 180 may visually output the superimposed guide information on the captured image.
  • the audio output unit 185 may acoustically output the guide information.
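  • As a hedged sketch of how the guide information above could be derived (not the patent's implementation; FeatureData and the anchor name are the assumptions used in the earlier sketches), the moving direction and distance for a mismatched body part follow from the difference between its reference and captured relative positions:

```python
# Illustrative guide-information computation: arrow direction and length.
import math

def guide_for(first: "FeatureData", second: "FeatureData",
              name: str, anchor: str = "torso"):
    rx, ry = first.relative(anchor, name)    # where the body part should be
    cx, cy = second.relative(anchor, name)   # where it currently is
    dx, dy = rx - cx, ry - cy
    distance = math.hypot(dx, dy)                      # arrow length
    direction_deg = math.degrees(math.atan2(dy, dx))   # arrow orientation
    return direction_deg, distance
```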
  • FIG. 10 is a diagram illustrating a display screen of a message according to a third embodiment of the present invention.
  • the display 180 may divide a screen into two regions and display the screen.
  • the display 180 may display the reference image 810 in the first region of the divided two regions.
  • the display 180 may display the captured image 820 in the second area of the divided two areas.
  • the controller 170 may generate a message 830 indicating that the operation does not match.
  • the display 180 may display the message 830 on the screen, and the audio output unit 185 may output the message 830 as a voice.
  • the controller 170 may generate a message 840 indicating that the operation matches.
  • the display 180 may display the message 840 on the screen, and the audio output unit 185 may output the message 840 as a voice.
  • FIG. 11 is a flowchart illustrating an image display process according to a third embodiment of the present invention.
  • the controller 170 may determine whether the feature data match by comparing the first feature data and the second feature data (S810). The controller 170 may compare the relative position information between the feature objects included in the first feature data with the relative position information between the feature objects included in the second feature data, and determine whether the feature data match based on whether the relative position information matches.
  • When the first feature data and the second feature data do not match in operation S810, the controller 170 generates a message indicating that the motions do not match, and may control the display 180 or the audio output unit 185 to output the generated message (S820).
  • When the first feature data and the second feature data match in operation S810, the controller 170 generates a message indicating that the motions match, and may control the display 180 or the audio output unit 185 to output the generated message (S830).
  • FIG. 12 is a view showing a display screen of the exercise amount according to the fourth embodiment of the present invention.
  • the display 180 may divide a screen into two regions and display the screen.
  • the display 180 may display the reference image 910 in the first area of the two divided areas.
  • the display 180 may display the captured image 920 in the second area of the divided two areas.
  • the controller 170 may measure the amount of exercise based on the feature data included in the captured image 920 or the feature data included in the reference image 910.
  • the progressed exercise amount may mean the calories consumed by the exercise from when the user starts exercising until the exercise amount is measured.
  • the controller 170 may control the display 180 or the audio output unit 185 to output the progressed exercise amount, or may control the network interface unit 130 or the external device interface unit 135 to transmit the progressed exercise amount to the external device 230.
  • the storage unit 140 may store the target exercise amount, and the controller 170 may control the display 180 or the audio output unit 185 to output the target exercise amount.
  • the target exercise amount may mean a calorie amount that the user should consume for a predetermined period of time.
  • the controller 170 measures the progressed exercise amount, the display 180 may display information 930 about the progressed exercise amount on the screen, and the audio output unit 185 may output the information 930 as a voice.
  • the controller 170 measures the progressed exercise amount, and the storage 140 stores the target exercise amount.
  • the display 180 may display information 940 about the progressed exercise amount and the target exercise amount on the screen, and the audio output unit 185 may output the information 940 as a voice.
  • FIG. 13 is a flowchart illustrating an image display process according to a fourth embodiment of the present invention.
  • the controller 170 may measure the progressed exercise amount based on the feature data included in the captured image or the feature data included in the reference image (S910).
  • the feature data included in the reference image may include the type of exercise, and the controller 170 may determine the type of exercise from the feature data included in the captured image.
  • the controller 170 may measure, based on the type of exercise, the amount of calories consumed by the exercise performed since a starting point, for example, since the reference image was received or since the captured image was obtained.
  • the storage 140 may store information about the weight of the user, and the controller 170 may measure the progressed exercise amount based on the information about the weight of the user; a sketch of such a calculation follows below.
  • the controller 170 may control the display 180 or the audio output unit 185 to output information about the progressed exercise amount, or control the network interface unit 130 or the external device interface unit 135 to transmit the information about the progressed exercise amount to the external device 230 (S920).
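  • A minimal sketch of the calorie estimate referred to above. The MET-based formula and the MET values are generic fitness figures used here as assumptions; the patent does not specify a formula:

```python
# Illustrative calorie estimate from exercise type, user weight, and elapsed time.
MET_BY_EXERCISE = {"stretching": 2.3, "aerobic": 7.0, "strength": 5.0}  # assumed values

def calories_burned(exercise_type: str, weight_kg: float, minutes: float) -> float:
    """kcal ~= MET * weight (kg) * time (hours)."""
    met = MET_BY_EXERCISE.get(exercise_type, 3.0)
    return met * weight_kg * (minutes / 60.0)

# Example: a 70 kg user doing 30 minutes of aerobic exercise -> about 245 kcal.
```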
  • FIG. 14 is a flowchart illustrating an image display process according to a fourth embodiment of the present invention.
  • the controller 170 may measure the progressed exercise amount based on the feature data included in the captured image or the feature data included in the reference image, and may obtain the target exercise amount of the user from the storage 140 (S1010). In addition, the controller 170 may control the display 180 or the audio output unit 185 to output information about the progressed exercise amount and the target exercise amount, or control the network interface unit 130 or the external device interface unit 135 to transmit the information about the progressed exercise amount and the target exercise amount to the external device 230 (S1020). In this case, the display 180 may display the information about the progressed exercise amount and the target exercise amount together on the screen.
  • FIG. 15 is a diagram illustrating an output screen of a notification message according to a fifth embodiment of the present invention.
  • The display 180 may divide the screen into two areas and display them.
  • The display 180 may display the reference image 1010 in the first of the two divided areas.
  • The display 180 may display the captured image 1020 in the second of the two divided areas.
  • The controller 170 generates a notification message when the progress exercise amount reaches the target exercise amount, and controls the display 180 or the audio output unit 185 to output the notification message, or controls the network interface unit 130 or the external device interface unit 135 to transmit the notification message to the external device.
  • The controller 170 compares the measured progress exercise amount with the target exercise amount stored in the storage 140, and when the progress exercise amount reaches the target exercise amount, the controller 170 may generate a notification message 1030 indicating that the goal has been reached.
  • The display 180 may display the notification message 1030 on the screen, and the audio output unit 185 may output the notification message 1030 as a voice.
  • FIG. 16 is a flowchart illustrating an image display process according to a fifth embodiment of the present invention.
  • The controller 170 may measure the progress exercise amount based on the feature data included in the captured image or the feature data included in the reference image, and may obtain the target exercise amount of the user from the storage 140. In addition, the controller 170 may determine whether the progress exercise amount reaches the target exercise amount by comparing the measured progress exercise amount with the target exercise amount (S1110).
  • When the progress exercise amount reaches the target exercise amount in step S1120, for example, when the progress exercise amount is greater than or equal to the target exercise amount, the controller 170 generates a notification message indicating that the progress exercise amount has reached the target exercise amount, and may control the display 180 to display the notification message on the screen or control the audio output unit 185 to output the notification message as a voice.
  • The controller 170 may control the network interface unit 130 or the external device interface unit 135 to transmit the notification message to the external device.
  • When the progress exercise amount does not reach the target exercise amount in step S1120, for example, when the progress exercise amount is smaller than the target exercise amount, the controller 170 may re-measure the progress exercise amount and compare the re-measured progress exercise amount with the target exercise amount (this comparison loop is sketched below).
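  • As a minimal sketch of the comparison loop in steps S1110 and S1120 (illustrative only), the routine below re-measures the progress exercise amount until it reaches the stored target and then emits a notification; measure_progress_kcal and notify stand in for the controller's measurement and output paths and are assumptions.

```python
# Illustrative sketch only: re-measure until the target exercise amount is reached,
# then emit a notification message (corresponding to message 1030).

import time


def monitor_target(measure_progress_kcal, target_kcal, notify, poll_seconds=5.0):
    while True:
        progress = measure_progress_kcal()        # S1110: measure / re-measure
        if progress >= target_kcal:               # S1120: target reached?
            notify(f"Goal reached: {progress:.0f} of {target_kcal:.0f} kcal burned")
            return progress
        time.sleep(poll_seconds)                  # not yet: keep measuring
```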
  • FIG. 17 is a diagram illustrating an output screen of health information according to a sixth embodiment of the present invention.
  • The broadcast receiver 105 may acquire a broadcast image, and the display 180 may output the broadcast image.
  • The broadcast image may include content 1110, and the controller 170 may obtain health information related to the content, for example, information about calories, from the metadata of the content 1110.
  • The display 180 may display the health information 1120 related to the content on the screen.
  • The health information 1120 related to the content may further include dietary information of the user, for example, a target intake amount, a daily intake amount, and the like, and may further include information about a recommended exercise program according to the content.
  • The health information 1120 related to the content may further include an object 1122 configured to call a function of receiving a reference image reflecting the recommended exercise program. When the object 1122 is selected, the captured image and the reference image are received, and the user may perform an operation according to the reference image reflecting the recommended exercise program.
  • Similarly, the broadcast receiver 105 may acquire a broadcast image, and the display 180 may output the broadcast image.
  • The broadcast image may include content 1110, and the controller 170 may obtain health information related to the content, for example, information about calories, from the metadata of the content 1110.
  • The display 180 may display the health information 1130 related to the content on the screen.
  • The health information 1130 related to the content may further include exercise information of the user, for example, a target exercise amount, a daily exercise amount, and the like, and may further include information about a recommended exercise program according to the content.
  • The health information 1130 related to the content may further include an object 1132 configured to call a function of receiving a reference image reflecting the recommended exercise program. When the object 1132 is selected, the captured image and the reference image are received, and the user may perform an operation according to the reference image reflecting the recommended exercise program.
  • FIG. 18 is a flowchart illustrating an image display process according to a sixth embodiment of the present invention.
  • The broadcast receiving unit 105 may receive a broadcast image (S112).
  • The display 180 and the audio output unit 185 may output the received broadcast image.
  • The controller 170 may obtain health information related to the content included in the broadcast image based on the metadata included in the broadcast image, and may control the display 180 to output the obtained health information (S114).
  • The network interface unit 130 may transmit the health information to the network server 220 or the external device 230 (S116).
  • The broadcast receiving unit 105 may receive a reference image including the first feature data based on the health information from the network server 220 (S300).
  • The first feature data may reflect exercise prescription information generated according to the health information (an illustrative metadata-to-request sketch is given below).
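  • The sketch below is an assumed illustration of steps S114 to S300: calorie-related health information is read from the content metadata and used to request a reference image whose first feature data reflect an exercise prescription. The metadata keys and the request_reference_image callback are hypothetical, not taken from the disclosure.

```python
# Illustrative sketch only: metadata -> health information -> reference-image request.
# Metadata keys and the request callback are hypothetical.

def extract_health_info(content_metadata: dict) -> dict:
    return {
        "calories_kcal": content_metadata.get("calories_kcal"),        # e.g. calories of a food item
        "recommended_program": content_metadata.get("recommended_program"),
    }


def fetch_reference_image(content_metadata: dict, request_reference_image):
    health_info = extract_health_info(content_metadata)                 # S114
    return request_reference_image(                                     # S300
        program=health_info["recommended_program"],
        target_kcal=health_info["calories_kcal"],
    )
```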
  • FIG. 19 is a view showing a user authentication screen according to the seventh and eighth embodiments of the present invention.
  • The display 180 may display user information 1210.
  • The user information 1210 may include identification information of the external device 230, identification information of the user, and health information of the user.
  • The health information of the user may include the user's goal, daily intake amount, exercise amount, and the like.
  • The display 180 may display user information 1220 and 1230.
  • The user information 1220 may include an image (a virtual or real image) identifying the user.
  • The user information 1230 may include identification information of the user and health information of the user.
  • The health information of the user may include the user's goal, daily intake amount, exercise amount, and the like.
  • FIG. 20 is a flowchart illustrating an image display process according to a seventh embodiment of the present invention.
  • The network interface unit 130 or the external device interface unit 135 may monitor the connection with the external device 230 (S122).
  • The network interface unit 130 may monitor the connection with the external device 230 through Wi-Fi, Bluetooth, ZigBee, or the like.
  • The external device interface unit 135 may monitor the connection with the external device 230 through USB, FireWire, or the like.
  • The controller 170 may obtain identification information of the connected external device 230 from the network interface unit 130 or the external device interface unit 135 (S126).
  • The identification information of the external device 230 may include a device ID, a MAC address, or an IP address.
  • The controller 170 may identify the user based on the identification information of the external device 230 (S128).
  • The storage 140 may store health information including identification information of the external device, identification information of the user, dietary information of the user, exercise information, and the like.
  • The controller 170 may check the identification information of the user and the health information of the user corresponding to the identification information of the external device 230 (an illustrative lookup is sketched below).
  • The broadcast receiving unit 105 may receive a reference image including the first feature data based on the health information from the network server 220 (S300).
  • The first feature data may reflect exercise prescription information generated according to the health information.
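  • For steps S126 to S128, a minimal lookup sketch (illustrative only) is given below: the identification information of the connected external device is used as a key into stored health profiles. The table contents and field names are assumptions standing in for the storage 140.

```python
# Illustrative sketch only: identify the user from a connected device's identifier.

PROFILES_BY_DEVICE = {        # assumed contents of the storage unit 140
    "AA:BB:CC:DD:EE:FF": {"user_id": "user_1", "target_kcal": 300, "daily_intake_kcal": 2100},
}


def identify_user_by_device(device_identifier: str):
    profile = PROFILES_BY_DEVICE.get(device_identifier)   # S126: device ID / MAC / IP as key
    if profile is None:
        return None                                       # unknown device: use another method
    return profile["user_id"], profile                    # S128: user and health info found
```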
  • FIG. 21 is a flowchart illustrating an image display process according to an eighth embodiment of the present invention.
  • The photographing unit may acquire a captured image of the user (S200).
  • The controller 170 may detect a face region of the user from the captured image through edge extraction or pattern extraction (S210).
  • The controller 170 may identify the user based on the detected face region (S220).
  • The storage 140 may store health information including information on the face region of the user, identification information of the user, dietary information of the user, exercise information, and the like.
  • The controller 170 may check the identification information of the corresponding user and the health information of the user by comparing the detected face region with the edges or patterns extracted from the face region of the user stored in the storage 140 (an illustrative detection-and-matching sketch is given below).
  • The broadcast receiving unit 105 may receive a reference image including the first feature data based on the health information from the network server 220 (S300).
  • The first feature data may reflect exercise prescription information generated according to the health information.
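  • The sketch below (an assumption using OpenCV, not the disclosed algorithm) illustrates steps S210 to S220: a face region is detected in the captured image and matched against stored face data. A grayscale-histogram comparison is used here as a crude stand-in for the edge or pattern comparison described above; enrolled_faces and the threshold are hypothetical.

```python
# Illustrative sketch only: face-region detection and a simple user match.

import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def detect_face_region(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                      # S210: first detected face region
    return gray[y:y + h, x:x + w]


def identify_user_by_face(face_region, enrolled_faces: dict, threshold: float = 0.7):
    hist = cv2.calcHist([face_region], [0], None, [64], [0, 256])
    cv2.normalize(hist, hist)
    for user_id, stored_hist in enrolled_faces.items():   # data kept in the storage 140
        score = cv2.compareHist(hist, stored_hist, cv2.HISTCMP_CORREL)
        if score >= threshold:
            return user_id                     # S220: user identified
    return None
```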
  • FIG. 22 is a diagram illustrating a screen for checking whether exercise is performed, according to the ninth embodiment of the present invention.
  • The display 180 may display content 1300 included in a broadcast image.
  • The display 180 may display an object 1310 for checking whether exercise is performed.
  • The object 1310 for checking whether exercise is performed may include an object 1312 configured to call a function of receiving a reference image reflecting exercise prescription information included in the exercise schedule of the user.
  • When the object 1312 is selected, the captured image and the reference image are received, and the user may perform an operation according to the reference image reflecting the exercise prescription information included in the exercise schedule of the user.
  • FIG. 23 is a flowchart illustrating an image display process according to a ninth embodiment of the present invention.
  • The broadcast receiving unit 105 receives a broadcast image (S132), the controller 170 outputs the received broadcast image through the display 180 and the audio output unit 185, and the controller 170 determines the type of the content being output through the display 180 and the audio output unit 185 based on the metadata included in the received broadcast image.
  • The controller 170 may monitor whether the type of the content being output through the display 180 and the audio output unit 185 is advertisement content (S134), as sketched below.
  • The photographing unit acquires a captured image of the user in operation S200. Since the description of steps S300 to S600 may be the same as the description of steps S300 to S600 of FIG. 7, it is omitted here.
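  • A minimal sketch of step S134 follows (illustrative only): the content type from the broadcast metadata is checked, and when it indicates advertisement content the user is photographed (S200) and the flow described for FIG. 7 is started. The "genre" key and the callbacks are assumptions.

```python
# Illustrative sketch only: trigger the capture/comparison flow during advertisements.

def on_broadcast_metadata(metadata: dict, capture_user_image, start_exercise_flow):
    if metadata.get("genre") == "advertisement":     # S134: advertisement content?
        captured_image = capture_user_image()        # S200: photograph the user
        start_exercise_flow(captured_image)          # S300-S600: as described for FIG. 7
```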
  • The mobile terminal described herein may be any terminal capable of transmitting and receiving text messages, such as a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a portable multimedia player (PMP), a navigation device, and the like.
  • The image display apparatus and the image display method described herein are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.
  • The image display method of the image display device described herein can be implemented as processor-readable code on a processor-readable recording medium provided in the image display device.
  • The processor-readable recording medium includes all kinds of recording devices that store data readable by a processor. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet.
  • The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Disclosed is an image display device which compares feature data included in a reference image with feature data extracted from a captured image of a user to determine whether the feature data match and provides the determination result to the user, thereby enabling systematic exercise management by monitoring whether the user conforms to the feature data included in the reference image and by guiding the user to perform accurate movements according to the reference image. A corresponding image display method is also disclosed. To this end, the image display device according to one embodiment of the invention comprises: a photographing unit for obtaining the captured image of a user; a communication unit for receiving the reference image containing first feature data; an output unit for outputting the captured image and the reference image; and a controller for extracting second feature data from the captured image, comparing the first and second feature data, and controlling the output unit so as to overlay and display an indicator showing the comparison result on the captured image or the reference image.
PCT/KR2010/006733 2010-10-01 2010-10-01 Dispositif d'affichage d'image et procédé d'affichage d'image correspondant WO2012043910A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2010/006733 WO2012043910A1 (fr) 2010-10-01 2010-10-01 Dispositif d'affichage d'image et procédé d'affichage d'image correspondant

Publications (1)

Publication Number Publication Date
WO2012043910A1 true WO2012043910A1 (fr) 2012-04-05

Family

ID=45893332

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2010/006733 WO2012043910A1 (fr) 2010-10-01 2010-10-01 Dispositif d'affichage d'image et procédé d'affichage d'image correspondant

Country Status (1)

Country Link
WO (1) WO2012043910A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19990086247A (ko) * 1998-05-27 1999-12-15 박권상 스포츠 영상효과 처리장치
JP2004506996A (ja) * 2000-08-22 2004-03-04 バーチャルメディア カンパニー リミテッド 顔映像の形態情報に基づく合成顔映像の生成装置およびその方法
KR20100079356A (ko) * 2008-12-31 2010-07-08 갤럭시아커뮤니케이션즈 주식회사 선택적 참조영상을 이용한 움직임 보상기법을 적용한 동영상 압축부호화장치및 복호화 장치와 움직임 보상을 위한 선택적 참조영상 결정방법

Similar Documents

Publication Publication Date Title
WO2012070812A2 (fr) Procédé de commande utilisant la voix et les gestes dans un dispositif multimédia et dispositif multimédia correspondant
WO2015030347A1 (fr) Appareil d'affichage d'image et procédé de fonctionnement associé
WO2011028073A2 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2011071285A2 (fr) Appareil d'affichage d'image et son procédé d'exploitation
WO2011074794A2 (fr) Appareil d'affichage d'image et procédé permettant de faire fonctionner ledit appareil d'affichage d'image
WO2012093767A2 (fr) Procédé de fourniture d'un service de commande à distance et appareil d'affichage d'image associé
WO2012005421A1 (fr) Procédé pour une extension d'application et appareil d'affichage d'image associé
WO2012074189A1 (fr) Procédé de commande d'affichage sur écran et dispositif d'affichage d'image l'utilisant
EP2377310A1 (fr) Appareil servant à traiter des images et procédé associé
WO2010151027A4 (fr) Dispositif d'affichage vidéo et méthode pour le faire fonctionner
WO2012070742A1 (fr) Procédé d'installation d'applications, et dispositif d'affichage d'images utilisant celui-ci
WO2012053764A2 (fr) Procédé permettant de déplacer un pointeur dans un appareil d'affichage vidéo et appareil d'affichage vidéo associé
WO2019135433A1 (fr) Dispositif d'affichage et système comprenant ce dernier
WO2016129840A1 (fr) Appareil d'affichage et son procédé de fourniture d'informations
WO2019137016A1 (fr) Procédé de recommandation d'émissions de télévision, dispositif, et support de stockage lisible par ordinateur
EP3756086A1 (fr) Dispositif d'affichage et son procédé de fonctionnement
WO2011152644A2 (fr) Procédé de fourniture d'interface utilisateur et système utilisant le procédé
WO2020171657A1 (fr) Dispositif d'affichage et procédé d'affichage d'image associé
WO2012020945A2 (fr) Procédé de saisie de données sur un dispositif d'affichage d'image, et dispositif d'affichage d'image associé
WO2015115850A1 (fr) Appareil de réception de diffusion
WO2012150731A1 (fr) Commande d'objet à l'aide d'un procédé d'entrée hétérogène
WO2012043910A1 (fr) Dispositif d'affichage d'image et procédé d'affichage d'image correspondant
WO2016200078A1 (fr) Procédé et dispositif permettant de partager un contenu multimédia
WO2016076541A1 (fr) Dispositif électronique et son procédé de fonctionnement
WO2019182255A1 (fr) Dispositif d'affichage vidéo et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10857911

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10857911

Country of ref document: EP

Kind code of ref document: A1