WO2014192155A1 - Electronic device and method for controlling same - Google Patents

Electronic device and method for controlling same

Info

Publication number
WO2014192155A1
WO2014192155A1 (PCT/JP2013/065279 · JP2013065279W)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
user
communication
wearable terminal
electronic device
Prior art date
Application number
PCT/JP2013/065279
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroaki Komaki (広昭 古牧)
Original Assignee
Kabushiki Kaisha Toshiba (株式会社 東芝)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kabushiki Kaisha Toshiba (株式会社 東芝)
Priority to PCT/JP2013/065279 priority Critical patent/WO2014192155A1/en
Publication of WO2014192155A1 publication Critical patent/WO2014192155A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4108Peripherals receiving signals from specially adapted client devices characterised by an identification number or address, e.g. local network address
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4432Powering on the client, e.g. bootstrap loading using setup parameters being stored locally or received from the server

Definitions

  • the present invention relates to an electronic device and a control method thereof.
  • Electronic devices such as television broadcast receivers, monitor devices and player devices, personal computers, portable tablet terminal devices, mobile phone devices, and other devices that play video and audio have been put into practical use.
  • it has been proposed that such an electronic device recognize the user, that is, the viewer, using, for example, the viewer's face captured by an attached camera or the viewer's voice captured by a microphone, and then start a display or operation mode prepared for each viewer.
  • however, when the viewer is recognized using the viewer's face captured by a camera attached to the electronic device, the viewer must be positioned at a predetermined position so that the camera of the electronic device can capture the viewer's image. Likewise, when the viewer is recognized using a voice captured by a microphone attached to the electronic device, the viewer is required to utter something (speak, etc.).
  • An object of the present invention is to provide an electronic device that detects a user holding a portable terminal device and activates a display or an operation mode prepared for each user, and a control method thereof.
  • the electronic device includes a communication unit, a user specifying unit, and a setting unit.
  • the communication means acquires device information unique to the terminal device, wirelessly or by wire.
  • the user specifying unit specifies a user based on the device information of the terminal device acquired by the communication unit.
  • the setting means sets personal settings related to the user specified by the user specifying means.
  • FIGS. 2 to 8 each illustrate an example of an electronic device and a control method to which the embodiment can be applied.
  • FIG. 9 illustrates an example of the configuration of a video display apparatus (electronic device) to which the embodiment can be applied.
  • FIG. 10 shows an example of the configuration of a portable terminal (wearable terminal) device to which the embodiment can be applied.
  • FIGS. 11A to 11C show examples of the operation of a portable terminal (wearable terminal) device to which the embodiment is applied.
  • FIG. 1 shows an example of an electronic device to which the control method realized by the embodiment can be applied.
  • the electronic device includes, for example, a client device 1, which is a digital television broadcast reception / playback device (hereinafter referred to as a DTV), and a portable terminal device 2, which can be carried by the user or attached to the user's body (hereinafter referred to as a wearable terminal device).
  • data processing (data transmission / reception) between the client device 1 and the wearable terminal device 2 is handled by a communication method control block 3 defined by the combination of the devices (not required for direct communication).
  • the client device 1 is not limited to a DTV, and may be any of various devices capable of reproducing video and audio.
  • for example, the client device 1 may be a monitor device (video display device) connected to a digital recorder (recording device).
  • the client device 1 may also be a personal computer (PC) having a function of receiving broadcasts, a portable terminal device having a function of receiving television broadcasts, or the like.
  • the wearable terminal device 2 may be any device that can transmit (send) data unique to the device, for example a MAC (Media Access Control) address, by at least one communication method.
  • the wearable terminal device 2 can therefore be applied to various devices, such as a portable terminal capable of reproducing video and audio, a measuring device capable of transmitting information about the user wearing it, or an information reproducing device capable of providing information to the user wearing it.
  • the wearable terminal device 2 may take various shapes and usage forms, for example a mobile phone device, a smart phone (multifunctional mobile phone device), a device that incorporates a communication function in a wristband or wristwatch assumed to be worn by the user at all times, a communication terminal device worn on the arm or elsewhere on the body, or a terminal device in the form of a personal item that the user necessarily wears when using it, such as glasses, headphones, or earphones.
  • the wearable terminal device 2 may have a user interface that the user can operate directly, such as a touch panel; however, assuming that the user wears it on the body, it may instead include a sound collection mechanism such as a microphone together with a voice control mechanism (a mechanism that acquires sound and performs control corresponding to the sound), or both may be used together.
  • for a device assumed to be worn on the body, such as a wristband, the wearable terminal device 2 may be capable of direct communication with, for example, the client device 1, or it may support only short-range communication with, for example, a smartphone that the user is expected to carry. That is, the wearable terminal device 2 preferably has a communication (transmission / reception) function for a large amount of data with the client device 1, but, assuming that the user wears it on the body, it may have only a function of transmitting to the outside the position information of the device itself, specified by receiving data such as GPS (Global Positioning System) data (that is, transmitting information usable by the client device).
  • the elements and configurations described for the client device 1 and the wearable terminal device 2 above may be realized by software on a microcomputer (processing device, CPU (Central Processing Unit)) or may be realized by hardware.
  • in the following description, broadcasting includes what a broadcaster (broadcasting station) provides by radio waves propagating in space, what a distribution company distributes by cable (including optical fiber) or over an Internet protocol (IP) communication / distribution network (streaming video signals), what is used on a closed network in a home or small office by means of the DLNA (Digital Living Network Alliance) or WiDi (Wireless Display) standards, and what is supplied via a recording medium such as a semiconductor memory or an optical disc.
  • broadcasting also includes video and audio and / or music, or encoded (coded) data, and provides a program in units of a certain time (broadcasting time), either continuously or for a certain period (time).
  • a program may also be referred to as a content or a stream.
  • video includes moving images and still images, text information represented by text (characters represented by a coded code string of data), and any combination thereof.
  • FIGS. 2 to 6 show examples of combinations of the client device 1 and the wearable terminal device 2 with the communication method or control block 3 shown in FIG. 1.
  • FIG. 2 shows an example using a communication method that allows direct communication between the client device 1 and the wearable terminal device 2 and that does not require the control block 3.
  • for example, when the client device 1 and the wearable terminal device 2 each include (are equipped with) a Bluetooth (registered trademark) communication unit 1-1, 2-1 compliant with IEEE (Institute of Electrical and Electronics Engineers) 802.15.1, the client device 1 can identify the user (holding the wearable terminal device 2) when the user holding (wearing) the wearable terminal device 2 comes within the distance range recommended by the Bluetooth standard from the client device 1.
  • the wearable terminal device 2 (and hence the user) can be recognized by the client device 1 receiving, for example, the MAC address of the wearable terminal device 2.
  • FIG. 3 is an example in which, for example, a wireless LAN (Local Area Network) is used as the control block 3; data can be exchanged between the client device 1 and the wearable terminal device 2 by wireless communication with an access point (AP) 32 connected to a router 31 that can be connected to an external network.
  • in the example of FIG. 3, when the client device 1 and the wearable terminal device 2 each include (are equipped with) a WiFi (registered trademark) communication unit 1-2, 2-2 compliant with IEEE 802.11x (x indicates a classification such as b / g / n), the client device 1 can identify the user (holding the wearable terminal device 2) when the user holding (wearing) the wearable terminal device 2 comes within the distance range in which communication with the access point (AP) 32 is possible.
  • the wearable terminal device 2 (and hence the user) can be recognized by the client device 1 receiving, for example, the MAC address of the wearable terminal device 2.
  • FIG. 4 shows an example in which the client device 1 side is connected to the control block 3 by, for example, Ethernet (registered trademark).
  • in the embodiment shown in FIG. 4, the client device 1 includes a communication unit 1-3 capable of Ethernet connection; data exchange between the client device 1 and a hub 42 connected to the router 41 is wired communication conforming to, for example, X-Base-TX/T (where X is an identification value of 100 or 1000 and TX/T is a designation determined by the combination with that value), and data exchange between the router 41 and the wearable terminal device 2 is WiFi communication through the communication unit 2-2 of the wearable terminal device 2.
  • that is, in the example shown in FIG. 4, when the user holding (wearing) the wearable terminal device 2 comes within the distance range in which communication with the access point (AP) 32 is possible, the client device 1 wired to the hub 42 can identify the user (holding the wearable terminal device 2).
  • the wearable terminal device 2 (and hence the user) can be recognized by the client device 1 receiving, for example, the MAC address of the wearable terminal device 2.
  • FIGS. 5 and 6 show examples in which a remote controller (remote control) 5 attached to the client device 1 is used for communication with the wearable terminal device 2; in these examples the control block 3 is unnecessary, so the embodiment can be used even in an environment where the client device (DTV) 1 is not connected to a network.
  • the control signal between the remote control 5 and the DTV (client device) 1 is, for example, Ir (infrared) communication or RF (wireless) communication specific to the remote control receiving unit 1-5 of the DTV 1 and the remote control 5.
  • in the example shown in FIG. 5, the wearable terminal device 2 and the remote control 5 each include, for example, a (low-speed) magnetic coupling unit (NFC (Near Field Communication)) 2-5, 51 capable of near field communication.
  • accordingly, when the wearable terminal device 2 is positioned within a distance at which the remote control 5 can establish an NFC connection, or the remote control 5 is brought substantially into contact with the wearable terminal device 2 (the remote control 5 is held over the wearable terminal device 2), the remote control 5 can identify the user (holding the wearable terminal device 2).
  • in this case, in the subsequent control of the DTV (client device) 1 by operation of the remote control 5, control can be performed with the user identified.
  • the wearable terminal device 2 (and hence the user) can be recognized when, for example, the DTV 1 receives the MAC address of the wearable terminal device 2.
  • in the example shown in FIG. 6, the wearable terminal device 2 and the remote control 5 each include a communication unit 2-6, 61 compliant with a (high-speed) close proximity wireless communication standard, for example the TransferJet (registered trademark) standard.
  • when the wearable terminal device 2 is positioned within a distance at which the remote control 5 can establish a close proximity connection, or the remote control 5 is brought substantially into contact with the wearable terminal device 2 (the remote control 5 is held over the wearable terminal device 2), the user holding the wearable terminal device 2 can be identified through the communication between the remote control 5 and the remote control receiving unit 1-5 of the DTV 1 (substantially, the input of a control signal from the remote control 5 to the remote control receiving unit 1-5). Therefore, in the subsequent control of the DTV (client device) 1 by operation of the remote control 5, control can be performed with the user identified.
  • in the TransferJet standard, the distance and orientation of the respective communication units may be specified for communication between the wearable terminal device 2 and the remote control 5; it is therefore preferable to provide the remote control 5 with, for example, a mark for bringing the wearable terminal device 2 close to the remote control 5, or a characteristic shape, like a card reader, that allows the remote control 5 to support a part of the wearable terminal device 2.
  • when the remote control 5 is used for communication with the wearable terminal device 2 as in FIGS. 5 and 6, the wearable terminal device 2 can, as shown in FIGS. 7 and 8, be divided into a terminal device (2) carried by the user, such as a smartphone, and a terminal device 21 worn by the user, such as a wristband or wristwatch (measuring equipment), and communication between the wearable terminal device 2 and the terminal device 21 can be shared, for example, by the above-described Bluetooth or WiFi communication.
  • the functions that can be realized by the client device 1 and the wearable terminal device 2 shown in FIGS. 1 to 8 include, for example, user-specific startup services (personalization) classified for each individual user, such as selection of the channel (last channel) at startup, selection of the volume at startup, selection of the video / audio mode, setting / selection of viewing restrictions, selection of the layout / items / content of the portal screen when using a cloud service, selection of the layout / items / content of the user login screen when using a cloud service, selection of a folder (user) for recorded programs, and automatic login to OTT (Over-The-Top) Internet (Web) services such as YouTube (registered trademark), USTREAM (registered trademark), and Dailymotion (registered trademark).
  • with this arrangement, when, for example, a user who turned off the power of the DTV (client device) 1 turns the DTV 1 on again (the user who turned the power off and the user who next turns it on are the same), the DTV 1 can be started with the channel or content supply source that the user had been viewing already identified. That is, the DTV 1 can be started with the volume setting and the sound (for example, surround system reproduction environment) setting that were in effect immediately before the DTV 1 was turned off.
  • for the volume, for example, it is also possible to suppress startup at a high volume at night or in the midnight / early-morning hours, based on time information from a built-in clock.
  • on the other hand, when the user who turned the power off and the user who next turns it on are different, the DTV 1 can be started with the settings associated with the user who turned the power on.
  • when a MAC address cannot be acquired, the DTV 1 is started with standard settings unique to the DTV 1, for example the broadcast channel that was being received when the power was turned off and the volume that was set at that time.
  • in a case where, for example late at night, the channel that was being received when the power was turned off is not currently broadcasting, it is also possible to start from a screen (selection screen) that lists the channels that can be received at that time.
  • the functions described above are defined in advance as a superset in the DTV 1 or the attached remote controller (remote control); which functions are supported (executed) can be defined through communication between the client device 1 and the wearable terminal device 2 (for example, by settings on the wearable terminal device 2 side).
  • FIG. 9 shows an example of the configuration of a television broadcast receiving apparatus that can be used as the DTV shown in FIGS. 1 to 8.
  • the DTV 1 includes a tuner 111, a demodulation unit 112, a signal processing unit 113, an audio processing unit 121, a video processing unit 131, an OSD processing unit 132, a display processing unit 133, a control unit 150, an operation input unit 161, and a light receiving unit 162.
  • the DTV 1 includes a speaker 122 and a display 134.
  • the tuner 111 receives a broadcast signal received by the antenna ANT, for example, and tunes the broadcast signal (channel selection).
  • the tuner 111 inputs the tuned channel broadcast signal to the demodulation unit 112.
  • the tuner 111 can also accept, as external input, content (a program) made up of video and audio. Further, the tuner 111 can process at least two contents at the same time, for example a program of an arbitrary channel supplied over the air and, for example, high-quality HD (High-Definition) video input from an external input. Note that HD video can also be acquired via a network or a dedicated distribution network, for example.
  • the demodulator 112 demodulates the tuned channel broadcast signal or external input content.
  • the demodulator 112 inputs the demodulated broadcast signal (content) to the signal processor 113.
  • the signal processing unit 113 includes at least signal processing means for processing the demodulated broadcast signal (content data), that is, a DSP (Digital Signal Processor).
  • the signal processing unit 113 separates the broadcast signal (content) demodulated by the demodulation unit 112 into a video signal, an audio signal, and other data signals, supplies the audio signal to the audio processing unit 121, and supplies the video signal to the video processing unit 131.
  • the signal processing unit 113 also supplies a data signal to the control unit 150 and / or the OSD processing unit 132.
  • the audio processing unit 121 converts the digital audio signal from the signal processing unit 113 into a signal (audio signal) in a format that can be reproduced by the speaker 122.
  • the audio processing unit 121 supplies an audio signal to the speaker 122.
  • the speaker 122 reproduces sound and / or sound (Audio) from the supplied audio signal.
  • the video processing unit 131 decodes (reproduces) the video signal received from the signal processing unit 113 into a video signal in a format reproducible on the display 134.
  • the video processing unit 131 superimposes the OSD signal supplied from the OSD processing unit 132 on the video signal.
  • the video processing unit 131 outputs the video signal to the display processing unit 133.
  • the OSD processing unit 132 generates an OSD signal for displaying a GUI (Graphical User Interface), subtitles, the time, or other information superimposed on the screen.
  • the display processing unit 133 performs, for example, color, brightness, sharpness, contrast, or other image quality adjustment processing on the received video signal based on the control from the control unit 150.
  • the display processing unit 133 supplies the video signal subjected to the image quality adjustment to the display 134.
  • the display 134 displays video based on the supplied video signal.
  • the display 134 includes, for example, a liquid crystal display device including a liquid crystal display panel including a plurality of pixels arranged in a matrix and a backlight for illuminating the liquid crystal panel.
  • the display 134 displays a video based on the video signal supplied from the DTV 1.
  • the DTV 1 may be configured to include a video output terminal instead of the display 134. Further, the DTV 1 may be configured to include an audio output terminal instead of the speaker 122.
  • the control unit 150 functions as a control unit (control block) that controls the operation of each unit of the DTV 1.
  • the control unit 150 includes a CPU (main control unit) 151, a ROM (read only memory) 152, a RAM (rewritable (random access) memory) 153, an EEPROM (nonvolatile memory) 154, a communication control unit 155, and the like.
  • the control unit 150 performs various processes based on operation signals supplied from the remote controller 163 through the operation input unit 161 or the light receiving unit 162.
  • the CPU 151 includes an arithmetic element that executes various arithmetic processes, a memory area that holds and executes firmware, and the like.
  • the CPU 151 implements various functions by executing programs stored in the ROM 152, the EEPROM 154, or the like.
  • the ROM 152 stores a program for controlling the DTV 1, a program for realizing various functions, and the like.
  • the CPU 151 activates a program stored in the ROM 152 based on an operation signal supplied from the operation input unit 161 or the remote controller 163. The control unit 150 thereby controls the operation of each unit of the DTV 1.
  • the RAM 153 functions as a work memory for the CPU 151. That is, the RAM 153 stores the calculation results of the CPU 151, data read by the CPU 151, input information entered through the operation input unit 161 or a remote controller (hereinafter referred to as a remote control) 163, and the like.
  • the EEPROM 154 stores various setting information, programs, user authentication data, that is, user identification (authentication) information corresponding to MAC addresses, information set as personalization for each user, and the like.
  • the communication control unit 155 controls communication with the outside via the network, for example acquisition of content provided by a distribution company, acceptance of a recording reservation via the Internet (network), or communication with the wearable terminal device 2 using Bluetooth or the like.
  • the storage 160 has a storage medium for storing content.
  • the storage 160 is configured by a hard disk drive (HDD), a solid state drive (SSD), a semiconductor memory, or the like.
  • the storage 160 can store a recording stream acquired by the signal processing unit 113.
  • the operation input unit 161 is an input unit including, for example, an operation key, a keyboard, a mouse, a touch pad, or another input device that can generate an operation signal in response to an operation input.
  • the operation input unit 161 generates an operation signal according to the operation input.
  • the operation input unit 161 supplies the generated operation signal to the control unit 150.
  • the touch pad includes a device that generates position information using a capacitive sensor, a thermal sensor, or another method.
  • the operation input unit 161 may include a touch panel formed integrally with the display 134.
  • the light receiving unit 162 receives an operation signal from, for example, the remote controller 163 (indicated by “5” in FIGS. 5 to 8) and supplies the operation signal to the control unit 150.
  • the control unit 150 decodes the original operation signal transmitted from the remote controller 163 based on the signal supplied from the light receiving unit 162 and inputs the decoded operation signal to the CPU 151.
  • the remote controller 163 generates an operation signal based on a user operation input.
  • the remote controller 163 transmits the generated operation signal to the light receiving unit 162 by infrared communication.
  • the light receiving unit 162 and the remote controller 163 may be configured to transmit and receive operation signals by other wireless communication, such as radio waves of a predetermined frequency, instead of infrared communication.
  • the LAN interface 171 controls connection to other television devices, recorder devices, or tablet devices (wearable terminal devices 2) located on a home network (DLNA) including the access point (AP) 32, connection to an external network (Internet), and communication according to X-Base-TX/T with the hub (Hub) 42 connected to the router 41 described with reference to FIG. 4.
  • the wireless communication unit 172 performs streaming transmission of video / audio conforming to the IEEE 802.11 (a / b / g / n) standards described above with reference to FIG. 3, in particular high-speed transfer using 802.11n, or high-speed communication using Bluetooth (IEEE 802.15.1) described with reference to FIG. 2. Note that the wireless communication unit 172 may be integrated with the LAN interface 171.
  • the HDMI processing unit 173 is an interface including an HDMI terminal that performs communication based on the HDMI (registered trademark) (High-Definition Multimedia Interface) standard.
  • the HDMI terminal of the HDMI processing unit 173 is connected to a device (HDMI device) that complies with the HDMI standard, such as a Blu-ray (registered trademark) recorder, DVD recorder, hard disk recorder, or other recorder device.
  • the HDMI processing unit 173 can receive a stream output from the connected HDMI device.
  • the control unit 150 causes the signal processing unit 113 to input the content data received by the HDMI processing unit 173.
  • the signal processing unit 113 separates a digital video signal, a digital audio signal, and the like from the received content data.
  • the signal processing unit 113 transmits the separated digital video signal to the video processing unit 131 and transmits the separated digital audio signal to the audio processing unit 121.
  • the DTV 1 may include other interfaces, such as Serial-ATA (Serial Advanced Technology Attachment) and HDMI-HEC (HDMI Ethernet Channel).
  • FIG. 10 shows an example of the configuration of a smartphone (multifunctional mobile phone device) that can be used as the wearable terminal device shown in FIGS. 1 to 8.
  • FIG. 10 shows an example of the configuration of the wearable terminal device 2.
  • the display unit 201 operates as a video display unit and can operate as a touch screen.
  • the operation input is recognized by the operation command processing unit 224 of the control unit 220.
  • the operation mode setting unit 226 sets the wearable terminal device 2 to the mobile phone mode.
  • the display unit 201 displays an operation screen for performing dial input.
  • the mobile phone function unit 225 then enters a state in which it can transmit data to the communication partner via the data processing unit 202, the communication control unit 203, and the transceiver 204.
  • a signal from the communication partner is decoded via the transceiver 204, the communication control unit 203, and the data processing unit 202.
  • audio data is reproduced and audio is output from the speaker 206.
  • Audio data from the wearable terminal device 2 side is input to the data processing unit 202 via the microphone 207, encoded, and sent to the communication control unit 203. Thereafter, the transmission data is transmitted to the communication partner via the transceiver 204.
  • the memory 221 holds various data and applications (programs).
  • a battery 223 is prepared as a power source for the operation of the wearable terminal device 2.
  • the wearable terminal device 2 can access an arbitrary server (company) via the Internet (network) and download contents and applications, for example.
  • the downloaded content and application can be transferred to, for example, the DTV 1 shown in FIGS. 1 to 9 based on the control of the data transfer unit 227.
  • the wearable terminal device 2 can request the DTV 1 for program table image data, content reproduction data, or control screen data for controlling the DTV 1.
  • the control screen data includes, for example, data of a menu screen, an image quality adjustment (resolution, brightness, etc.) screen, a color adjustment screen, and a volume adjustment screen.
  • the user can give various adjustment inputs of the DTV 1 via the touch screen of the wearable terminal device 2.
  • the adjustment data, such as the adjustment levels described above, can be stored in the memory 221 and, if necessary, provided to the DTV 1 the next time the power is turned on, following authentication of the MAC address by the DTV 1.
  • the operation of the wearable terminal device 2 will be described below with reference to FIGS. 11A to 11C.
  • it is preferable that the wearable terminal device 2 have a power state acquisition unit 251 that detects, prior to the above-described communication, whether the power of the DTV 1 has already been turned on.
  • FIGS. 11A to 11C illustrate examples of settings for the timing of the user authentication (the start of personalization by MAC address) described with reference to FIGS. 1 to 8, taking a smartphone (multifunctional mobile phone device) usable as the wearable terminal device as an example.
  • for example, when the user has come within a certain distance of the place where the DTV 1 is installed, for example immediately after the user enters the premises of the home or passes through the common area of an apartment building, it can be expected that the DTV 1 will be turned on. On the other hand, it should be avoided that a program (content) already being viewed by another person (family member) is switched on the basis of the personalization of the approaching user holding the wearable terminal device 2.
  • for this reason, the wearable terminal device 2 can be set so that it provides its own MAC address only when there is a MAC address authentication inquiry from the DTV 1 after the DTV 1 has been turned on.
  • for example, a check box such as "Turn on TV automatically" (or a radio button in which only one of "ON" / "OFF" can be enabled) is displayed, and it is preferable to control the power on / off of the DTV 1 in accordance with the setting made by that check box (radio button). Note that the fact that the power of the DTV 1 is off can be acquired, for example, by communication between the power state acquisition unit 251 and the communication control unit 155 (LAN interface 171 / wireless communication unit 172) of the DTV 1.
  • FIG. 11B is an example in which more detailed settings can be made for the setting items; for example, for connection to the network, it is possible to select "set each time" / "connect to the network automatically", following the setting screen shown in FIG. 11A.
  • it is also possible to set a "personalization priority" that determines whether a user's personalization is executed in preference to another person's settings, for example "top priority", "when priority is allowed from another terminal", "not prioritized", and the like.
  • the "personalization priority" is preferably invalidated when a priority has already been set on the DTV 1 side; that is, this is useful in cases where a program already being received (reproduced) on the DTV 1 should not be changed, for a reason such as the presence of a visitor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Software Systems (AREA)
  • Telephone Function (AREA)

Abstract

The present invention provides an electronic device for detecting a user holding a portable terminal device and then activating a display or operation mode prepared for each user, and a method for controlling the same. The electronic device of an embodiment is provided with a communication means, a user specification means, and a setting means. The communication means acquires, by wire or radio, device information specific to the terminal device. The user specification means specifies a user on the basis of the device information about the terminal device as acquired by the communication means. The setting means sets personal settings pertaining to the user specified by the user specification means.

Description

Electronic device and control method thereof

The present invention relates to an electronic device and a control method thereof.

Electronic devices that reproduce video and audio, for example television broadcast receivers, monitor devices and player devices, personal computers, portable tablet terminal devices, and mobile phone devices, have been put into practical use.

For such electronic devices, it has also been proposed to recognize the user, that is, the viewer, using, for example, the viewer's face captured by an attached camera or the viewer's voice captured by a microphone, and to start a display or operation mode prepared for each viewer.

Japanese Patent Application Laid-Open No. 10-211191; Japanese Patent Application Laid-Open No. 10-243309; Japanese Patent Application Laid-Open No. 2010-272077

However, when the viewer is recognized using the viewer's face captured by a camera attached to the electronic device, the viewer must be positioned at a predetermined position so that the camera of the electronic device can capture the viewer's image. Furthermore, when the viewer is recognized using a voice captured by a microphone attached to the electronic device, the viewer is required to utter something (speak, etc.).

An object of the present invention is to provide an electronic device that detects a user holding a portable terminal device and activates a display or operation mode prepared for each user, and a control method thereof.

The electronic device according to the embodiment includes a communication means, a user specifying means, and a setting means. The communication means acquires device information unique to a terminal device, wirelessly or by wire. The user specifying means specifies a user based on the device information of the terminal device acquired by the communication means. The setting means sets personal settings related to the user specified by the user specifying means.
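As a minimal illustration of the three claimed means (communication means, user specifying means, setting means), the following Python sketch shows one way such a flow could be organized. Every class, function, and field name here is hypothetical and is not taken from the publication.

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class PersonalSettings:
    last_channel: int
    volume: int
    av_mode: str


class CommunicationUnit:
    """Communication means: acquires device information unique to a terminal (e.g. a MAC address)."""

    def acquire_device_info(self) -> Optional[str]:
        # In a real device this would come from Bluetooth, WiFi, NFC, etc.
        raise NotImplementedError


class UserSpecifyingUnit:
    """User specifying means: maps acquired device information to a registered user."""

    def __init__(self, registry: Dict[str, str]):
        self.registry = registry  # MAC address -> user name

    def specify_user(self, device_info: Optional[str]) -> Optional[str]:
        return self.registry.get(device_info) if device_info else None


class SettingUnit:
    """Setting means: applies the personal settings of the specified user."""

    def __init__(self, profiles: Dict[str, PersonalSettings], default: PersonalSettings):
        self.profiles = profiles
        self.default = default

    def apply(self, user: Optional[str]) -> PersonalSettings:
        # Fall back to the device's standard settings when no user was specified.
        return self.profiles.get(user, self.default) if user else self.default
```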
FIG. 1 shows an example of an overview of an electronic device and a control method to which the embodiment can be applied. FIGS. 2 to 8 each show an example of an electronic device and a control method to which the embodiment can be applied. FIG. 9 shows an example of the configuration of a video display device (electronic device) to which the embodiment can be applied. FIG. 10 shows an example of the configuration of a portable terminal (wearable terminal) device to which the embodiment can be applied. FIGS. 11A to 11C show examples of the operation of a portable terminal (wearable terminal) device to which the embodiment is applied.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings.

FIG. 1 shows an example of an electronic device to which the control method realized by the embodiment can be applied.

The electronic device includes, for example, a client device 1, which is a digital television broadcast reception / playback device (hereinafter referred to as a DTV), and a portable terminal device 2, which can be carried by the user or attached to the user's body (hereinafter referred to as a wearable terminal device). As shown in FIG. 1, data processing (data transmission / reception) between the client device 1 and the wearable terminal device 2 is handled by a communication method control block 3 defined by the combination of the devices (not required for direct communication).

The client device 1 is not limited to a DTV, and may be any of various devices capable of reproducing video and audio, for example a monitor device (video display device) connected to a digital recorder (recording device). The client device 1 may also be a personal computer (PC) having a function of receiving broadcasts, a portable terminal device having a function of receiving television broadcasts, or the like.

The wearable terminal device 2 may be any device that can transmit (send) data unique to the device, for example a MAC (Media Access Control) address, by at least one communication method, and can be applied to various devices such as a portable terminal capable of reproducing video and audio, a measuring device capable of transmitting information about the user wearing it, or an information reproducing device capable of providing information to the user wearing it.

That is, the wearable terminal device 2 may take various shapes and usage forms, for example a mobile phone device, a smart phone (multifunctional mobile phone device), a device that incorporates a communication function in a wristband or wristwatch assumed to be worn by the user at all times, a communication terminal device worn on the arm or elsewhere on the body, or a terminal device in the form of a personal item that the user necessarily wears when using it, such as glasses, headphones, or earphones. The wearable terminal device 2 may have a user interface that the user can operate directly, such as a touch panel; however, assuming that the user wears it on the body, it may instead include a sound collection mechanism such as a microphone together with a voice control mechanism (a mechanism that acquires sound and performs control corresponding to the sound), or both may be used together.

For a device assumed to be worn on the body, such as a wristband, the wearable terminal device 2 may be capable of direct communication with, for example, the client device 1, or it may support only short-range communication with, for example, a smartphone that the user is expected to carry. That is, the wearable terminal device 2 preferably has a communication (transmission / reception) function for a large amount of data with the client device 1, but, assuming that the user wears it on the body, it may have only a function of transmitting to the outside the position information of the device itself, specified by receiving data such as GPS (Global Positioning System) data (that is, transmitting information usable by the client device).

The elements and configurations described for the client device 1 and the wearable terminal device 2 above may be realized by software on a microcomputer (processing device, CPU (Central Processing Unit)) or may be realized by hardware.

In the following description, broadcasting includes what a broadcaster (broadcasting station) provides by radio waves propagating in space, what a distribution company distributes by cable (including optical fiber) or over an Internet protocol (IP) communication / distribution network (streaming video signals), what is used on a closed network in a home or small office by means of the DLNA (Digital Living Network Alliance) or WiDi (Wireless Display) standards, and what is supplied via a recording medium such as a semiconductor memory or an optical disc. Broadcasting also includes video and audio and / or music, or encoded (coded) data, and provides a program in units of a certain time (broadcasting time), either continuously or for a certain period (time). A program may also be referred to as a content or a stream. Video includes moving images and still images, text information represented by text (characters represented by a coded code string of data), and any combination thereof.
FIGS. 2 to 6 show examples of combinations of the client device 1 and the wearable terminal device 2 with the communication method or control block 3 shown in FIG. 1.

FIG. 2 shows an example using a communication method that allows direct communication between the client device 1 and the wearable terminal device 2 and that does not require the control block 3.

For example, when the client device 1 and the wearable terminal device 2 each include (are equipped with) a Bluetooth (registered trademark) communication unit 1-1, 2-1 compliant with IEEE (Institute of Electrical and Electronics Engineers) 802.15.1, the client device 1 can identify the user (holding the wearable terminal device 2) when the user holding (wearing) the wearable terminal device 2 comes within the distance range recommended by the Bluetooth standard from the client device 1. As described above, the wearable terminal device 2 (and hence the user) can be recognized by the client device 1 receiving, for example, the MAC address of the wearable terminal device 2.
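As a rough sketch of this identification step (the client device learns the MAC address of a nearby wearable terminal and looks up the corresponding user), the following could be one arrangement. The discovery function is a placeholder, since actual Bluetooth discovery depends on the platform stack, and the registry contents are invented for illustration.

```python
from typing import Dict, List

# Hypothetical registry associating wearable MAC addresses with users.
KNOWN_WEARABLES: Dict[str, str] = {
    "AA:BB:CC:DD:EE:01": "user_a",
    "AA:BB:CC:DD:EE:02": "user_b",
}


def discover_nearby_device_addresses() -> List[str]:
    """Placeholder: return MAC addresses of devices currently within Bluetooth range.

    A real implementation would use the platform's Bluetooth stack
    (an inquiry / scan through the operating system's Bluetooth API).
    """
    raise NotImplementedError


def identify_present_users() -> List[str]:
    """Map each discovered address to a registered user, ignoring unknown devices."""
    users = []
    for address in discover_nearby_device_addresses():
        user = KNOWN_WEARABLES.get(address.upper())
        if user is not None:
            users.append(user)
    return users
```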
FIG. 3 is an example in which, for example, a wireless LAN (Local Area Network) is used as the control block 3; data can be exchanged between the client device 1 and the wearable terminal device 2 by wireless communication with an access point (AP) 32 connected to a router 31 that can be connected to an external network.

In the example shown in FIG. 3, when the client device 1 and the wearable terminal device 2 each include (are equipped with) a WiFi (registered trademark) communication unit 1-2, 2-2 compliant with IEEE 802.11x (x indicates a classification such as b / g / n), the client device 1 can identify the user (holding the wearable terminal device 2) when the user holding (wearing) the wearable terminal device 2 comes within the distance range in which communication with the access point (AP) 32 is possible. As described above, the wearable terminal device 2 (and hence the user) can be recognized by the client device 1 receiving, for example, the MAC address of the wearable terminal device 2.

FIG. 4 shows an example in which the client device 1 side is connected to the control block 3 by, for example, Ethernet (registered trademark).

In the embodiment shown in FIG. 4, the client device 1 includes a communication unit 1-3 capable of Ethernet connection; data exchange between the client device 1 and a hub (Hub) 42 connected to the router 41 is wired communication conforming to, for example, X-Base-TX/T (where X is an identification value of 100 or 1000 and TX/T is a designation determined by the combination with that value), and data exchange between the router 41 and the wearable terminal device 2 is WiFi communication through the communication unit 2-2 of the wearable terminal device 2.

That is, in the example shown in FIG. 4, when the user holding (wearing) the wearable terminal device 2 comes within the distance range in which communication with the access point (AP) 32 is possible, the client device 1 wired to the hub 42 can identify the user (holding the wearable terminal device 2). As described above, the wearable terminal device 2 (and hence the user) can be recognized by the client device 1 receiving, for example, the MAC address of the wearable terminal device 2.
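For the wireless-LAN and Ethernet arrangements of FIGS. 3 and 4, one conceivable, non-authoritative way for a Linux-based client device to notice that a registered wearable terminal has joined the same network is to check the kernel's ARP / neighbor table for its MAC address, as sketched below. The table path and the registry are assumptions made for illustration, not part of the publication.

```python
from typing import Dict, List

KNOWN_WEARABLES: Dict[str, str] = {"aa:bb:cc:dd:ee:01": "user_a"}  # hypothetical registry


def wearables_on_lan(arp_path: str = "/proc/net/arp") -> List[str]:
    """Return users whose wearable MAC address appears in the local ARP table."""
    users = []
    with open(arp_path) as f:
        next(f)  # skip the header line
        for line in f:
            fields = line.split()
            if len(fields) >= 4:
                mac = fields[3].lower()   # the "HW address" column
                user = KNOWN_WEARABLES.get(mac)
                if user:
                    users.append(user)
    return users
```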
FIGS. 5 and 6 show examples in which a remote controller (remote control) 5 attached to the client device 1 is used for communication with the wearable terminal device 2. That is, in the examples shown in FIGS. 5 and 6, the control block 3 is unnecessary, and the embodiment can be used even in an environment where the client device (DTV) 1 is not connected to a network. The control signal between the remote control 5 and the DTV (client device) 1 is, for example, Ir (infrared) communication or RF (wireless) communication specific to the remote control receiving unit 1-5 of the DTV 1 and the remote control 5.

In the example shown in FIG. 5, the wearable terminal device 2 and the remote control 5 each include, for example, a (low-speed) magnetic coupling unit (NFC (Near Field Communication)) 2-5, 51 capable of near field communication. Accordingly, when the wearable terminal device 2 is positioned within a distance at which the remote control 5 can establish an NFC connection, or the remote control 5 is brought substantially into contact with the wearable terminal device 2 (the remote control 5 is held over the wearable terminal device 2), the remote control 5 can identify the user (holding the wearable terminal device 2).

In this case, in the subsequent control of the DTV (client device) 1 by operation of the remote control 5, control can be performed with the user identified. As described above, the wearable terminal device 2 (and hence the user) can be recognized when, for example, the DTV 1 receives the MAC address of the wearable terminal device 2.

On the other hand, in the example shown in FIG. 6, the wearable terminal device 2 and the remote control 5 each include a communication unit 2-6, 61 compliant with a (high-speed) close proximity wireless communication standard, for example the TransferJet (registered trademark) standard. When the wearable terminal device 2 is positioned within a distance at which the remote control 5 can establish a close proximity connection, or the remote control 5 is brought substantially into contact with the wearable terminal device 2 (the remote control 5 is held over the wearable terminal device 2), the user holding the wearable terminal device 2 can be identified through the communication between the remote control 5 and the remote control receiving unit 1-5 of the DTV 1 (substantially, the input of a control signal from the remote control 5 to the remote control receiving unit 1-5). Therefore, in the subsequent control of the DTV (client device) 1 by operation of the remote control 5, control can be performed with the user identified.

In the TransferJet standard, the distance and orientation of the respective communication units may be specified for communication between the wearable terminal device 2 and the remote control 5; it is therefore preferable to provide the remote control 5 with, for example, a mark for bringing the wearable terminal device 2 close to the remote control 5, or a characteristic shape, like a card reader, that allows the remote control 5 to support a part of the wearable terminal device 2.

When the remote control 5 is used for communication with the wearable terminal device 2 as shown in FIGS. 5 and 6, the wearable terminal device 2 can also, as shown in FIGS. 7 and 8, be divided into a terminal device (2) carried by the user, such as a smartphone, and a terminal device 21 worn by the user, such as a wristband or wristwatch (measuring equipment), and the communication between the wearable terminal device 2 and the terminal device 21 can be shared, for example, by the above-described Bluetooth or WiFi communication.
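In the remote-control based arrangements of FIGS. 5 to 8, the remote control reads the wearable terminal's identifier over NFC (or TransferJet) and the DTV then operates with that user identified. A purely illustrative sketch of that message flow might look as follows; the transport details and message format are placeholders, not an actual remote-control protocol.

```python
from typing import Dict, Optional


class RemoteControl:
    """Reads the wearable terminal's MAC address over NFC and attaches it to key commands."""

    def __init__(self) -> None:
        self.current_user_mac: Optional[str] = None

    def on_nfc_read(self, wearable_mac: str) -> None:
        # Called when the wearable terminal is held over the remote control.
        self.current_user_mac = wearable_mac

    def send_key(self, key: str) -> Dict[str, Optional[str]]:
        # The control message carries the key code and, if known, the MAC address
        # identifying the user; in the device it would travel over Ir or RF.
        return {"key": key, "wearable_mac": self.current_user_mac}


class Dtv:
    """Applies per-user settings once a control message identifies the user."""

    def __init__(self, registry: Dict[str, str]) -> None:
        self.registry = registry                  # MAC address -> user name
        self.active_user: Optional[str] = None

    def on_control_message(self, message: Dict[str, Optional[str]]) -> None:
        mac = message.get("wearable_mac")
        if mac:
            self.active_user = self.registry.get(mac, self.active_user)
        # ...then handle message["key"] with the active user's settings applied.
```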
The functions that can be realized by the client device 1 and the wearable terminal device 2 shown in FIGS. 1 to 8 include, for example, the following user-specific startup services (personalization), distinguished for each individual user (a per-user record of this kind is sketched below):
  selection of the channel (last channel) at startup
  selection of the volume at startup
  selection of the video/audio mode
  setting/selection of viewing restrictions
  selection of the layout/items/content of the portal screen when using a cloud service
  selection of the layout/items/content of the user login screen when using a cloud service
  selection of the folder (user) for recorded programs
  automatic login to OTT (Over-The-Top) Internet (Web) services, for example
    YouTube (registered trademark)
    USTREAM (registered trademark)
    Dailymotion (registered trademark).
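As one way of picturing these per-user startup services, the sketch below models a personalization record holding the listed items and applies it at startup; the field names, default values, and the idea of storing OTT logins as a simple mapping are assumptions made for illustration only.

```python
# Hypothetical per-user personalization record covering the items listed above.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class Personalization:
    last_channel: int = 1                      # channel selected at startup
    startup_volume: int = 20                   # volume at startup
    av_mode: str = "standard"                  # video/audio mode
    viewing_restriction: str = "none"          # viewing restriction setting
    portal_layout: str = "default"             # cloud-service portal screen layout
    login_screen_layout: str = "default"       # cloud-service login screen layout
    recorded_folder: str = "shared"            # folder (user) for recorded programs
    ott_logins: Dict[str, str] = field(default_factory=dict)  # service -> account


def apply_at_startup(profile: Personalization) -> None:
    # In a real device these calls would drive the tuner, audio path, and UI;
    # here they are just printed to show the order of application.
    print(f"tune to channel {profile.last_channel}")
    print(f"set volume to {profile.startup_volume}")
    print(f"set AV mode to {profile.av_mode}")
    for service in profile.ott_logins:
        print(f"log in to {service} automatically")


apply_at_startup(Personalization(last_channel=8, startup_volume=12,
                                 ott_logins={"YouTube": "user_a"}))
```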
Thus, for example, when a user who turned off the power of the DTV (client device) 1 turns the power of the DTV 1 on again (that is, the user who turned the power off and the user who next turns it on are the same), the DTV 1 can start up having identified the channel and the content source that the user had been watching. That is, the DTV 1 can start with the volume setting and the audio (for example, surround-system reproduction environment) setting as they were immediately before the DTV 1 was turned off. Note that, for the volume for example, startup at a loud volume can be suppressed in consideration of nighttime or late-night/early-morning hours, based on the time information of a built-in clock.
On the other hand, when the user who turned the power off and the user who next turns it on are different, the DTV 1 can start up with the settings associated with the user who turned the power on.
If the MAC address cannot be acquired, the DTV 1 starts up with its own standard settings, for example the broadcast channel that was being received and the volume that was set at the time the power was turned off. Note that, for example in the late-night hours, if the channel that was being received when the power was turned off is off the air, the DTV 1 can also start from a screen (selection screen) that lists the channels receivable at that time.
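A simplified sketch of this startup decision is given below, under the assumption of a 24-hour clock and an arbitrary 22:00-06:00 quiet window with a fixed volume cap; the patent itself does not fix these values, and the function and constant names are illustrative only.

```python
# Hypothetical startup-settings selection: same user, different user, or no MAC acquired.
from typing import Dict, Optional

STANDARD = {"channel": 1, "volume": 20}        # device-specific standard settings
NIGHT_VOLUME_CAP = 10                          # assumed cap for late-night startup


def startup_settings(mac_at_power_on: Optional[str],
                     mac_at_power_off: Optional[str],
                     last_settings: Dict[str, int],
                     user_settings: Dict[str, Dict[str, int]],
                     hour: int) -> Dict[str, int]:
    if mac_at_power_on is None:
        settings = dict(last_settings or STANDARD)   # no MAC: standard/last settings
    elif mac_at_power_on == mac_at_power_off:
        settings = dict(last_settings)               # same user: restore previous state
    else:
        settings = dict(user_settings.get(mac_at_power_on, STANDARD))  # different user
    if hour >= 22 or hour < 6:                       # suppress loud startup at night
        settings["volume"] = min(settings.get("volume", STANDARD["volume"]),
                                 NIGHT_VOLUME_CAP)
    return settings


print(startup_settings("AA:01", "AA:01", {"channel": 8, "volume": 25}, {}, hour=23))
# -> {'channel': 8, 'volume': 10}
```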
The functions described above are defined in advance as a superset in the DTV 1 or in the attached remote controller, and which functions are supported (executed) can be defined through communication between the client device 1 and the wearable terminal device 2 (for example, by settings on the wearable terminal device 2 side).
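One way to read this is as a capability negotiation over a predefined superset of feature flags; the sketch below, with invented flag names, shows the device enabling only the subset that the wearable terminal requests and that is present in the superset.

```python
# Hypothetical capability negotiation against a predefined superset of functions.
SUPERSET = {
    "last_channel", "startup_volume", "av_mode", "viewing_restriction",
    "portal_layout", "login_screen_layout", "recorded_folder", "ott_autologin",
}


def negotiate(requested_by_wearable: set) -> set:
    # Only functions present in the superset defined on the DTV/remote side
    # can actually be enabled; anything else is silently ignored.
    return SUPERSET & requested_by_wearable


enabled = negotiate({"last_channel", "startup_volume", "voice_control"})
print(sorted(enabled))  # -> ['last_channel', 'startup_volume']
```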
FIG. 9 shows an example of the configuration of a television broadcast receiving apparatus that can be used as the DTV shown in FIGS. 1 to 8.
As shown by way of example in FIG. 9, the DTV 1 includes a tuner 111, a demodulation unit 112, a signal processing unit 113, an audio processing unit 121, a video processing unit 131, an OSD processing unit 132, a display processing unit 133, a control unit 150, an operation input unit 161, and a light receiving unit 162. The DTV 1 also includes a speaker 122 and a display 134.
The tuner 111 receives, for example, a broadcast signal received by the antenna ANT and tunes the broadcast signal (channel selection). The tuner 111 inputs the broadcast signal of the tuned channel to the demodulation unit 112. Note that the tuner 111 can also accept the input of content (a program) consisting of video and audio as an external input. Further, the tuner 111 can process at least two items of content at the same time, for example a program of an arbitrary channel supplied over the air and high-image-quality HD (High-Definition) video input from an external input. Note that HD video can also be acquired via, for example, a network or a dedicated distribution network.
The demodulation unit 112 demodulates the broadcast signal of the tuned channel or the externally input content. The demodulation unit 112 inputs the demodulated broadcast signal (content) to the signal processing unit 113.
The signal processing unit 113 includes at least signal processing means for processing the demodulated broadcast signal (content data), that is, a DSP (Digital Signal Processor). The signal processing unit 113 separates the broadcast signal (content) demodulated by the demodulation unit 112 into a video signal, an audio signal, and other data signals, and supplies the audio signal to the audio processing unit 121 and the video signal to the video processing unit 131. The signal processing unit 113 also supplies data signals to the control unit 150 and/or the OSD processing unit 132.
The audio processing unit 121 converts the digital audio signal from the signal processing unit 113 into a signal (audio signal) in a format that can be reproduced by the speaker 122. The audio processing unit 121 supplies the audio signal to the speaker 122. The speaker 122 reproduces voice and/or sound (audio) from the supplied audio signal.
The video processing unit 131 decodes (reproduces) the video signal received from the signal processing unit 113 into a video signal in a format that can be reproduced on the display 134. The video processing unit 131 also superimposes the OSD signal supplied from the OSD processing unit 132 on the video signal. The video processing unit 131 outputs the video signal to the display processing unit 133.
The OSD processing unit 132 generates, based on the data signal supplied from the signal processing unit 113 and/or the control signal supplied from the control unit 150, an OSD signal for displaying a GUI (Graphical User Interface) screen, subtitles, the time, or other information superimposed on the screen.
The display processing unit 133 performs, for example, color tone, brightness, sharpness, contrast, or other image quality adjustment processing on the received video signal, based on control from the control unit 150. The display processing unit 133 supplies the image-quality-adjusted video signal to the display 134. The display 134 displays video based on the supplied video signal.
The display 134 includes, for example, a liquid crystal display device including a liquid crystal display panel having a plurality of pixels arranged in a matrix and a backlight for illuminating the liquid crystal panel. The display 134 displays video based on the video signal supplied from the DTV 1.
Note that the DTV 1 may be configured to include a video output terminal instead of the display 134. The DTV 1 may also be configured to include an audio output terminal instead of the speaker 122.
The control unit 150 functions as control means (a control block) that controls the operation of each unit of the DTV 1. The control unit 150 includes a CPU (main control unit) 151, a ROM (read-only memory) 152, a RAM (rewritable (random access) memory) 153, an EEPROM (non-volatile memory) 154, a communication control unit 155, and the like. The control unit 150 performs various processes based on operation signals supplied from the operation input unit 161 or from the remote controller 163 through the light receiving unit 162.
The CPU 151 includes an arithmetic element that executes various arithmetic processes, a memory area that holds and executes firmware, and the like. The CPU 151 implements various functions by executing programs stored in the ROM 152, the EEPROM 154, or the like.
The ROM 152 stores a program for controlling the DTV 1, programs for implementing various functions, and the like. The CPU 151 starts a program stored in the ROM 152 based on an operation signal supplied from the operation input unit 161 or the remote controller 163. The control unit 150 thereby controls the operation of each unit.
The RAM 153 functions as a work memory for the CPU 151. That is, the RAM 153 stores the results of calculations by the CPU 151, data read by the CPU 151, input information entered through the operation input unit 161 or the remote controller 163, and the like.
The EEPROM 154 stores various setting information, programs, and user authentication information, that is, the identification (authentication) of the user corresponding to a MAC address and the information set as personalization for that user, and the like.
The communication control unit 155 controls communication with the outside via the network, for example the acquisition of content provided by a distribution provider, the acceptance of recording reservations via the Internet (network), and communication with the wearable terminal device 2 using Bluetooth or the like.
The storage 160 has a storage medium for storing content. For example, the storage 160 is configured by a hard disk drive (HDD), a solid-state drive (SSD), a semiconductor memory, or the like. The storage 160 can store a recording stream acquired by the signal processing unit 113.
The operation input unit 161 is input means including, for example, operation keys, a keyboard, a mouse, a touch pad, or another input device that can generate an operation signal in response to an operation input. For example, the operation input unit 161 generates an operation signal in response to an operation input. The operation input unit 161 supplies the generated operation signal to the control unit 150. Note that the touch pad includes a device that generates position information based on a capacitance sensor, a thermo sensor, or another method. When the DTV 1 includes the display 134, the operation input unit 161 may include a touch panel formed integrally with the display 134.
The light receiving unit 162 receives an operation signal from, for example, the remote controller 163 (denoted by "5" in FIGS. 5 to 8) and supplies it to the control unit 150. The control unit 150 decodes the original operation signal transmitted from the remote controller 163 based on the signal supplied from the light receiving unit 162 and inputs it to the CPU 151.
The remote controller 163 generates an operation signal based on a user's operation input. The remote controller 163 transmits the generated operation signal to the light receiving unit 162 by infrared communication. Note that the light receiving unit 162 and the remote controller 163 may be configured to transmit and receive operation signals by other wireless communication, for example radio waves of a predetermined frequency, instead of infrared communication.
The LAN interface 171 controls the interconnection of other television devices, recorder devices, or tablet devices (the wearable terminal device 2) located on the home network (DLNA) including the access point (AP) 32, the exchange of data with an external network (the Internet), the downloading (acquisition) of content, and communication in accordance with xBASE-TX/TP with the hub (Hub) 42 connected to the router 41 described with reference to FIG. 4, and the like.
The wireless communication unit 172 performs streaming transmission of video and audio compliant with the IEEE 802.11 (a/b/g/n) standard described above with reference to FIG. 3, in particular high-speed transfer using 802.11n, or high-speed communication using Bluetooth (IEEE 802.15.1) described with reference to FIG. 2. Note that the wireless communication unit 172 may be integrated with the LAN interface 171.
The HDMI processing unit 173 is an interface including an HDMI terminal that performs communication based on HDMI (registered trademark), that is, the HDMI (High-Definition Multimedia Interface) standard. A device compliant with the HDMI standard (HDMI device), such as a Blu-ray (registered trademark) recorder, a DVD recorder, a hard disk recorder, or another recorder device, is connected to the HDMI terminal of the HDMI processing unit 173. The HDMI processing unit 173 can receive a stream output from the connected HDMI device.
The control unit 150 causes the content data received by the HDMI processing unit 173 to be input to the signal processing unit 113. The signal processing unit 113 separates a digital video signal, a digital audio signal, and the like from the received content data. The signal processing unit 113 transmits the separated digital video signal to the video processing unit 131 and the separated digital audio signal to the audio processing unit 121.
The DTV 1 may also include other interfaces such as Serial ATA (Serial Advanced Technology Attachment) and HDMI-HEC (HDMI Ethernet Channel).
FIG. 10 shows an example of the configuration of a smartphone (multifunctional mobile phone device) that can be used as the wearable terminal device shown in FIGS. 1 to 8.
FIG. 10 shows an example of the configuration of the wearable terminal device 2. The display unit 201 operates as a video display unit and can also operate as a touch screen.
In the wearable terminal device 2, when the user touches a desired item while the menu screen is displayed, the operation input is recognized by the operation command processing unit 224 of the control unit 220.
For example, when the touch operation is an operation input selecting the telephone function, the operation mode setting unit 226 sets the wearable terminal device 2 to the mobile phone mode. In this case, the display unit 201 displays an operation screen for performing dial input. Thereafter, when a dial input operation for a desired destination is performed, the mobile phone function unit 225 enters a transmitting state that enables communication with the communication partner via the data processing unit 202, the communication control unit 203, and the transceiver 204.
When the line to the communication destination is connected, the signal from the communication partner is decoded via the transceiver 204, the communication control unit 203, and the data processing unit 202. Audio data is then reproduced, and audio is output from the speaker 206. Audio data from the wearable terminal device 2 side is input to the data processing unit 202 via the microphone 207, encoded, and sent to the communication control unit 203. It is then transmitted as transmission data to the communication partner via the transceiver 204. The memory 221 holds various data and applications (programs). Note that a battery 223 is provided as a power source for the operation of the wearable terminal device 2.
The wearable terminal device 2 can, for example, access an arbitrary server (company) via the Internet (network) and download content and applications. The downloaded content and applications can be transferred, under the control of the data transfer unit 227, to, for example, the DTV 1 shown in FIGS. 1 to 9.
The wearable terminal device 2 can request, from the DTV 1, image data of the program guide, content reproduction data, or control screen data for controlling the DTV 1. The control screen data includes, for example, data of a menu screen, an image quality adjustment (resolution, brightness, etc.) screen, a color adjustment screen, and a volume adjustment screen. When the control screen has been acquired, the user can give various adjustment inputs for the DTV 1 via the touch screen of the wearable terminal device 2. The adjustment data such as the adjustment levels described above can be stored in the memory 221 and, following authentication of the MAC address by the DTV 1, provided to the DTV 1 as necessary at the next power-on.
Note that, as described below with reference to FIGS. 11A to 11C, the wearable terminal device 2 preferably has a power state acquisition unit 251 that detects, prior to the above-described communication, that the power of the DTV 1 is already on as the current operation state of the DTV 1.
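A rough, assumed sketch of the wearable-side sequence implied here follows: check the DTV's power state first, and only once the DTV is on and asking for authentication provide the MAC address together with the cached adjustment data. The class and method names are illustrative, not part of the embodiment.

```python
# Hypothetical wearable-side flow: power-state check, MAC authentication, then
# provision of cached adjustment data. All names are illustrative.
from typing import Dict, Optional


class WearableTerminal:
    def __init__(self, mac: str) -> None:
        self.mac = mac
        self.cached_adjustments: Dict[str, int] = {}   # held in the terminal's memory

    def remember_adjustment(self, name: str, level: int) -> None:
        self.cached_adjustments[name] = level

    def on_dtv_query(self, dtv_power_is_on: bool) -> Optional[Dict[str, object]]:
        # Corresponds to the role of the power state acquisition unit: talk to the
        # DTV only once it is known to be powered on and requesting authentication.
        if not dtv_power_is_on:
            return None
        return {"mac": self.mac, "adjustments": dict(self.cached_adjustments)}


wearable = WearableTerminal("AA:BB:CC:DD:EE:01")
wearable.remember_adjustment("volume", 12)
print(wearable.on_dtv_query(dtv_power_is_on=True))
```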
FIGS. 11A to 11C illustrate setting examples for setting the timing of the user authentication (the start of personalization by MAC address) described with reference to FIGS. 1 to 8, taking as an example a smartphone (multifunctional mobile phone device) that can be used as the wearable terminal device.
In the examples shown in FIGS. 2 to 4, it can be expected that the power of the DTV 1 is turned on when the user has approached within a certain distance of the place where the DTV 1 is installed, for example immediately after the user enters the premises of his or her home or while the user is passing through the common area of an apartment building. Furthermore, it should be avoided that a program (content) that another person (family member) is already watching is switched, based on the user's personalization, by the approach of the user holding the wearable terminal device 2.
For this reason, for example on the setting screen of the wearable terminal device 2 shown in FIG. 11A, a setting can be made so that the device provides its own MAC address only when there is a MAC-address authentication inquiry from the DTV 1 after the power of the DTV 1 has been turned on.
As shown in FIG. 11A, it is preferable to display on the setting screen a check box such as "automatically turn on the TV" (or radio buttons of which only one of "turn on"/"do not turn on" is effective), and to control the power on/off of the DTV 1 in accordance with the setting made with the check box (radio buttons). Note that the fact that the power of the DTV 1 was off can be acquired, for example, by communication between the power state acquisition unit 251 and the communication control unit 155 (LAN interface 171 / wireless communication unit 172) of the DTV 1.
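The sketch below is one hypothetical reading of these two settings: the wearable answers a MAC inquiry only when that behavior is enabled and the DTV is already on, and it requests power-on only when the "automatically turn on the TV" box is checked. The field names are invented for illustration.

```python
# Hypothetical handling of the FIG. 11A settings on the wearable terminal side.
from dataclasses import dataclass
from typing import List


@dataclass
class AuthSettings:
    auto_power_on_tv: bool = False     # "automatically turn on the TV" check box
    answer_mac_inquiry: bool = True    # provide MAC only when the DTV asks after power-on


def handle_dtv_event(settings: AuthSettings, dtv_power_is_on: bool,
                     inquiry_received: bool, own_mac: str) -> List[str]:
    actions = []
    if not dtv_power_is_on and settings.auto_power_on_tv:
        actions.append("send power-on request")
    if dtv_power_is_on and inquiry_received and settings.answer_mac_inquiry:
        actions.append(f"provide MAC {own_mac}")
    return actions


print(handle_dtv_event(AuthSettings(auto_power_on_tv=True), dtv_power_is_on=False,
                       inquiry_received=False, own_mac="AA:BB:CC:DD:EE:01"))
# -> ['send power-on request']
```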
Note that FIG. 11B is an example that allows more detailed settings for the setting items; following the setting screen shown in FIG. 11A, it allows the user to choose, for example for the connection to the network, between "set each time", "connect automatically", and the like.
Further, as shown in FIG. 11C, following the setting screen shown in FIG. 11A, a "personalization priority" that allows the user's personalization to be executed in preference to, for example, other persons' settings can be set, for example to "highest priority", "only when priority is permitted by another terminal", "do not prioritize", and the like. However, the "personalization priority" is preferably invalidated when a priority has already been set on the DTV 1 side. This is useful, for example, when a program already being received (reproduced) on the DTV 1 should not be changed, for reasons such as the presence of a visitor.
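As an illustration only, the following sketch resolves the "personalization priority" of FIG. 11C against a priority that may already be set on the DTV side, in which case the wearable's preference is ignored; the enum values and argument names are assumptions.

```python
# Hypothetical resolution of the FIG. 11C "personalization priority" setting.
from enum import Enum


class Priority(Enum):
    HIGHEST = 3
    IF_PERMITTED = 2
    DO_NOT_PRIORITIZE = 1


def may_personalize(wearable_priority: Priority,
                    dtv_priority_locked: bool,
                    other_terminal_permits: bool) -> bool:
    # A priority already set on the DTV side (e.g. a visitor is present and the
    # current program must not be switched) invalidates the wearable's setting.
    if dtv_priority_locked:
        return False
    if wearable_priority is Priority.HIGHEST:
        return True
    if wearable_priority is Priority.IF_PERMITTED:
        return other_terminal_permits
    return False


print(may_personalize(Priority.HIGHEST, dtv_priority_locked=True,
                      other_terminal_permits=False))
# -> False: the program already being watched is not switched
```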
Although several embodiments of the present invention have been described, these embodiments are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and its equivalents.

Claims (8)

  1.  An electronic device comprising:
      communication means for acquiring, wirelessly or by wire, device information unique to a terminal device;
      user identification means for identifying a user based on the device information of the terminal device acquired by the communication means; and
      setting means for setting personal settings related to the user identified by the user identification means.
  2.  The electronic device of claim 1, wherein the communication means includes a remote controller that inputs a control instruction to the setting means.
  3.  The electronic device of claim 1, wherein the device information includes a MAC (Media Access Control) address assigned to each terminal device.
  4.  The electronic device of claim 1, wherein the communication means inquires of the terminal device about the execution timing of the setting, prior to execution of the setting according to the terminal device.
  5.  The electronic device of claim 1, further comprising display means for displaying video.
  6.  A method of controlling an electronic device, comprising:
      acquiring, wirelessly or by wire, device information unique to a terminal device;
      identifying a user based on the acquired device information; and
      setting personal settings related to the identified user.
  7.  The method of controlling an electronic device of claim 6, wherein the device information includes a MAC (Media Access Control) address assigned to each terminal device.
  8.  The method of controlling an electronic device of claim 6, wherein the terminal device is inquired of about the execution timing of the setting prior to execution of the setting.
PCT/JP2013/065279 2013-05-31 2013-05-31 Electronic device and method for controlling same WO2014192155A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/065279 WO2014192155A1 (en) 2013-05-31 2013-05-31 Electronic device and method for controlling same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/065279 WO2014192155A1 (en) 2013-05-31 2013-05-31 Electronic device and method for controlling same

Publications (1)

Publication Number Publication Date
WO2014192155A1 true WO2014192155A1 (en) 2014-12-04

Family

ID=51988223

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/065279 WO2014192155A1 (en) 2013-05-31 2013-05-31 Electronic device and method for controlling same

Country Status (1)

Country Link
WO (1) WO2014192155A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004214976A (en) * 2002-12-27 2004-07-29 Sharp Corp Av data transmitting apparatus, av data receiving apparatus, av data wireless communication system, and electronic apparatus
JP2005117185A (en) * 2003-10-03 2005-04-28 Canon Inc Digital television set
JP2009118231A (en) * 2007-11-07 2009-05-28 D-Link Japan Kk Information relay system and communication terminal
JP2010087755A (en) * 2008-09-30 2010-04-15 Kddi Corp Viewing restriction method and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 13886103
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 13886103
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP