WO2021028994A1 - Device controller, device control method, and device control program - Google Patents


Info

Publication number
WO2021028994A1
WO2021028994A1 (application PCT/JP2019/031750; JP2019031750W)
Authority
WO
WIPO (PCT)
Prior art keywords
control
unit
control signal
signal
type
Prior art date
Application number
PCT/JP2019/031750
Other languages
French (fr)
Japanese (ja)
Inventor
Kaname Hayashi (林 要)
Yasuhiro Hirano (平野 康博)
Original Assignee
Groove X株式会社
Application filed by Groove X株式会社 filed Critical Groove X株式会社
Priority to JP2021539727A priority Critical patent/JPWO2021028994A1/ja
Priority to PCT/JP2019/031750 priority patent/WO2021028994A1/en
Publication of WO2021028994A1 publication Critical patent/WO2021028994A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/08 Speech classification or search
    • G10L 15/10 Speech classification or search using distance or distortion measures between unknown speech and reference templates
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q SELECTING
    • H04Q 9/00 Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • The disclosed technology relates to a device control device, a device control method, and a device control program.
  • A technique for setting an electronic device is known (see, for example, Japanese Patent Application Laid-Open No. 2015-76775).
  • In this technique, a smartphone, which is an example of an electronic device, performs short-range wireless communication with a television receiver. The smartphone receives setting information transmitted from the television receiver; this setting information is information for remotely controlling the television receiver. Using it, the smartphone can control the operation of the television receiver.
  • Here, the television receiver, which is the device to be controlled, needs to have a function of transmitting information for its own remote control to the smartphone.
  • If the device to be controlled does not have such a function, a function for sending and receiving information about remote control must be added to it. Consequently, if the device to be controlled cannot transmit and receive information related to its own remote control, it cannot easily be controlled by another device.
  • The disclosed technology therefore aims to provide a device control device, method, and program that can easily control a device to be controlled even when that device has no function of sending and receiving information about its own remote control.
  • The device control device of the disclosed technology includes: a control signal receiving unit that receives a control signal representing control content for a device; a control signal transmitting unit that transmits a control signal representing control content to a device; a device type recognition unit that recognizes, based on the control signal received by the control signal receiving unit, the type of device represented by that control signal; a registration unit that registers the device type recognized by the device type recognition unit in a first storage unit; an instruction signal acquisition unit that acquires an instruction signal, which is a signal of a type different from the control signal and represents control content for a device; a control target device type recognition unit that recognizes, based on the instruction signal acquired by the instruction signal acquisition unit and the device types stored in the first storage unit, the type of the control target device represented by the instruction signal; a control signal recognition unit that, by referring to a second storage unit in which control signals corresponding to device types and control contents are stored, recognizes the target control signal corresponding to the recognized control target device type and to the control content for that device included in the instruction signal; and a signal transmission control unit that causes the control signal transmitting unit to transmit the target control signal to the control target device.
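The claimed arrangement of units can be sketched as a small class; all names and table contents below are illustrative assumptions, not identifiers from the patent.

```python
class DeviceControlDevice:
    """Structural sketch of the claimed units; names are illustrative."""

    def __init__(self):
        self.device_type_store = set()          # first storage unit: learned device types
        self.control_signal_store = {           # second storage unit: (type, content) -> signal
            ("Television", "ON"): "10101*",
            ("Television", "OFF"): "01010*",
        }

    def receive_control_signal(self, signal, device_type):
        # control signal receiving unit + device type recognition unit
        # (the type is passed in here; its recognition is sketched later)
        self.device_type_store.add(device_type)  # registration unit

    def handle_instruction(self, device_type, content):
        # control target device type recognition + control signal recognition
        if device_type not in self.device_type_store:
            raise KeyError("unknown device type")
        # signal transmission control would send the returned target signal
        return self.control_signal_store[(device_type, content)]
```

The point of the split is that learning (from observed remote-control traffic) and control (from instruction signals such as speech) share only the first storage unit.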
  • FIG. 1 shows an explanatory diagram for explaining the outline of the present embodiment.
  • In the present embodiment, it is assumed that a control signal for the device to be controlled is output when the user 140 operates the remote controller 142 (also commonly called a remote control).
  • the devices to be controlled are, for example, a television 144, an air conditioner 146, and the like.
  • the control signal is infrared data output from the remote controller 142, and represents the content of control to the device.
  • the robot 100 when a control signal is output from the remote controller 142 to the device to be controlled, the robot 100 receives the control signal. Then, the robot 100 recognizes the type of the device to be controlled from the received control signal, and stores the type of the device in a predetermined storage unit. As a result, the robot 100 learns the types of devices that the user operates on a daily basis.
  • After the robot 100 has learned the type of device, as shown in FIG. 2, the user 140 operates the device to be controlled by, for example, utterance 148A instead of via the remote controller 142. Specifically, as shown in FIG. 2, when the user utters the utterance 148A, the robot 100 acquires it. The robot 100 then identifies the type of device corresponding to the utterance 148A and outputs a control signal for that device type to the device to be controlled.
  • the user 140 does not operate the device via the remote controller 142, but operates the device to be controlled by, for example, the smartphone terminal 148B.
  • the robot 100 acquires the signal output from the smartphone terminal 148B via an external server. Then, the robot 100 specifies the type of the device according to the signal output from the smartphone terminal 148B, and outputs the control signal for the type of the device to the device to be controlled.
  • the user can easily operate the device to be controlled without operating the remote controller 142.
  • FIG. 3A is a front view of the robot 100, which is an example of the device control device.
  • FIG. 3B is a side view of the robot 100.
  • the robot 100 in this embodiment is an autonomous action type robot that determines an action based on an external environment and an internal state.
  • the external environment is detected by various sensors such as a camera, a thermo sensor, and a microphone.
  • the internal state is quantified as various parameters expressing the emotions of the robot 100. These will be described later.
  • The robot 100 includes two wheels 102, which are an example of a moving mechanism. The rotation speed and rotation direction of the two wheels 102 can be controlled individually. The robot 100 also has two hands 106. The hands 106 can perform simple operations such as being raised, shaken, and vibrated, and can also be controlled individually. Further, a camera is built into the eyes 110 of the robot 100, and the eyes 110 can also display an image by a liquid crystal element or an organic EL element. In addition to the camera built into the eyes 110, the robot 100 is equipped with various sensors such as a sound collecting microphone and an ultrasonic sensor. The robot 100 also has a built-in speaker and can emit simple sounds.
  • FIG. 4 is a configuration diagram of the robot system 300.
  • the robot system 300 includes a robot 100, a server 200, and a plurality of external sensors 114.
  • a plurality of external sensors 114 are installed in advance in the house.
  • the position coordinates of the external sensor 114 are registered in the server 200.
  • the position coordinates are defined as x, y coordinates in the house, which is assumed to be the action range of the robot 100. For example, when the robot 100 moves to P3, the position information of the robot 100 is acquired by communicating with the external sensor 114b.
  • FIG. 6 is a hardware configuration diagram of the robot 100.
  • the robot 100 includes a battery 116, a drive mechanism 118, a processor 120, a storage device 122, a communication device 124, a display device 125, a speaker 126, and an internal sensor 128.
  • Each unit is connected to each other by a power line 130 and a signal line 132.
  • the battery 116 supplies electric power to each unit via the power line 130.
  • Each unit sends and receives control signals via signal lines 132.
  • the battery 116 is a secondary battery such as a lithium ion secondary battery, and is a power source for the robot 100.
  • the communication device 124 is an example of a control signal receiving unit and a control signal transmitting unit.
  • the internal sensor 128 is an assembly of various sensors built in the robot 100. Specifically, it includes a camera, a sound collecting microphone, an infrared sensor, a thermo sensor, a touch sensor, an acceleration sensor, an odor sensor, and the like.
  • the communication device 124 is a communication module that performs wireless communication for various external devices such as a server 200, an external sensor 114, and a mobile device owned by a user.
  • the storage device 122 is composed of a non-volatile memory and a volatile memory, and stores computer programs and various setting information.
  • the processor 120 is a means for executing a computer program.
  • the drive mechanism 118 is an actuator that controls each mechanism such as the wheel 102 and the hand 106.
  • In addition, a display device and a speaker are installed.
  • the processor 120 selects the action of the robot 100 while communicating with the server 200 and the external sensor 114 via the communication device 124.
  • Various external information obtained by the internal sensor 128 also influences the action selection.
  • the drive mechanism 118 mainly controls the wheels 102 and the hands 106.
  • the drive mechanism 118 changes the movement direction and the movement speed of the robot 100 by changing the rotation speed and the rotation direction of each of the two wheels 102.
  • the drive mechanism 118 can also raise and lower the wheels 102. When the wheel 102 rises, the wheel 102 is completely retracted in the body 104, and the robot 100 comes into contact with the floor surface at the seating surface 108 to be in a seated state.
  • the hand 106 can be lifted by the drive mechanism 118 pulling the hand 106 via the wire 134. It is also possible to make a gesture like waving by vibrating the hand 106. Even more complicated gestures can be expressed by using a large number of wires 134.
  • FIG. 7 is a functional block diagram of the robot system 300.
  • the robot system 300 includes a robot 100, a server 200, and a plurality of external sensors 114.
  • Each component of the robot 100 and the server 200 is realized by hardware, including a computing unit such as a CPU (Central Processing Unit) and various coprocessors, a storage device such as memory and storage, and a wired or wireless communication line connecting them, and by software that is stored in the storage device and supplies processing instructions to the computing unit.
  • the computer program may be composed of a device driver, an operating system, various application programs located in the upper layers thereof, and a library that provides common functions to these programs.
  • Each block described below shows a block for each function, not a configuration for each hardware.
  • a part of the functions of the robot 100 may be realized by the server 200, and a part or all of the functions of the server 200 may be realized by the robot 100.
  • The processor 120 functionally includes a position information acquisition unit 150, which is an example of a position acquisition unit; a time information acquisition unit 152, which is an example of a time acquisition unit; a device type recognition unit 154; a registration unit 156; an instruction signal acquisition unit 158; a voice analysis unit 160; a control target device type recognition unit 162; a control signal recognition unit 164; and a signal transmission control unit 166.
  • The storage device 122 functionally includes a communication format storage unit 170; a device type storage unit 172, which is an example of the first storage unit; a control content storage unit 174, which is an example of the second storage unit; and a map information storage unit 176.
  • the communication format storage unit 170 stores information regarding the communication format of the control signal transmitted from the remote controller 142 operated by the user.
  • Information about the communication format is stored in the form of a table, for example, as shown in FIG.
  • the identification information of the communication format, the communication format, the format of the control signal of the communication format, and the information for identifying the data unit in the control signal are stored in association with each other.
  • Information on the type of device to be controlled is stored in the device type storage unit 172.
  • Information about the type of device is stored in the form of a table, for example, as shown in FIG. 9A.
  • In the table, the communication format No. indicating the identification information of the communication format, the type of the device, information about the manufacturer that manufactured the device, the time information indicating when the control signal was received, and the position information indicating where the control signal was received are stored in association with each other.
  • The device type includes a device type number representing the device's identification information, a first device type representing the home appliance type, and a second device type representing a device type more detailed than the first device type.
  • The device type storage unit 172 also stores information regarding the wording indicating the device type. As shown in FIG. 9B, for example, each wording indicating a device type and the first device type corresponding to that wording are stored in association with each other.
  • the control content storage unit 174 stores information on the control content for the device.
  • Information about the control content for the device is stored in the form of a table, for example, as shown in FIG. 10A.
  • the type of the device, the information representing the control content, and the control signal that realizes the control content are stored in association with each other.
  • control content storage unit 174 stores information regarding the wording indicating the control content of the device.
  • Information about the wording indicating the control content of the device is stored in the form of a table, for example, as shown in FIG. 10B.
  • the wording indicating the control content of the device and the control content represented by the wording are stored in association with each other.
  • The processor 120 acquires, for example, information on the control contents of each registered device type. Specifically, for each registered device type, the processor 120 acquires the information representing the control content and the control signal that realizes that control content from an external server via the communication device 124, and stores them in the control content storage unit 174.
  • the list of control signals related to the type of device and the control content stored in the control content storage unit 174 may be configured based on a database provided in advance by the manufacturer of the device or the like.
  • The control content storage unit 174 stores various control contents for each device type. For example, as shown in FIG. 10A, the control signal "10101 *" for the control content "ON" and the control signal "01010 *" for the control content "OFF" are stored for device type No. "100". Therefore, once the type of device is known, the control signal representing each control content is known, and the various control signals for controlling that device can be obtained from its type.
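The two-step lookup described above can be sketched as follows; the table contents mirror the examples in the text (FIG. 10A and FIG. 10B), while the dictionary keys and English wordings are assumptions for illustration.

```python
control_content_table = {        # FIG. 10A style: (device type No., content) -> control signal
    ("100", "ON"): "10101*",
    ("100", "OFF"): "01010*",
}
wording_table = {                # FIG. 10B style: spoken wording -> control content
    "turn on": "ON",
    "turn off": "OFF",
}

def target_control_signal(device_type_no, wording):
    """Resolve a spoken wording to the target control signal for a device type."""
    content = wording_table[wording]                         # wording -> control content
    return control_content_table[(device_type_no, content)]  # (type, content) -> signal
```

Because the second storage unit is keyed by device type, the same wording ("turn on") can yield different signals for different devices.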
  • the map information storage unit 176 stores map information representing an indoor map.
  • the map information is stored in the form of a table, for example, as shown in FIG.
  • a range on indoor position coordinates, a name of a room located in the range, and another name of the room are stored in association with each other.
  • a control signal for the device to be controlled is output from the remote controller 142.
  • the control signal is a signal indicating the content of control to the device.
  • the communication device 124 of the robot 100 receives the control signal output from the remote controller 142.
  • the processor 120 of the robot 100 reads the program from the storage device 122 and executes the learning processing routine shown in FIG.
  • In step S100, the device type recognition unit 154 of the processor 120 acquires the control signal received by the communication device 124.
  • In step S102, the time information acquisition unit 152 of the processor 120 acquires the time information at which the control signal was received by the communication device 124.
  • In step S104, the position information acquisition unit 150 of the processor 120 acquires the position information at which the control signal was received by the communication device 124.
  • In step S106, the device type recognition unit 154 of the processor 120 recognizes the type of device represented by the control signal, based on the control signal acquired in step S100.
  • When recognizing the type of device from the control signal, the device type recognition unit 154 first decodes the control signal. Next, the device type recognition unit 154 refers to the table stored in the communication format storage unit 170 to specify the communication format of the control signal. For example, when the header of the control signal is "111100", the device type recognition unit 154 refers to the table stored in the communication format storage unit 170 and identifies the communication format of the control signal as the "X method".
  • Next, the device type recognition unit 154 of the processor 120 refers to the table stored in the communication format storage unit 170 and extracts the data unit that follows the first 6 bits of the control signal. The device type recognition unit 154 then recognizes, from the data unit of the control signal, the first device type representing the home appliance type, the second device type representing a device type more detailed than the first device type, and information representing the manufacturer.
  • the second device type is a device type that is more detailed than the first device type, and is information for identifying a plurality of devices of the same type when there are a plurality of devices of the same type. For example, when there are two TVs indoors, the two TVs can be identified by the type of the second device.
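The recognition step above can be sketched as a simple parser. The 6-bit header and the split of the data unit into a 4-bit manufacturer field followed by device type bits are illustrative assumptions; the patent does not fix a bit layout.

```python
communication_formats = {"111100": "X method"}   # FIG. 8 style table (illustrative)

def recognize_device_type(control_signal):
    """Identify the communication format from the header, then read the data unit."""
    header, data_unit = control_signal[:6], control_signal[6:]
    fmt = communication_formats[header]          # specify the communication format
    # assumed layout of the data unit: 4 bits manufacturer, remaining bits device type
    manufacturer_bits, type_bits = data_unit[:4], data_unit[4:]
    return fmt, manufacturer_bits, type_bits
```

A signal whose header is not in the table would raise a `KeyError`, corresponding to an unknown remote-control protocol.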
  • In step S108, the registration unit 156 of the processor 120 registers the first device type, the second device type, and the manufacturer recognized in step S106 in the device type storage unit 172. Further, the registration unit 156 registers the time information acquired in step S102 and the position information acquired in step S104 in the device type storage unit 172.
  • In the device type storage unit 172, for example, the communication format No., the device type No., the first device type, the second device type, the manufacturer, the time information, and the position information are stored in association with each other.
  • the processor 120 of the robot 100 ends the learning processing routine.
  • the above processing routine is executed every time the communication device 124 receives the control signal from the remote controller 142. Therefore, the device type storage unit 172 of the robot 100 stores information regarding the type of device operated by the user by the remote controller 142. As a result, the robot 100 learns the types of devices that the user operates on a daily basis.
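The learning routine (steps S100 to S108) can be sketched end to end; the data-unit extraction is the simplified one from the parsing sketch above, and all names are assumptions.

```python
def learning_routine(control_signal, now, position, device_type_store):
    """One pass of the learning routine, run per received control signal."""
    # S100: acquire the received control signal (passed in here)
    # S102: acquire the time information (`now`)
    # S104: acquire the position information (`position`)
    # S106: recognize the device type from the data unit after the 6-bit header
    device_type = control_signal[6:]
    # S108: register the type together with time and position information
    device_type_store.setdefault(device_type, []).append((now, position))
    return device_type
```

Running this on every observed remote-control signal gradually builds the per-device reception history that the second embodiment later exploits.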
  • When device types are stored in the device type storage unit 172 and the robot 100 receives an instruction signal, which is a signal of a type different from the control signal output from the remote controller 142, the robot 100 outputs the control signal corresponding to the instruction signal to the device to be controlled.
  • the instruction signal is a signal of a type different from the control signal, and is a signal representing the control content to the device.
  • An example of the instruction signal is voice information emitted by the user.
  • a signal output from a smartphone terminal may be output to the robot 100 via an external server.
  • a signal is transmitted from the smartphone terminal to the robot 100 by a dedicated application.
  • the instruction signal is the voice information of the user.
  • the microphone included in the internal sensor 128 of the robot 100 detects the voice information emitted from the user.
  • the processor 120 of the robot 100 reads the program from the storage device 122 and executes the device control processing routine shown in FIG.
  • In step S200, the instruction signal acquisition unit 158 of the processor 120 acquires the instruction signal, which is the voice information acquired by the sound collecting microphone in the internal sensor 128.
  • In step S202, the voice analysis unit 160 of the processor 120 performs voice analysis of the voice information acquired by the instruction signal acquisition unit 158.
  • the voice analysis unit 160 converts voice information into character information by using a known voice analysis technique.
  • By the voice analysis of the voice analysis unit 160, the voice information is converted into character information; for example, the character information "Terebitsuke" ("turn on the TV") is obtained.
  • The voice analysis unit 160 then converts the character information "Terebitsuke" into the character information "Television on" by existing natural language processing technology.
  • In step S204, the voice analysis unit 160 of the processor 120 acquires, based on the result obtained in step S202, the wording indicating the control content and the wording indicating the device type included in the instruction signal.
  • For example, the voice analysis unit 160 acquires the wording "television" indicating the device type and the wording "turn on" indicating the control content from the character information "Television on".
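Step S204 can be sketched as matching the recognized text against the stored wording tables (in the style of FIG. 9B and FIG. 10B); the English wordings in the tables are illustrative assumptions.

```python
device_wordings = {"television": "Television", "tv": "Television"}   # FIG. 9B style
content_wordings = {"turn on": "ON", "turn off": "OFF"}              # FIG. 10B style

def parse_instruction(text):
    """Extract (first device type, control content) from recognized text."""
    text = text.lower()
    device = next(v for k, v in device_wordings.items() if k in text)
    content = next(v for k, v in content_wordings.items() if k in text)
    return device, content
```

Substring matching is deliberately crude; a real implementation would use the morphological analysis that the natural language processing step already performs.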
  • In step S206, the control target device type recognition unit 162 of the processor 120 recognizes the type of the device to be controlled represented by the instruction signal acquired in step S200, based on the wording indicating the device type obtained in step S204 and the device types stored in the device type storage unit 172.
  • For example, when the wording indicating the device type is "television", the first device type "Television" matching that wording is found by referring to the table of FIG. 9B stored in the device type storage unit 172, and is recognized as the type of the device to be controlled.
  • The control target device type recognition unit 162 then recognizes "TV-01", whose first device type is "Television", as the actual control target device.
  • In step S208, the control signal recognition unit 164 refers to the control content storage unit 174, in which the control signals corresponding to device types and control contents are stored, and recognizes the target control signal corresponding to the control target device type recognized in step S206 and the control content for that device included in the voice information.
  • Specifically, the control signal recognition unit 164 recognizes the target control content based on the information stored in the control content storage unit 174 and the wording indicating the control content obtained in step S204. For example, when the wording indicating the control content is "turn on", the control signal recognition unit 164 refers to the stored table and recognizes the control content "ON" corresponding to that wording as the target control content.
  • Then, the control signal recognition unit 164 recognizes the target control signal based on the device type recognized in step S206 and the target control content. For example, it recognizes the control signal "10101 *", which matches the device type "TV-01" recognized in step S206 and the target control content "ON", as the target control signal.
  • In step S210, the signal transmission control unit 166 controls the communication device 124 so as to transmit the target control signal recognized in step S208 to the device to be controlled recognized in step S206.
  • the signal transmission control unit 166 controls the communication device 124 so as to transmit the control signal “10101 *” meaning ON to the device “TV-01” to be controlled.
  • the processor 120 of the robot 100 ends the device control processing routine.
  • the above device control processing routine is executed every time the microphone of the robot 100 detects an instruction signal.
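The whole device control routine (steps S200 to S210) can be sketched end to end. Speech recognition is replaced by already-recognized text, and the tables hold only the illustrative examples from the text; all names are assumptions.

```python
device_type_store = {"Television": ["TV-01"]}                        # first storage unit
control_content_store = {("TV-01", "ON"): "10101*",                  # second storage unit
                         ("TV-01", "OFF"): "01010*"}
content_wordings = {"turn on": "ON", "turn off": "OFF"}

def device_control_routine(text, send):
    """One pass of the device control routine, run per instruction signal."""
    text = text.lower()
    # S200-S204: acquire the instruction and extract the two wordings
    first_type = "Television" if "television" in text else None
    wording = "turn on" if "turn on" in text else "turn off"
    # S206: recognize the control target device from the first device type
    device = device_type_store[first_type][0]
    # S208: recognize the target control signal from device type + control content
    signal = control_content_store[(device, content_wordings[wording])]
    # S210: cause the transmitting side to send the target control signal
    send(device, signal)
    return device, signal
```

The `send` callback stands in for the communication device 124, so the routine can be exercised without any infrared hardware.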
  • the robot 100 of the first embodiment receives the control signal transmitted from the remote controller 142. Then, the robot 100 recognizes the type of the device represented by the control signal based on the received control signal. Then, the robot 100 registers the recognized device type in the device type storage unit 172. The robot 100 acquires user voice information as an instruction signal, which is a signal of a type different from the control signal from the remote controller 142 and is a signal indicating the control content to the device. Then, the robot 100 recognizes the type of the device to be controlled represented by the voice information based on the voice information and the type of the device stored in the device type storage unit 172.
  • The robot 100 refers to the control content storage unit 174, in which the control signals corresponding to device types and control contents are stored, and recognizes the target control signal corresponding to the recognized control target device type and to the control content for the control target device included in the voice information. Then, the robot 100 transmits the target control signal to the device to be controlled. As a result, even when the device to be controlled has no function of transmitting and receiving information regarding its own remote control, it can be easily controlled.
  • the user can operate the target device without operating the remote control device 142 unique to the device.
  • the user can easily operate the device by voice.
  • The second embodiment differs from the first embodiment in that the device to be controlled is determined according to the time information or the position information at which the control signal from the remote controller 142 was received.
  • When the robot 100 determines a device to be controlled, there may be a plurality of candidates: for example, when there are two televisions. In this case, two televisions are registered in the device type storage unit 172; in the table of FIG. 9A, two televisions, "TV-01" and "TV-02", are registered as devices whose first device type is "Television".
  • When the robot 100 determines the device to be controlled, it is conceivable, for example, to choose the device with the smaller device type number. However, the device thus determined may not be the device the user intends to operate.
  • the robot 100 of the second embodiment determines the device to be controlled according to the time information when the control signal is received and the position information when the control signal is received.
  • The control target device type recognition unit 162 recognizes the type of the control target device represented by the instruction signal based on the device type included in the instruction signal, the time at which the instruction signal was received, and the device types and time information stored in the device type storage unit 172.
  • Similarly, the control target device type recognition unit 162 recognizes the type of the control target device represented by the instruction signal based on the device type included in the instruction signal, the position at which the instruction signal was received, and the device types and position information stored in the device type storage unit 172.
  • the device type storage unit 172 stores time information when the control signal is received and position information when the control signal is received. Each time a control signal is received, time information and position information are stored. Therefore, for each type of device, the number of times the control signal is received can be calculated for each time zone.
  • Similarly, the number of times the control signal was received at each position can be calculated: for example, 5 times in the range (X1, Y1) to (X2, Y2) and 0 times in the range (X3, Y3) to (X4, Y4).
  • the control target device type recognition unit 162 determines that the control target device is "Television", but there are two devices corresponding to "Television", "TV-01” and "TV-02". ..
  • the control target device type recognition unit 162 obtains the control signal of "TV-01” from the time information received in the past, the time zone in which the number of times the control signal is received is the highest, and "TV-02". Obtains the time zone in which the number of times the control signal is received is the highest, which is obtained from the time information received in the past.
• Suppose the time zone in which the control signal of "TV-01" is received most frequently is 8:00 to 12:00, and the time zone in which the control signal of "TV-02" is received most frequently is 12:00 to 16:00. In this case, because the control signal is received at 10:00, the control target device type recognition unit 162 recognizes "TV-01" as the device to be controlled.
• Similarly for position: suppose the control target device type recognition unit 162 determines that the device to be controlled is a "Television", but two devices, "TV-01" and "TV-02", correspond to "Television".
• In this case, the control target device type recognition unit 162 obtains, from the position information received in the past, the position range in which the control signal of "TV-01" was received most frequently, and likewise the position range in which the control signal of "TV-02" was received most frequently.
• Suppose the position range in which the control signal of "TV-01" is received most frequently is (X1, Y1) to (X3, Y3), and the position range in which the control signal of "TV-02" is received most frequently is (X4, Y4) to (X5, Y5).
• Because the control signal is received at the position (X2, Y2), which is within the range (X1, Y1) to (X3, Y3), the control target device type recognition unit 162 recognizes the device to be controlled as "TV-01".
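The time-zone and position-range disambiguation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the log structure, the 4-hour zone width, and the device names are all assumptions made for the example.

```python
from collections import defaultdict

# Hypothetical reception log kept by the device type storage unit 172:
# for each device, the hours and (x, y) positions at which its control
# signal was received. All names and values here are illustrative.
reception_log = {
    "TV-01": {"hours": [8, 9, 10, 11, 9], "positions": [(1, 1), (2, 2), (1, 2)]},
    "TV-02": {"hours": [12, 13, 15], "positions": [(4, 4), (5, 5)]},
}

def busiest_time_zone(hours, width=4):
    """Start hour of the `width`-hour zone with the most receptions."""
    counts = defaultdict(int)
    for h in hours:
        counts[(h // width) * width] += 1
    return max(counts, key=counts.get)

def pick_by_time(candidates, now_hour, width=4):
    """Pick the candidate whose busiest time zone contains the current hour."""
    for name in candidates:
        start = busiest_time_zone(reception_log[name]["hours"], width)
        if start <= now_hour < start + width:
            return name
    return None

def pick_by_position(candidates, pos):
    """Pick the candidate whose past reception bounding box contains `pos`."""
    for name in candidates:
        xs = [p[0] for p in reception_log[name]["positions"]]
        ys = [p[1] for p in reception_log[name]["positions"]]
        if min(xs) <= pos[0] <= max(xs) and min(ys) <= pos[1] <= max(ys):
            return name
    return None

# A signal arrives at 10:00 at position (2, 2); both devices match "Television".
print(pick_by_time(["TV-01", "TV-02"], now_hour=10))     # TV-01
print(pick_by_position(["TV-01", "TV-02"], pos=(2, 2)))  # TV-01
```

In a fuller version the two criteria would be combined or arbitrated; here they are shown separately because the embodiment describes them separately.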
• In this way, the robot 100 of the second embodiment recognizes the type of the device to be controlled represented by the instruction signal, based on the device type included in the instruction signal, the time at which the instruction signal is received, and the device types and time information stored in the device type storage unit 172. As a result, even when there are a plurality of candidates for the device to be controlled, the device intended by the user can be controlled appropriately according to the time zone in which the control signal from the remote controller 142 operated by the user was received.
• Likewise, the robot 100 of the second embodiment recognizes the type of the device to be controlled represented by the instruction signal, based on the device type included in the instruction signal, the position at which the instruction signal is received, and the device types and position information stored in the device type storage unit 172. As a result, even when there are a plurality of candidates for the device to be controlled, the device intended by the user can be controlled appropriately according to the position at which the control signal from the remote controller 142 operated by the user was received.
• The third embodiment differs from the first and second embodiments in that an evaluation value is assigned to each device.
• The processor 220 of the third embodiment functionally includes a position information acquisition unit 150, a time information acquisition unit 152, a device type recognition unit 154, a registration unit 156, an evaluation unit 257, an instruction signal acquisition unit 158, a voice analysis unit 160, a control target device type recognition unit 162, a control signal recognition unit 164, and a signal transmission control unit 166.
• The evaluation unit 257 determines an evaluation value for each device stored in the device type storage unit 172 and assigns the evaluation value to the device. When there are a plurality of candidates for the type of the device to be controlled, the control target device type recognition unit 162 of the third embodiment recognizes the type of the device to be controlled according to the evaluation values determined by the evaluation unit 257.
• The evaluation unit 257 determines the evaluation value for each device according to the user's behavior.
• Here, a case in which the transmission of a control signal from the remote controller 142 and the output of voice information serving as an instruction signal are regarded as the user's behavior will be described as an example.
• The evaluation unit 257 updates the evaluation value of each device stored in the device type storage unit 172 according to the time-series relationship between the transmission of a control signal to a target device by the communication device 124 and the subsequent reception of a control signal by the communication device 124.
• When the robot 100 transmits an erroneous control signal, the user may operate the remote controller 142 and retransmit the control signal to the intended device. For example, if the robot 100 transmits a control signal to turn on television B, but the user's intention was to turn on television A, the user is likely to operate the remote controller 142 and transmit a control signal to turn on television A. In this case, television B may not actually exist.
• Therefore, when the communication device 124 of the robot 100 transmits a control signal to the first device (television B) and then, within a predetermined time, receives a control signal addressed to the second device (television A), the evaluation unit 257 lowers the evaluation value of the first device (television B).
• Further, the evaluation unit 257 may raise the evaluation value of the second device (television A).
• However, it is also possible that the control signal transmitted by the robot 100 was not erroneous, and that the user simply wanted to turn off the second device (television A).
• Therefore, the evaluation unit 257 may raise the evaluation value of the second device (television A) only when the control content represented by the control signal to the first device (television B) and the control content represented by the control signal to the second device (television A) are the same.
• Further, when the evaluation unit 257 receives a control signal addressed to the first device (television B) within a predetermined time after transmitting a control signal to the first device (television B), and the control content (e.g., ON) represented by the transmitted control signal differs from the control content (e.g., OFF) represented by the received control signal, the evaluation unit 257 lowers the evaluation value of the first device (television B).
• The evaluation unit 257 may also update the evaluation value according to the content of voice information acquired as an instruction signal after the robot 100 has transmitted a control signal.
• For example, if, within a predetermined time after the robot 100 transmits a control signal indicating ON to television B, the robot 100 again recognizes an instruction signal with the same control content (ON), such as "Turn on the TV in the kitchen" (television A), it is highly likely that the robot 100 mistakenly transmitted the control signal to television B.
• Therefore, when the evaluation unit 257 acquires an instruction signal addressed to the second device (television A) within a predetermined time after transmitting a control signal to the first device (television B), it lowers the evaluation value of the first device (television B).
• Likewise, if the robot 100 recognizes an instruction signal with the opposite control content (OFF, e.g., turning off television B) within a predetermined time after transmitting a control signal indicating ON to television B, it is highly likely that the transmission of the control signal to television B was erroneous.
• Therefore, when the evaluation unit 257 acquires an instruction signal addressed to the first device (television B) within a predetermined time after transmitting a control signal to the first device (television B), and the control content (ON) represented by the control signal differs from the control content (OFF) represented by the instruction signal, the evaluation unit 257 lowers the evaluation value of the first device (television B).
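The correction heuristics above can be summarized in a small sketch. The event model, the score deltas, and the length of the time window are assumptions chosen for illustration, not values taken from the disclosure.

```python
# A minimal sketch of the evaluation unit 257's update rules described above.
PREDETERMINED_TIME = 60.0  # assumed window (seconds) for a follow-up to count

evaluation = {"TV-A": 0.0, "TV-B": 0.0}

def on_followup(sent_device, sent_content, followup_device, followup_content, elapsed):
    """Update evaluation values after the robot sent (sent_device, sent_content)
    and then observed a user follow-up (a remote signal or a voice instruction)."""
    if elapsed > PREDETERMINED_TIME:
        return  # too late to be interpreted as a correction
    if followup_device != sent_device:
        # The user immediately addressed another device: the robot likely chose wrongly.
        evaluation[sent_device] -= 1.0
        if followup_content == sent_content:
            # Same control content re-issued to another device: raise that device.
            evaluation[followup_device] += 1.0
    elif followup_content != sent_content:
        # Same device but opposite content (e.g. ON then OFF): likely erroneous.
        evaluation[sent_device] -= 1.0

# The robot turned ON TV-B; 5 s later the user turned ON TV-A instead.
on_followup("TV-B", "ON", "TV-A", "ON", elapsed=5.0)
print(evaluation)  # {'TV-A': 1.0, 'TV-B': -1.0}
```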
• The evaluation unit 257 also determines the evaluation value for each device according to the area corresponding to the position information acquired by the position information acquisition unit 150, and according to the time zone corresponding to the time information acquired by the time information acquisition unit 152. Thereby, as in the second embodiment, the control signal for each device can be transmitted based on an evaluation value that depends on the time and the position.
• Further, the evaluation unit 257 lowers the evaluation value of each device stored in the device type storage unit 172 for which a predetermined time has elapsed since the reception time of its control signal, as acquired by the time information acquisition unit 152. As a result, the longer a device goes unused, the lower its evaluation value becomes, making an unused device harder to select for control.
• As described above, when there are a plurality of candidates for the type of the device to be controlled, the control target device type recognition unit 162 of the third embodiment recognizes the type of the device to be controlled according to the evaluation values determined by the evaluation unit 257. Specifically, it recognizes the candidate with the highest evaluation value as the device to be controlled.
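The combination of idle-time decay and highest-evaluation selection can be sketched as follows; the grace period, penalty rate, and record layout are illustrative assumptions, not values from the disclosure.

```python
import time

# Hypothetical per-device records: evaluation value and last reception time.
devices = {
    "TV-01": {"eval": 2.0, "last_seen": time.time() - 3600},        # used an hour ago
    "TV-02": {"eval": 3.0, "last_seen": time.time() - 86400 * 30},  # unused for a month
}

def decayed_eval(rec, now, grace=86400 * 7, penalty_per_day=0.5):
    """Lower the evaluation of a device unused past a predetermined time
    (assumed grace period and penalty rate), as the evaluation unit 257 does."""
    idle = now - rec["last_seen"]
    if idle <= grace:
        return rec["eval"]
    return rec["eval"] - penalty_per_day * (idle - grace) / 86400

def pick_control_target(candidates, now=None):
    """Among candidate device types, pick the one with the highest evaluation."""
    now = now or time.time()
    return max(candidates, key=lambda name: decayed_eval(devices[name], now))

# TV-02 started higher, but a month of disuse has decayed it below TV-01.
print(pick_control_target(["TV-01", "TV-02"]))  # TV-01
```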
• The signal transmission control unit 166 of the third embodiment controls the communication device 124 so that the target control signal is transmitted to the control target device recognized by the control target device type recognition unit 162.
• When there are a plurality of candidates for the type of the device to be controlled, the robot 100 of the third embodiment recognizes the type of the device to be controlled according to the evaluation values determined by the evaluation unit 257. As a result, the device to be controlled can be selected appropriately according to the evaluation values, and the selected device can then be controlled.
• The fourth embodiment differs from the first to third embodiments in that a control signal is transmitted to the device to be controlled according to information detected by the internal sensor 128 and the external sensor 114.
• The processor 420 of the fourth embodiment functionally includes a position information acquisition unit 150, a time information acquisition unit 152, a device type recognition unit 154, a registration unit 156, a state acquisition unit 357, an instruction signal acquisition unit 158, a voice analysis unit 160, a control target device type recognition unit 162, a control signal recognition unit 164, and a signal transmission control unit 166.
• The state acquisition unit 357 acquires the state of the device to be controlled based on information detected by the internal sensor 128 or the external sensor 114.
• For example, the state acquisition unit 357 acquires whether or not the device to be controlled is operating, based on an image captured by the camera, which is an internal sensor 128.
• The signal transmission control unit 166 of the fourth embodiment selects the device to be controlled according to the state of the device acquired by the state acquisition unit 357 and the control content represented by the target control signal, and then causes the target control signal to be transmitted to the device corresponding to the selection result.
• For example, the signal transmission control unit 166 of the fourth embodiment selects television A, which is the device to be controlled, according to its state (ON or OFF) acquired by the state acquisition unit 357 and the control content (ON or OFF) represented by the target control signal, and then transmits an OFF control signal to television A.
• As a result, even when the device to be controlled is an alternate (toggle) type device, it can be controlled appropriately. With an alternate type device, if an OFF control signal is sent while the device is already in the OFF state, the device switches to the ON state, which may deviate from the user's intention. In the fourth embodiment, such control that deviates from the user's intention can be avoided.
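The state-aware guard for alternate (toggle) type devices can be sketched as a single decision function. The function name and the string state encoding are illustrative assumptions.

```python
# For an alternate (toggle) type device, a single signal flips the state, so an
# "OFF" command is only worth sending while the device is currently ON;
# otherwise the toggle would turn it back ON, against the user's intention.

def should_transmit(device_state: str, requested: str, toggle_type: bool) -> bool:
    """Decide whether sending `requested` ("ON"/"OFF") matches the user's intent,
    given the state acquired by the state acquisition unit 357."""
    if not toggle_type:
        return True  # discrete ON/OFF codes are safe to send regardless of state
    # Send the toggle only when it moves the device toward the requested state.
    return device_state != requested

print(should_transmit("OFF", "OFF", toggle_type=True))  # False: would turn it ON
print(should_transmit("ON", "OFF", toggle_type=True))   # True
```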
• Further, the state acquisition unit 357 recognizes the presence or absence of a person based on information obtained by the camera or the thermosensor, which are internal sensors 128. Then, the signal transmission control unit 166 transmits the target control signal according to the state acquisition unit 357's recognition of whether a person is present.
• For example, the signal transmission control unit 166 sets the communication device 124 so as to transmit an OFF control signal to predetermined devices such as a television, lighting, and an air conditioner.
• Whether such automatic control by the robot 100 is enabled may be determined, for example, by a setting made in an application on the smartphone terminal.
• The state acquisition unit 357 may further recognize the state of a person based on information obtained by the camera or the thermosensor, which are internal sensors 128.
• Then, the signal transmission control unit 166 causes the communication device 124 to transmit the target control signal according to the state acquisition unit 357's recognition of the person's state. For example, when a person's body temperature obtained by the thermosensor, which is an internal sensor 128, exceeds a predetermined temperature, a control signal for turning on the air conditioner is transmitted to the air conditioner.
• Further, the signal transmission control unit 166 may transmit an OFF control signal to a device in the ON state, based on whether each device is in the ON state as recognized by the state acquisition unit 357.
• Whether a device is in the ON state may be determined from an image or sound for the television, from an image for the lighting, and from an image or the thermosensor for the air conditioner.
• The signal transmission control unit 166 may also notify the user's smartphone terminal, via the application, of whether each device should be turned off. Further, the user may be able to remotely control the robot 100 from the smartphone terminal.
• The state acquisition unit 357 may determine whether or not the user is out based on information detected by the internal sensor 128 or the external sensor 114. For example, if no person is detected in a room designated in advance for a certain period of time or longer, based on the detection results of the camera and the microphone, a signal asking whether to turn OFF the devices in the ON state may be sent to the user's smartphone terminal.
• Further, the state acquisition unit 357 may determine whether or not the user is out via the application on the user's smartphone terminal. For example, the state acquisition unit 357 determines that the user is out when the GPS position of the smartphone terminal is a predetermined distance or more away from the house. Then, if all the target users (for example, all family members) are out, the state acquisition unit 357 transmits an OFF control signal to predetermined devices such as the television, lighting, and air conditioner.
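The "everyone is out" check above can be sketched briefly. The home coordinates, the distance threshold, and the flat-earth distance approximation are all assumptions made for illustration.

```python
import math

HOME = (35.6580, 139.7016)  # assumed home coordinates (lat, lon)
AWAY_DISTANCE_M = 500.0     # assumed "predetermined distance"

def distance_m(p, q):
    """Rough planar distance in metres between two (lat, lon) points."""
    dlat = (p[0] - q[0]) * 111_000
    dlon = (p[1] - q[1]) * 111_000 * math.cos(math.radians(p[0]))
    return math.hypot(dlat, dlon)

def everyone_out(phone_positions):
    """True when every target user's smartphone is far enough from home,
    mirroring the state acquisition unit 357's check described above."""
    return all(distance_m(pos, HOME) > AWAY_DISTANCE_M for pos in phone_positions)

# Two family members: one about 5 km away, one still at home -> not everyone is out.
print(everyone_out([(35.70, 139.70), HOME]))  # False
```

Only when this returns True would the OFF control signals be sent to the predetermined devices.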
• Further, the state acquisition unit 357 may acquire the state of an infant or pet in the house based on information detected by the internal sensor 128 or the external sensor 114, and the signal transmission control unit 166 may transmit control signals to various devices according to the acquired state. For example, a control signal may be transmitted to a target device so as to reproduce a voice.
• Alternatively, the state acquisition unit 357 may transmit a predetermined control signal to the air conditioner, or a control signal for, for example, lowering the volume of an audio device or of the television may be transmitted.
• Further, the signal transmission control unit 166 may output a control signal for stopping the operation of each device.
• Further, the state acquisition unit 357 may cause the signal transmission control unit 166 to transmit a predetermined control signal to the air conditioner in order to avoid a failure of the robot 100.
• Further, when the state acquisition unit 357 receives GPS information from the user's smartphone terminal via an external server and detects the user's return home, a control signal such as one turning on the lighting at a predetermined position in the house, or at the place toward which the user is heading, may be output to each device.
• As described above, the robot 100 of the fourth embodiment transmits the target control signal to the target device according to information detected by the sensors. As a result, the device to be controlled can be controlled appropriately.
• The fifth embodiment differs from the first to fourth embodiments in that the position of the target device is estimated by estimating the direction of the source of the control signal.
• The internal sensor 128 of the robot 100 of the fifth embodiment includes a direction estimation device 30, as shown in FIG.
• The direction estimation device 30 corresponds to the "direction estimation unit" of the disclosed technology.
• The direction estimation device 30 is provided, for example, at a corner portion of the robot 100.
• The direction estimation device 30 includes a plurality of infrared light emitting elements 11a to 11d and a plurality of light receiving elements 21a to 21d (corresponding to the "plurality of receiving elements" of the disclosed technology).
• When the infrared light emitting elements 11a to 11d are referred to collectively, or when no specific infrared light emitting element is meant, they are simply referred to as the "infrared light emitting element 11".
• The infrared light emitting element 11 also corresponds to the "control signal transmission unit" of the disclosed technology. Likewise, when the light receiving elements 21a to 21d are referred to collectively, or when no specific light receiving element is meant, they are simply referred to as the "light receiving element 21".
• The light receiving element 21 also corresponds to the "control signal receiving unit" of the disclosed technology.
• FIG. 17 is an overhead view showing the positional relationship between the user U, the robot 100, and the television 144, which is the target device.
• The light receiving element 21 of the direction estimation device 30 receives infrared data, which is the control signal output from the remote controller 142.
• In step S302, the processor 120 of the robot 100 acquires information on the amount of light received by the light receiving elements 21 of the direction estimation device 30. Specifically, the processor 120 acquires the amount of light received by each of the plurality of light receiving elements 21a to 21d.
• In step S304, the processor 120 of the robot 100 estimates the direction of the source of the infrared data based on the received-light-amount information acquired in step S302. For example, the processor 120 estimates the direction of the remote controller 142 shown in FIG. 17 as the direction of the source of the infrared data, based on the amount of light received by each light receiving element 21.
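One simple way to turn per-element received-light amounts into a source direction is a received-light-weighted vector sum over the known facing of each element. This is only a sketch under assumed element bearings and readings; the patent does not specify the estimation method.

```python
import math

# Assumed bearings (degrees) that the four light receiving elements face.
ELEMENT_BEARINGS = {"21a": 0.0, "21b": 90.0, "21c": 180.0, "21d": 270.0}

def estimate_source_bearing(light_amounts):
    """Bearing (degrees) of the infrared source, from a weighted vector sum
    of each element's facing, weighted by its received light amount."""
    x = sum(a * math.cos(math.radians(ELEMENT_BEARINGS[e]))
            for e, a in light_amounts.items())
    y = sum(a * math.sin(math.radians(ELEMENT_BEARINGS[e]))
            for e, a in light_amounts.items())
    return math.degrees(math.atan2(y, x)) % 360

# Strongest reading on the 90-degree element, symmetric weaker neighbours:
readings = {"21a": 0.2, "21b": 1.0, "21c": 0.2, "21d": 0.0}
bearing = estimate_source_bearing(readings)
print(round(bearing))             # 90
opposite = (bearing + 180) % 360  # region R, where the target device likely is
```

The `opposite` bearing corresponds to the region R of step S306, in the direction opposite to the source.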
• In step S306, based on the result of estimating the direction of the source of the infrared data in step S304, the processor 120 of the robot 100 estimates that the target device is present in the region R, which lies in the direction opposite to the direction of the source. For example, as shown in FIG. 17, the processor 120 estimates that the television 144 is present in the region R, opposite to the direction of the remote controller 142. The processor 120 of the robot 100 corresponds to the "existence position estimation unit" of the disclosed technology.
• In step S308, the processor 120 of the robot 100 drives the drive mechanism 118 and controls the camera among the internal sensors 128 so as to capture an image of the region R.
• In step S310, the processor 120 of the robot 100 applies known image processing to the image captured in step S308 to recognize the target device.
• For example, the processor 120 identifies the target device by comparing a preset feature amount of the target device with a feature amount extracted from the image captured by the camera.
• Alternatively, the processor 120 may identify the target device captured in the image by using a trained model that, when an image is input, outputs identification information of the target device captured in that image.
• In step S311, the processor 120 of the robot 100 estimates the position of the target device by using a known position estimation algorithm or the like, based on the image recognition result obtained in step S310. For example, the processor 120 estimates the three-dimensional position of the target device from the image recognition result.
• In step S312, the processor 120 of the robot 100 stores the position information of the target device obtained in step S311 in the device type storage unit 172.
• Note that the robot 100 may recognize the position of the device to which the control signal is transmitted based on at least one of the following types of information: map information representing an indoor map, information detected by the thermosensor among the internal sensors 128, and information detected by the microphone among the internal sensors 128.
• When position information of the devices has been accumulated in the device type storage unit 172, the robot 100 outputs the control signal corresponding to voice information toward the device to be controlled when that voice information, which serves as the instruction signal, is received. First, when the user utters voice information, the microphone included in the internal sensor 128 of the robot 100 detects it. When the microphone of the robot 100 detects the voice information, the processor 120 of the robot 100 reads the program from the storage device 122 and executes the device control processing routine shown in FIG.
• In step S410, the processor 120 of the robot 100 reads the position information associated with the device, based on the result of recognizing the device in step S206, and recognizes the position of the target device.
• In step S412, the processor 120 of the robot 100 controls the infrared light emitting element 11 of the direction estimation device 30 so as to output an infrared signal, which is the control signal, in the direction of the position of the target device recognized in step S410.
• At this time, the robot 100 recognizes its own position and controls the infrared light emitting element 11 of the direction estimation device 30 according to the relationship between its own position and the position of the target device.
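Computing the emission direction from the robot's own pose and the stored device position reduces to one angle calculation. The 2-D map frame, degree units, and function name are illustrative assumptions.

```python
import math

def emission_direction(robot_pos, robot_heading_deg, device_pos):
    """Angle (degrees, relative to the robot's heading) in which the infrared
    light emitting element 11 should output the control signal, given the
    robot's own position and the stored position of the target device."""
    dx = device_pos[0] - robot_pos[0]
    dy = device_pos[1] - robot_pos[1]
    world_bearing = math.degrees(math.atan2(dy, dx))
    # Convert the world-frame bearing into the robot's own frame.
    return (world_bearing - robot_heading_deg) % 360

# Robot at the origin facing +x; television stored at (0, 3) -> emit 90 deg to the left.
print(round(emission_direction((0, 0), 0.0, (0, 3)), 1))  # 90.0
```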
• As described above, the robot 100 of the fifth embodiment recognizes the position of the target device and outputs the control signal in that direction. As a result, the device intended by the user can be controlled appropriately.
• Further, when the instruction signal contains information indicating a location, the processor 120 of the robot 100 may output a control signal to the drive mechanism 118 so as to move to the position representing that location, and then transmit the control signal to the device to be controlled that exists at that position.
• For example, the control target device type recognition unit 162 refers to the table stored in the map information storage unit 176 to identify the position of the bedroom. The processor 120 of the robot 100 then outputs a signal to the wheels 102 to move to the position of the bedroom, moving the robot itself to the bedroom. Then, the control target device type recognition unit 162 transmits a control signal indicating ON to the control target lighting present in the bedroom.
• The robot 100 may also perform control based on information regarding the preference for each device. For example, as shown in FIG. 20, when the preference score given to the television 144 is higher than the preference score given to the air conditioner 146, settings may be made so that the robot 100 more readily approaches the vicinity of the television 144, which is the highly preferred device, turns toward the television 144, and learns the control signals for the television 144 earlier.
• Further, the robot 100 may produce predetermined sound information or a predetermined motion when transmitting a control signal to the target device. The robot 100 may also transmit a control signal to the target device according to the surrounding situation. For example, when the robot 100 detects the voices of surrounding people through the microphone at a frequency equal to or higher than a predetermined frequency, or detects from imaging that the faces of the surrounding people are turned toward each other, that is, when communication among the surrounding people becomes active, the robot 100 may transmit a control signal for, for example, lowering the volume of the television.
• Further, the robot 100 may perform controls such as lowering the volume of the television or controlling the lighting when the user uses the telephone function of the robot 100.
• Further, the robot 100 may output a control signal to the target device according to a schedule of times at which the target device is activated. For example, the robot 100 may automatically transmit the control signal to the target device according to the time information of the control signals transmitted from the remote controller.
• Further, the robot 100 may update the evaluation value according to the user's reaction after transmitting the control signal. For example, when vertical acceleration caused by being picked up by the user is detected after the control signal is transmitted, the transmission of the control signal to the device is considered to have been appropriate, and the evaluation value of the device may be raised. Likewise, when predetermined voice information (for example, "Thank you") is detected after the control signal is transmitted, the control signal is considered to have been transmitted to the appropriate device, and the evaluation value of the device may be raised. In the above description, the processor 120 of the robot 100 estimated, based on the result of estimating the direction of the source of the infrared data in step S304, that the target device is present in the region R in the direction opposite to the direction of the source. Alternatively or additionally, in view of the high directivity of infrared rays, the processor 120 of the robot 100 may presume that the target device exists in the direction of the source of the infrared data.
• Further, a part of the functions of the robot 100 may be realized by the server 200, and a part or all of the functions of the server 200 may be realized by the robot 100.
• Further, the device control device of the present disclosure has been described taking the case of a single robot 100 as an example; however, the present invention is not limited to this and may be composed of a plurality of robots.
• In this case, a plurality of robots may exchange information with each other, and the plurality of robots may perform the learning of device types and the control of the devices.
• Further, the robot 100 of the fifth embodiment has been described taking as an example the case where infrared rays serving as the control signal are output in the direction of the target device; however, the present invention is not limited to this. For example, infrared rays serving as the control signal may be output from all of the plurality of infrared light emitting elements 11a to 11d provided in the direction estimation device 30.
• Further, the case where the control signal is infrared rays has been described as an example; however, the present invention is not limited to this. For example, the control signal may be a radio frequency (RF) electromagnetic wave signal.
• "Computer-readable recording media" refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and to storage devices such as hard disks built into computer systems.
• The program of the embodiment may be stored in a storage medium and provided.

Abstract

A device type recognition unit in this device controller uses a received control signal as a basis to recognize a device type represented by the control signal. A registration unit registers the recognized device type in a first storage unit. An instruction signal acquisition unit acquires an instruction signal that is a different type of signal from the control signal and that represents control content for a device. A controlled device type recognition unit uses the instruction signal and the device type stored in the first storage unit as a basis to recognize the type of device to be controlled represented by the instruction signal. A control signal recognition unit references a second storage unit in which control signals corresponding to device types and control content are stored and recognizes a control signal for an object corresponding to the type of the recognized device to be controlled and the control content for the device to be controlled included in the instruction signal. A signal transmission control unit causes a control signal transmission unit to transmit the control signal for an object to the device to be controlled.

Description

Device control device, device control method, and device control program
The disclosed technology relates to a device control device, a device control method, and a device control program.
A technique for configuring electronic devices is known (for example, Japanese Patent Application Laid-Open No. 2015-76775). In this technique, when a smartphone, which is an example of an electronic device, enters an area where it can communicate with a television receiver, which is the device to be controlled, the smartphone performs short-range wireless communication with the television receiver. The smartphone then receives setting information transmitted from the television receiver. This setting information is information for remotely operating the television receiver. By acquiring this setting information, the smartphone can control the operation of the television receiver.
However, in this technique, the television receiver, which is the device to be controlled, must have a function of transmitting information for remotely operating itself to the smartphone.
If the device to be controlled does not have such a function, a function for transmitting and receiving information about remote operation must be set up on it. Consequently, when the device to be controlled lacks a function of transmitting and receiving information about its own remote operation, there is a problem in that the device cannot easily be controlled by another device.
In one aspect, the disclosed technology aims to provide a device controller, a device control method, and a device control program that can easily control a target device even when that device has no function for transmitting and receiving information about its own remote operation.
To achieve the above object, the device controller of the disclosed technology includes: a control signal receiving unit that receives a control signal representing control content for a device; a control signal transmitting unit that transmits a control signal representing control content for a device; a device type recognition unit that recognizes the type of device represented by a control signal received by the control signal receiving unit, based on that control signal; a registration unit that registers the device type recognized by the device type recognition unit in a first storage unit; an instruction signal acquisition unit that acquires an instruction signal, which is a signal of a different kind from the control signal and represents control content for a device; a control target device type recognition unit that recognizes the type of the target device represented by the instruction signal, based on the instruction signal acquired by the instruction signal acquisition unit and the device types stored in the first storage unit; a control signal recognition unit that, by referring to a second storage unit in which control signals corresponding to device types and control contents are stored, recognizes the target control signal corresponding to the target device type recognized by the control target device type recognition unit and to the control content for that device included in the instruction signal; and a signal transmission control unit that causes the control signal transmitting unit to transmit the target control signal to the target device.
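The relationships among the units listed above can be sketched as follows. This is a minimal Python sketch for orientation only, not the disclosed implementation: all class, method, and variable names are hypothetical, and a dictionary stands in for the infrared decoding and storage units.

```python
# Hypothetical structural sketch of the units named above.
class DeviceController:
    def __init__(self, second_storage, transmitter):
        self.first_storage = []               # first storage unit: learned device types
        self.second_storage = second_storage  # second storage unit: (type, content) -> signal
        self.transmitter = transmitter        # control signal transmitting unit (a callable)

    def on_control_signal(self, signal):
        # device type recognition unit + registration unit
        device_type = signal["device_type"]   # stands in for decoding the received signal
        if device_type not in self.first_storage:
            self.first_storage.append(device_type)

    def on_instruction_signal(self, instruction):
        # control target device type recognition unit: match against learned types
        target = next(t for t in self.first_storage
                      if t == instruction["device_type"])
        # control signal recognition unit: look up the target control signal
        target_signal = self.second_storage[(target, instruction["content"])]
        # signal transmission control unit: have the transmitter send it
        self.transmitter(target_signal)

sent = []
controller = DeviceController({("Television", "ON"): "10101*"}, sent.append)
controller.on_control_signal({"device_type": "Television"})      # learned from the remote
controller.on_instruction_signal({"device_type": "Television", "content": "ON"})
print(sent)  # -> ['10101*']
```

The sketch mirrors the two phases described later in the embodiment: learning a device type from a received control signal, then resolving an instruction signal into a stored control signal and transmitting it.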
In one aspect, the disclosed technology has the effect that the target device can easily be controlled even when the device to be controlled has no function for transmitting and receiving information about its own remote operation.
An explanatory diagram for describing the outline of the present embodiment.
An explanatory diagram for describing the outline of the present embodiment.
An external view of the robot 100 of the present embodiment.
An external view of the robot 100 of the present embodiment.
A schematic configuration diagram of the robot system 300 of the present embodiment.
An explanatory diagram for describing the installation positions of the external sensors in a house.
A hardware configuration diagram of the robot 100.
A functional block diagram of the robot 100 of the first embodiment.
A diagram showing an example of the table stored in the communication format storage unit 170.
A diagram showing an example of a table stored in the device type storage unit 172.
A diagram showing an example of a table stored in the device type storage unit 172.
A diagram showing an example of a table stored in the control content storage unit 174.
A diagram showing an example of a table stored in the control content storage unit 174.
A diagram showing an example of the table stored in the map information storage unit 176.
A diagram showing an example of the learning processing routine executed by the processor 120 of the robot 100 of the first embodiment.
A diagram showing an example of the device control processing routine executed by the processor 120 of the robot 100 of the first embodiment.
A functional block diagram of the robot 100 of the second embodiment.
A functional block diagram of the robot 100 of the third embodiment.
A diagram showing a configuration example of the direction estimation device 30.
An explanatory diagram for describing the process of storing the position of a target device.
A diagram showing an example of the position learning processing routine executed by the processor 120 of the robot 100 of the fifth embodiment.
A diagram showing an example of the device control processing routine executed by the processor 120 of the robot 100 of the fifth embodiment.
An explanatory diagram for describing the preferences of the robot 100.
Hereinafter, each embodiment will be described in detail with reference to the drawings.
FIG. 1 is an explanatory diagram for describing the outline of the present embodiment. As shown in FIG. 1, the present embodiment assumes a case where, for example, a user 140 operates a remote operation device 142 (commonly also called a remote controller), causing the remote operation device 142 to output a control signal to the device to be controlled. As shown in FIG. 1, the devices to be controlled are, for example, a television 144 and an air conditioner 146. The control signal is infrared data output from the remote operation device 142 and represents control content for the device.
In the present embodiment, when a control signal is output from the remote operation device 142 to the device to be controlled, the robot 100 receives that control signal. The robot 100 then recognizes the type of the target device from the received control signal and stores that device type in a predetermined storage unit. In this way, the robot 100 learns the types of devices the user operates on a daily basis.
Once the robot 100 has learned the device types, as shown in FIG. 2, the user 140 operates the target device not through the remote operation device 142 but, for example, through an utterance 148A. Specifically, as shown in FIG. 2, when the user produces the utterance 148A, the robot 100 captures it. The robot 100 then identifies the device type from the utterance 148A and outputs the control signal for that device type to the target device.
Alternatively, as shown in FIG. 2, the user 140 operates the target device through, for example, a smartphone terminal 148B rather than the remote operation device 142. When the smartphone terminal 148B outputs a signal, the robot 100 acquires that signal via an external server. The robot 100 then identifies the device type from the signal output by the smartphone terminal 148B and outputs the control signal for that device type to the target device.
Thus, by using the robot 100 of the present embodiment, the user can easily operate the target device without operating the remote operation device 142.
This is described in detail below.
[First Embodiment]
FIG. 3A is a front external view of the robot 100, which is an example of the device controller. FIG. 3B is a side external view of the robot 100. The robot 100 in the present embodiment is an autonomous behavior robot that decides its actions based on the external environment and its internal state. The external environment is detected by various sensors such as a camera, a thermo sensor, and a microphone. The internal state is quantified as various parameters expressing the emotions of the robot 100. These are described later.
The robot 100 includes two wheels 102, an example of a movement mechanism. The rotational speed and rotational direction of the two wheels 102 can be controlled individually. The robot 100 also has two hands 106, which are capable of simple movements such as raising, waving, and vibrating; the two hands 106 can likewise be controlled individually. A camera is built into the eyes 110 of the robot 100, and the eyes 110 can also display images using liquid crystal or organic EL elements. In addition to the camera in the eyes 110, the robot 100 carries various sensors such as a sound-collecting microphone and an ultrasonic sensor. It also has a built-in speaker and can emit simple sounds.
FIG. 4 is a configuration diagram of the robot system 300. The robot system 300 includes the robot 100, a server 200, and a plurality of external sensors 114.
In the present embodiment, as shown in FIG. 5, a plurality of external sensors 114 (external sensors 114a, 114b, 114c, ..., 114n) are installed in the house in advance. The position coordinates of the external sensors 114 are registered in the server 200. The position coordinates are defined as x, y coordinates within the house, which is assumed to be the action range of the robot 100. For example, when the robot 100 moves to P3, its position information is acquired through communication with the external sensor 114b.
FIG. 6 is a hardware configuration diagram of the robot 100. The robot 100 includes a battery 116, a drive mechanism 118, a processor 120, a storage device 122, a communication device 124, a display device 125, a speaker 126, and an internal sensor 128. The units are connected to one another by a power line 130 and a signal line 132. The battery 116 supplies power to each unit via the power line 130, and each unit sends and receives control signals via the signal line 132. The battery 116 is a secondary battery such as a lithium-ion secondary battery and is the power source of the robot 100. The communication device 124 is an example of the control signal receiving unit and the control signal transmitting unit.
The internal sensor 128 is a collection of various sensors built into the robot 100: specifically, a camera, a sound-collecting microphone, an infrared sensor, a thermo sensor, a touch sensor, an acceleration sensor, an odor sensor, and the like.
The communication device 124 is a communication module that performs wireless communication with various external devices, such as the server 200, the external sensors 114, and mobile devices owned by the user.
The storage device 122 consists of non-volatile memory and volatile memory and stores computer programs and various setting information.
The processor 120 is a means for executing computer programs. The drive mechanism 118 is an actuator that controls mechanisms such as the wheels 102 and the hands 106. In addition, a display, speakers, and the like are also mounted.
The processor 120 selects the actions of the robot 100 while communicating with the server 200 and the external sensors 114 via the communication device 124. Various external information obtained by the internal sensor 128 also influences action selection.
The drive mechanism 118 mainly controls the wheels 102 and the hands 106. By changing the rotational speed and rotational direction of each of the two wheels 102, the drive mechanism 118 changes the movement direction and movement speed of the robot 100. The drive mechanism 118 can also raise and lower the wheels 102. When the wheels 102 are raised, they are completely retracted into the body 104, and the robot 100 contacts the floor at its seating surface 108 and assumes a seated state.
The drive mechanism 118 can lift a hand 106 by pulling it via a wire 134. A waving gesture is also possible by vibrating the hand 106. Even more complex gestures can be expressed by using many wires 134.
FIG. 7 is a functional block diagram of the robot system 300. As described above, the robot system 300 includes the robot 100, the server 200, and the plurality of external sensors 114. Each component of the robot 100 and the server 200 is realized by hardware, including arithmetic units such as a CPU (Central Processing Unit) and various coprocessors, storage devices such as memory and storage, and the wired or wireless communication lines connecting them, together with software stored in the storage devices that supplies processing instructions to the arithmetic units.
The computer programs may consist of device drivers, an operating system, various application programs located in layers above them, and libraries that provide common functions to these programs. Each block described below represents a functional block, not a hardware unit. Some functions of the robot 100 may be realized by the server 200, and some or all functions of the server 200 may be realized by the robot 100.
As shown in FIG. 7, the processor 120 functionally includes a position information acquisition unit 150, which is an example of a position acquisition unit; a time information acquisition unit 152, which is an example of a time acquisition unit; a device type recognition unit 154; a registration unit 156; an instruction signal acquisition unit 158; a voice analysis unit 160; a control target device type recognition unit 162; a control signal recognition unit 164; and a signal transmission control unit 166. Also as shown in FIG. 7, the storage device 122 functionally includes a communication format storage unit 170; a device type storage unit 172, which is an example of the first storage unit; a control content storage unit 174, which is an example of the second storage unit; and a map information storage unit 176.
The communication format storage unit 170 stores information about the communication formats of control signals transmitted from the remote operation device 142 operated by the user. The information about communication formats is stored, for example, in the form of a table, as shown in FIG. 8. The table in FIG. 8 stores, in association with one another: identification information for each communication format, the communication format itself, the format of control signals in that communication format, and information for identifying the data portion within a control signal.
The device type storage unit 172 stores information about the types of devices to be controlled. The information about device types is stored, for example, in the form of a table, as shown in FIG. 9A. The table in FIG. 9A stores, in association with one another: a communication format No. representing the identification information of the communication format, the device type, information about the manufacturer of the device, time information representing when the control signal was received, and position information representing where the control signal was received. The device type comprises a device type No. representing the device's identification information, a first device type representing the home-appliance category, and a second device type representing the device type in more detail than the first device type.
In addition, as shown in FIG. 9B, the device type storage unit 172 stores information about wordings that indicate device types. As shown in FIG. 9B, each wording indicating a device type is stored in association with the first device type corresponding to that wording.
The control content storage unit 174 stores information about control contents for the devices. The information about control contents is stored, for example, in the form of a table, as shown in FIG. 10A. The table in FIG. 10A stores, in association with one another: the device type, information representing the control content, and the control signal that realizes that control content.
The control content storage unit 174 also stores information about wordings that indicate control contents, for example in the form of a table, as shown in FIG. 10B. The table in FIG. 10B stores each wording indicating a control content in association with the control content that the wording represents.
When a device type is registered in the device type storage unit 172, the processor 120 acquires, for example, information about the control contents for each registered device type. For example, via the communication device 124, the processor 120 acquires from an external server, for each registered device type, information representing the control contents and the control signals that realize those contents, and stores them in the control content storage unit 174. The list of control signals for device types and control contents stored in the control content storage unit 174 may also be built from a database provided in advance by device manufacturers or the like.
The control content storage unit 174 stores various control contents for each device type. For example, as shown in FIG. 10A, device type No. "100" is stored with the control signal "10101*" for the control content "ON" and the control signal "01010*" for the control content "OFF". Therefore, once the device type is known, the control signal representing each control content is known as well, and various control signals for controlling a device can be obtained from its type.
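As a concrete illustration of the lookup just described, the FIG. 10A entries for device type No. "100" can be modeled as follows. This is a minimal sketch: the helper function name and table container are hypothetical, while the signal strings are those given above.

```python
# A minimal sketch of the second storage unit (control content storage
# unit 174). Keys and signal strings follow the FIG. 10A example.
CONTROL_TABLE = {
    # (device type No., control content) -> control signal
    ("100", "ON"): "10101*",
    ("100", "OFF"): "01010*",
}

def lookup_control_signal(device_type_no, control_content):
    """Return the control signal for a device type and control content,
    or None if the combination has not been registered."""
    return CONTROL_TABLE.get((device_type_no, control_content))

print(lookup_control_signal("100", "ON"))   # -> 10101*
print(lookup_control_signal("100", "OFF"))  # -> 01010*
```

Returning None for an unregistered combination mirrors the fact that only learned device types have control signals available.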
The map information storage unit 176 stores map information representing an indoor map. The map information is stored, for example, in the form of a table, as shown in FIG. 11. The table in FIG. 11 stores, in association with one another: a range in indoor position coordinates, the name of the room located in that range, and alternative names for that room.
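A coordinate-range table of this kind supports a simple point-in-range lookup. The sketch below is illustrative only: the concrete ranges, room names, and aliases are hypothetical, since FIG. 11 defines the actual table.

```python
# A minimal sketch of the map information storage unit 176: coordinate
# ranges mapped to room names and aliases. All entries are hypothetical.
ROOM_TABLE = [
    # ((x_min, y_min, x_max, y_max), room name, aliases)
    ((0, 0, 5, 4), "living room", ["living", "lounge"]),
    ((5, 0, 8, 4), "kitchen", ["dining kitchen"]),
]

def room_at(x, y):
    """Return the name of the room containing position (x, y), or None."""
    for (x0, y0, x1, y1), name, _aliases in ROOM_TABLE:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

print(room_at(2, 3))  # -> living room
print(room_at(6, 1))  # -> kitchen
```

Such a lookup would let the robot 100 relate the position information recorded with each control signal to a room name.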
<Operation of the robot system 300>
Next, the operation of the robot system 300 will be described. First, when the user operates the remote operation device 142, a control signal for the target device is output from the remote operation device 142. The control signal is a signal representing control content for the device. At this time, the communication device 124 of the robot 100 receives the control signal output from the remote operation device 142. When the communication device 124 receives a control signal, the processor 120 of the robot 100 reads a program from the storage device 122 and executes the learning processing routine shown in FIG. 12.
<Learning processing routine>
First, in step S100, the device type recognition unit 154 of the processor 120 acquires the control signal received by the communication device 124.
Next, in step S102, the time information acquisition unit 152 of the processor 120 acquires time information for when the control signal was received by the communication device 124.
Next, in step S104, the position information acquisition unit 150 of the processor 120 acquires position information for when the control signal was received by the communication device 124.
In step S106, the device type recognition unit 154 of the processor 120 recognizes the type of device represented by the control signal, based on the control signal acquired in step S100.
To recognize the device type from the control signal, the device type recognition unit 154 first decodes the control signal, for example. The device type recognition unit 154 then refers to the table stored in the communication format storage unit 170 to identify the communication format of the control signal. For example, when the format of the control signal is "111100", the device type recognition unit 154 refers to the table stored in the communication format storage unit 170 and identifies the communication format of the control signal as the "X format".
The device type recognition unit 154 of the processor 120 then refers to the table stored in the communication format storage unit 170 and extracts the data portion of the control signal from the sixth bit onward. From the data portion of the control signal, the device type recognition unit 154 recognizes the first device type representing the home-appliance category, the second device type representing the device type in more detail than the first device type, and information representing the manufacturer.
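The format lookup and data-part extraction of step S106 can be sketched as follows. Only the header "111100", its mapping to the X format, and the 6-bit data offset come from the example above; the 4-bit field widths and field layout are illustrative assumptions, as are all names in the sketch.

```python
# A minimal sketch of the format lookup and data-part extraction
# performed by the device type recognition unit 154 in step S106.
FORMAT_TABLE = {
    # header bits -> (format name, start offset of the data part in bits)
    "111100": ("X", 6),
}

def decode_control_signal(bits):
    """Identify the communication format and split out the data part.
    The field widths below are assumed for illustration only."""
    for header, (fmt, offset) in FORMAT_TABLE.items():
        if bits.startswith(header):
            data = bits[offset:]
            return {
                "format": fmt,
                "first_type": data[:4],    # home-appliance category (assumed 4 bits)
                "second_type": data[4:8],  # detailed device type (assumed 4 bits)
                "maker": data[8:],         # manufacturer code (remainder)
            }
    return None  # unknown communication format

print(decode_control_signal("111100" + "0001" + "0010" + "0111"))
```

A signal whose header matches no registered format yields None, corresponding to a control signal in an unknown communication format.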
The second device type is a more detailed device type than the first device type and is information for distinguishing between multiple devices of the same kind. For example, if there are two televisions indoors, the two televisions can be distinguished by their second device types.
In step S108, the registration unit 156 of the processor 120 registers the first device type, the second device type, and the manufacturer recognized in step S106 in the device type storage unit 172. The registration unit 156 of the processor 120 also registers the combination of the time information acquired in step S102 and the position information acquired in step S104 in the device type storage unit 172.
For example, as in the table shown in FIG. 9A, the communication format No., the device type (e.g., the device type No., the first device type, and the second device type), the manufacturer, the time information, and the position information are stored in association with one another.
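For illustration, one registered row of the FIG. 9A table could be represented as follows. The column names follow the description above; every field value is a hypothetical example, not data from the disclosure.

```python
# A hypothetical record registered in the device type storage unit 172
# at step S108, following the FIG. 9A columns. All values are examples.
record = {
    "format_no": 1,               # communication format No.
    "device_type_no": "100",      # device type No.
    "first_type": "Television",   # home-appliance category
    "second_type": "TV-01",       # distinguishes same-kind devices
    "maker": "MakerA",            # manufacturer information
    "time": "2019-08-09T20:15",   # when the control signal was received
    "position": (2.0, 3.5),       # x, y where it was received
}
print(record["first_type"], record["second_type"])
```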
The processor 120 of the robot 100 then ends the learning processing routine.
The above processing routine is executed each time the communication device 124 receives a control signal from the remote operation device 142. The device type storage unit 172 of the robot 100 therefore accumulates information about the types of devices the user has operated with the remote operation device 142, and the robot 100 thereby learns the types of devices the user operates on a daily basis.
Once device types are stored in the device type storage unit 172, when the robot 100 receives an instruction signal, which is a signal of a different kind from the control signal output by the remote operation device 142, it outputs a control signal corresponding to that instruction signal to the target device.
An instruction signal is a signal of a different kind from the control signal and represents control content for a device. One example of an instruction signal is voice information uttered by the user. Another example is a signal output by a terminal other than the remote operation device 142 and received by the robot 100 via an external server; for instance, a signal output from a smartphone terminal may reach the robot 100 via an external server. In that case, the signal is conveyed from the smartphone terminal to the robot 100 by a dedicated application.
In the present embodiment, the case where the instruction signal is the user's voice information is described as an example.
First, when the user utters voice information, the microphone included in the internal sensor 128 of the robot 100 detects the voice information uttered by the user. When the microphone of the robot 100 detects voice information, the processor 120 of the robot 100 reads a program from the storage device 122 and executes the device control processing routine shown in FIG. 13.
<Device control processing routine>
In step S200, the instruction signal acquisition unit 158 of the processor 120 acquires the instruction signal, namely the voice information captured by the sound-collecting microphone of the internal sensor 128.
 ステップS202において、プロセッサ120の音声解析部160は、指示信号取得部158により取得された音声情報を音声解析する。例えば、音声解析部160は、既知の音声解析技術を用いて、音声情報を文字情報へ変換する。音声解析部160による音声解析により、音声情報から文字情報への変換が行われ、例えば、文字情報「てれびつけて」が得られる。更に、音声解析部160は、既存の自然言語処理技術により文字情報「てれびつけて」を文字情報「テレビつけて」へ変換する。 In step S202, the voice analysis unit 160 of the processor 120 performs voice analysis of the voice information acquired by the instruction signal acquisition unit 158. For example, the voice analysis unit 160 converts voice information into character information by using a known voice analysis technique. By voice analysis by the voice analysis unit 160, the voice information is converted into the character information, and for example, the character information "Terebitsuke" is obtained. Further, the voice analysis unit 160 converts the character information "Telebitsuke" into the character information "Television" by the existing natural language processing technology.
 In step S204, the voice analysis unit 160 of the processor 120 extracts, based on the result obtained in step S202, the wording indicating the control content and the wording indicating the device type contained in the instruction signal. For example, from the character information 「テレビつけて」, the voice analysis unit 160 extracts the wording 「テレビ」 ("TV") indicating the device type and the wording 「つけて」 ("turn on") indicating the control content.
 In step S206, the control target device type recognition unit 162 of the processor 120 recognizes the type of the device to be controlled represented by the instruction signal acquired in step S200, based on the wording indicating the device type obtained in step S204 and the device types stored in the device type storage unit 172. For example, when the wording indicating the device type is 「テレビ」 ("TV"), the unit refers to the table of FIG. 9B stored in the device type storage unit 172 and recognizes the first device type "Television", which matches that wording, as the type of the device to be controlled. The control target device type recognition unit 162 then recognizes "TV-01", whose first device type is "Television", as the actual device to be controlled.
 In step S208, the control signal recognition unit 164 refers to the control content storage unit 174, in which control signals corresponding to device types and control contents are stored, and recognizes the target control signal corresponding to the type of the device to be controlled recognized in step S206 and the control content for that device contained in the voice information.
 Specifically, the control signal recognition unit 164 recognizes the target control signal based on the information stored in the control content storage unit 174 and the wording indicating the control content obtained in step S204. For example, when the wording indicating the control content is 「つけて」 ("turn on"), the unit refers to the table stored in the control content storage unit 174 and recognizes the control content "ON", which corresponds to that wording, as the target control content.
 The control signal recognition unit 164 then recognizes the target control signal based on the device type recognized in step S206 and the target control content. For example, the control signal recognition unit 164 recognizes the control signal "10101*", which matches the device type "TV-01" recognized in step S206 and the target control content "ON", as the target control signal.
 In step S210, the signal transmission control unit 166 controls the communication device 124 so as to transmit the target control signal recognized in step S208 to the device to be controlled recognized in step S206. For example, the signal transmission control unit 166 controls the communication device 124 so as to transmit the control signal "10101*", meaning ON, to the device "TV-01" to be controlled. The processor 120 of the robot 100 then ends the device control processing routine.
 The above device control processing routine is executed each time the microphone of the robot 100 detects an instruction signal.
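The flow of steps S200 to S210 can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: the table contents, helper names, and the simple prefix match used for the utterance are assumptions, and the actual speech recognition and infrared transmission are outside its scope.

```python
# Illustrative sketch of the device control processing routine (S200-S210).
# Table contents and names are assumptions for illustration only.

# FIG. 9B-style mapping: wording in the utterance -> first device type
DEVICE_TYPE_TABLE = {"テレビ": "Television", "エアコン": "AirConditioner"}

# FIG. 9A-style registry: first device type -> registered device IDs
DEVICE_REGISTRY = {"Television": ["TV-01"], "AirConditioner": ["AC-01"]}

# control content storage unit 174-style table: (device ID, content) -> signal
CONTROL_SIGNAL_TABLE = {("TV-01", "ON"): "10101*", ("TV-01", "OFF"): "10100*"}

# wording for the control content -> normalized control content
CONTROL_CONTENT_TABLE = {"つけて": "ON", "けして": "OFF"}

def device_control_routine(utterance: str):
    """Return (device ID, control signal) for a recognized utterance."""
    # S202/S204: split the utterance into device wording and content wording
    for device_word, device_type in DEVICE_TYPE_TABLE.items():
        if utterance.startswith(device_word):
            content_word = utterance[len(device_word):]
            break
    else:
        raise ValueError("no registered device type in utterance")
    # S206: pick the registered device of that type (first candidate here)
    device_id = DEVICE_REGISTRY[device_type][0]
    # S208: look up the target control signal for (device, control content)
    content = CONTROL_CONTENT_TABLE[content_word]
    signal = CONTROL_SIGNAL_TABLE[(device_id, content)]
    # S210: the communication device 124 would transmit `signal` here
    return device_id, signal
```

For the utterance 「テレビつけて」, this sketch resolves the device "TV-01" and the signal "10101*", mirroring the worked example above.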
 As described above, the robot 100 of the first embodiment receives the control signal transmitted from the remote controller 142, recognizes from the received control signal the type of device that the control signal represents, and registers the recognized device type in the device type storage unit 172. The robot 100 then acquires the user's voice information as an instruction signal, that is, a signal of a type different from the control signal from the remote controller 142 that represents the control content for a device. Based on the voice information and the device types stored in the device type storage unit 172, the robot 100 recognizes the type of the device to be controlled represented by the voice information. Next, referring to the control content storage unit 174, in which control signals corresponding to device types and control contents are stored, the robot 100 recognizes the target control signal corresponding to the recognized type of the device to be controlled and to the control content for that device contained in the voice information, and transmits the target control signal to the device to be controlled. As a result, the device to be controlled can be controlled easily even when it has no function for transmitting and receiving information about its own remote operation.
 Furthermore, the user can operate the target device without operating the remote controller 142 specific to that device. In particular, the user can operate the device simply by voice.
[Second Embodiment]
 Next, the second embodiment will be described. Since the robot system according to the second embodiment has the same configuration as the first embodiment, the same reference numerals are used and the description is omitted.
 The second embodiment differs from the first embodiment in that the device to be controlled is determined according to the time information at which a control signal from the remote controller 142 was received or the position information at which that control signal was received.
 When the robot 100 determines the device to be controlled, there may be a plurality of candidate devices, for example when two televisions exist. In this case, two televisions are registered in the device type storage unit 172. For example, in the table of FIG. 9A in the device type storage unit 172, two televisions, "TV-01" and "TV-02", are registered as devices whose first device type is "Television". When determining the device to be controlled, the robot 100 could, for example, choose the device with the smaller device type number. However, the device determined in that way may not be the device the user intends to operate.
 Therefore, the robot 100 of the second embodiment determines the device to be controlled according to the time information at which the control signal was received and the position information at which the control signal was received.
 Specifically, the control target device type recognition unit 162 recognizes the type of the device to be controlled represented by the instruction signal, based on the device type contained in the instruction signal and the time at which the instruction signal was received, together with the device types and time information stored in the device type storage unit 172.
 The control target device type recognition unit 162 also recognizes the type of the device to be controlled represented by the instruction signal, based on the device type contained in the instruction signal and the position at which the instruction signal was received, together with the device types and position information stored in the device type storage unit 172.
 As shown in FIG. 9A, the device type storage unit 172 stores the time information and the position information at which each control signal was received. Time information and position information are stored every time a control signal is received. Therefore, for each device type, the number of times a control signal was received can be computed for each time zone.
 For example, the number of control signals received in each time zone is computed as, say, five times from 8:00 to 12:00, once from 12:00 to 16:00, zero times from 16:00 to 19:00, and once from 19:00 to 24:00.
 Likewise, the number of control signals received at each position is computed as, say, five times in the range (X1, Y1) to (X2, Y2) and zero times in the range (X3, Y3) to (X4, Y4).
 For example, consider the case where a control signal is received at 10:00 and the control target device type recognition unit 162 has determined that the device to be controlled is a "Television", but two devices, "TV-01" and "TV-02", correspond to "Television". In this case, the control target device type recognition unit 162 obtains, from the time information of past receptions, the time zone in which control signals for "TV-01" were received most often and the time zone in which control signals for "TV-02" were received most often. Suppose the time zone with the most receptions is "8:00-12:00" for "TV-01" and "12:00-16:00" for "TV-02". Since the control signal was received at 10:00, the control target device type recognition unit 162 recognizes the device to be controlled as "TV-01".
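The time-based disambiguation above can be sketched as follows. The time-band edges and the reception logs are illustrative assumptions; only the rule (choose the candidate whose busiest reception band contains the new reception time) follows the text.

```python
# Hedged sketch of time-based disambiguation among candidate devices.
from bisect import bisect_right
from collections import Counter

TIME_BANDS = [8, 12, 16, 19, 24]  # assumed band edges: 8-12, 12-16, 16-19, 19-24

def band_of(hour: float) -> int:
    """Index of the time band containing `hour` (hours in [8, 24))."""
    return bisect_right(TIME_BANDS, hour) - 1

def peak_band(hours) -> int:
    """Band in which the most past control signals were received."""
    return Counter(band_of(h) for h in hours).most_common(1)[0][0]

def pick_by_time(candidates: dict, hour: float):
    """candidates maps device ID -> hours of past control-signal receptions;
    returns the candidate whose peak band contains `hour`, or None."""
    current = band_of(hour)
    for device, hours in candidates.items():
        if peak_band(hours) == current:
            return device
    return None
```

With "TV-01" received mostly between 8:00 and 12:00 and "TV-02" mostly between 12:00 and 16:00, a reception at 10:00 selects "TV-01", as in the example above.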
 Also consider, for example, the case where the position at which the control signal was received is (X2, Y2). Again, suppose the control target device type recognition unit 162 has determined that the device to be controlled is a "Television", but two devices, "TV-01" and "TV-02", correspond to "Television". In this case, the control target device type recognition unit 162 obtains, from the position information of past receptions, the position range in which control signals for "TV-01" were received most often and the position range in which control signals for "TV-02" were received most often. Suppose the range with the most receptions is (X1, Y1) to (X3, Y3) for "TV-01" and (X4, Y4) to (X5, Y5) for "TV-02". Since the control signal was received at (X2, Y2), which lies within the range (X1, Y1) to (X3, Y3), the control target device type recognition unit 162 recognizes the device to be controlled as "TV-01".
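The position-based variant can be sketched in the same spirit. Axis-aligned rectangular ranges and the concrete coordinates are illustrative assumptions; only the containment test follows the text.

```python
# Hedged sketch of position-based disambiguation among candidate devices.

def contains(rect, p) -> bool:
    """rect is ((x1, y1), (x2, y2)) with x1 <= x2 and y1 <= y2."""
    (x1, y1), (x2, y2) = rect
    return x1 <= p[0] <= x2 and y1 <= p[1] <= y2

def pick_by_position(peak_ranges: dict, pos):
    """peak_ranges maps each candidate device ID to the position range in
    which its control signals were received most often; returns the
    candidate whose range contains `pos`, or None."""
    for device, rect in peak_ranges.items():
        if contains(rect, pos):
            return device
    return None
```

With "TV-01" peaking in a range that contains the reception position and "TV-02" peaking elsewhere, "TV-01" is selected, as in the example above.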
 Other configurations and operations of the robot system according to the second embodiment are the same as in the first embodiment, so their description is omitted.
 As described above, the robot 100 of the second embodiment recognizes the type of the device to be controlled represented by the instruction signal, based on the device type contained in the instruction signal and the time at which the instruction signal was received, together with the device types and time information stored in the device type storage unit 172. As a result, even when there are multiple candidates for the device to be controlled, the device the user intends can be controlled appropriately according to the time zone in which control signals from the remote controller 142 operated by the user were received.
 The robot 100 of the second embodiment also recognizes the type of the device to be controlled represented by the instruction signal, based on the device type contained in the instruction signal and the position at which the instruction signal was received, together with the device types and position information stored in the device type storage unit 172. As a result, even when there are multiple candidates for the device to be controlled, the device the user intends can be controlled appropriately according to the position at which control signals from the remote controller 142 operated by the user were received.
[Third Embodiment]
 Next, the third embodiment will be described. The third embodiment differs from the first and second embodiments in that an evaluation value is assigned to each device.
 As shown in FIG. 14, the processor 220 of the third embodiment functionally includes a position information acquisition unit 150, a time information acquisition unit 152, a device type recognition unit 154, a registration unit 156, an evaluation unit 257, an instruction signal acquisition unit 158, a voice analysis unit 160, a control target device type recognition unit 162, a control signal recognition unit 164, and a signal transmission control unit 166.
 The evaluation unit 257 determines an evaluation value for each device stored in the device type storage unit 172 and assigns that evaluation value to the device. When there are multiple candidates for the type of the device to be controlled, the control target device type recognition unit 162 of the third embodiment recognizes the type of the device to be controlled according to the evaluation values determined by the evaluation unit 257.
 The evaluation unit 257 determines the evaluation value for each device according to the user's behavior. In the following, the transmission of a control signal by the remote controller 142 and the utterance of voice information as an instruction signal are treated as the user's behavior.
 For example, the evaluation unit 257 updates the evaluation value of each device stored in the device type storage unit 172 according to the chronological relationship between the transmission of a control signal to a target device by the communication device 124 and the reception of a control signal by the communication device 124.
 For example, if the robot 100 transmitted the control signal to the wrong device, the user will likely operate the remote controller 142 and transmit the control signal to the intended device again. For instance, if the robot 100 transmitted a control signal to turn on television B although the user intended to turn on television A, the user will likely operate the remote controller 142 and transmit a control signal to turn on television A. In this case, television B may not actually exist.
 Therefore, when the communication device 124 of the robot 100 receives a control signal addressed to a second device (television A) within a predetermined time after the communication device 124 transmitted a control signal to a first device (television B), the evaluation unit 257 lowers the evaluation value of the first device (television B).
 In this case, the evaluation unit 257 also raises the evaluation value of the second device (television A).
 Note that, for example, when an OFF control signal addressed to the second device (television A) is received after an ON control signal was transmitted to the first device (television B), the user may simply want to turn off the second device (television A), rather than correcting an erroneous transmission by the robot 100.
 For this reason, the evaluation unit 257 may raise the evaluation value of the second device (television A) only when the control content represented by the control signal to the first device (television B) and the control content represented by the control signal to the second device (television A) are the same.
 Similarly, when the user intended to turn on television A but the robot 100 transmitted a control signal to turn on television B, the user will likely operate the remote controller 142 and transmit a control signal to turn television B off.
 Therefore, when a control signal addressed to the first device (television B) is received within a predetermined time after a control signal was transmitted to the first device (television B), and the control content represented by the transmitted control signal (ON) differs from the control content represented by the received control signal (OFF), the evaluation unit 257 lowers the evaluation value of the first device (television B).
 The evaluation unit 257 may also update the evaluation value according to the content of voice information acquired as an instruction signal after the robot 100 transmitted a control signal.
 For example, if the robot 100 recognizes an instruction signal with the same control content (ON) again ("turn on the kitchen TV (television A)") within a predetermined time after transmitting a control signal representing ON to television B, the transmission of the control signal to television B by the robot 100 was likely erroneous.
 Therefore, when an instruction signal addressed to the second device (television A) is acquired within a predetermined time after a control signal was transmitted to the first device (television B), the evaluation unit 257 lowers the evaluation value of the first device (television B).
 Likewise, if the robot 100 recognizes an instruction signal with the opposite control content (OFF) ("turn television B off") within a predetermined time after transmitting a control signal representing ON to television B, the transmission of the control signal to television B was likely erroneous.
 Therefore, when an instruction signal addressed to the first device (television B) is acquired within a predetermined time after a control signal was transmitted to the first device (television B), and the control content represented by the control signal (ON) differs from the control content represented by the instruction signal (OFF), the evaluation unit 257 lowers the evaluation value of the first device (television B).
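The corrective-feedback rules above can be sketched as follows. The data model (a plain dict of evaluation values, ±1 steps, a 30-second window) is an illustrative assumption; only the update conditions follow the text.

```python
# Hedged sketch of evaluation-value updates from user corrections.
import time

PREDETERMINED_TIME = 30.0  # seconds; assumed correction window

class Evaluator:
    def __init__(self):
        self.values = {}       # device ID -> evaluation value
        self.last_sent = None  # (device ID, control content, timestamp)

    def on_signal_sent(self, device: str, content: str) -> None:
        """Record that the robot transmitted `content` to `device`."""
        self.values.setdefault(device, 0)
        self.last_sent = (device, content, time.monotonic())

    def on_user_correction(self, device: str, content: str) -> None:
        """Called when a remote-controller signal or instruction signal
        for `device` with `content` is observed after the robot's own
        transmission."""
        if self.last_sent is None:
            return
        sent_device, sent_content, t = self.last_sent
        if time.monotonic() - t > PREDETERMINED_TIME:
            return
        if device != sent_device and content == sent_content:
            # same operation redirected to another device: robot likely
            # picked the wrong device
            self.values[sent_device] = self.values.get(sent_device, 0) - 1
            self.values[device] = self.values.get(device, 0) + 1
        elif device == sent_device and content != sent_content:
            # robot's operation immediately undone on the same device
            self.values[sent_device] = self.values.get(sent_device, 0) - 1
```

In the television example, sending ON to television B and then observing ON for television A lowers B's value and raises A's; observing OFF for television B itself only lowers B's value.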
 The evaluation unit 257 also determines the evaluation value of each device according to the area corresponding to the position information acquired by the position information acquisition unit 150, and according to the time zone corresponding to the time information acquired by the time information acquisition unit 152. As in the second embodiment, this allows control signals to be transmitted to each device based on evaluation values that depend on time and position.
 Further, the evaluation unit 257 lowers the evaluation value of any device stored in the device type storage unit 172 for which a predetermined time has elapsed since the time, acquired by the time information acquisition unit 152, at which a control signal for it was last received. As a result, the longer a device has gone unused, the lower its evaluation value becomes, making it less likely that an unused device will be selected for control.
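The staleness rule above can be sketched as follows. The threshold (one week) and the decrement step are illustrative assumptions; only the rule (lower the value of devices whose control signal has not been received for a predetermined time) follows the text.

```python
# Hedged sketch of lowering the evaluation value of long-unused devices.

STALE_AFTER = 7 * 24 * 3600.0  # seconds; assumed "predetermined time"

def decay_stale(values: dict, last_received: dict, now: float, step: int = 1) -> dict:
    """values: device ID -> evaluation value;
    last_received: device ID -> time the last control signal was received.
    Lowers the value of every device unused for longer than STALE_AFTER."""
    for device, t in last_received.items():
        if now - t > STALE_AFTER:
            values[device] = values.get(device, 0) - step
    return values
```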
 When there are multiple candidates for the type of the device to be controlled, the control target device type recognition unit 162 of the third embodiment recognizes the type of the device to be controlled according to the evaluation values determined by the evaluation unit 257. Specifically, when there are multiple candidates, the control target device type recognition unit 162 recognizes the device with the highest evaluation value as the device to be controlled.
 The signal transmission control unit 166 of the third embodiment then controls the communication device 124 so as to transmit the target control signal to the device to be controlled recognized by the control target device type recognition unit 162.
 As described above, when there are multiple candidates for the type of the device to be controlled, the robot 100 of the third embodiment recognizes the type of the device to be controlled according to the evaluation values determined by the evaluation unit 257. As a result, even when multiple candidates exist, the device to be controlled can be selected appropriately according to the evaluation values, and the selected device can be controlled.
[Fourth Embodiment]
 Next, the fourth embodiment will be described. The fourth embodiment differs from the first to third embodiments in that a control signal is transmitted to the device to be controlled according to information detected by the internal sensor 128 and the external sensor 114.
 As shown in FIG. 15, the processor 420 of the fourth embodiment functionally includes a position information acquisition unit 150, a time information acquisition unit 152, a device type recognition unit 154, a registration unit 156, a state acquisition unit 357, an instruction signal acquisition unit 158, a voice analysis unit 160, a control target device type recognition unit 162, a control signal recognition unit 164, and a signal transmission control unit 166.
 The state acquisition unit 357 acquires the state of the device to be controlled based on information detected by the internal sensor 128 or the external sensor 114.
 For example, the state acquisition unit 357 acquires whether the device to be controlled is operating, based on an image captured by a camera serving as the internal sensor 128.
 The signal transmission control unit 166 of the fourth embodiment then selects the device to be controlled according to the device state acquired by the state acquisition unit 357 and the control content represented by the target control signal, and causes the target control signal to be transmitted to the device corresponding to the selection result.
 For example, suppose television A is ON and television B is OFF, and the instruction signal "turn the TV off" is received. If the control target device type recognition unit 162 recognized television B as the device to be controlled, an OFF control signal would be transmitted to television B, which is already OFF.
 Therefore, the signal transmission control unit 166 of the fourth embodiment selects television A as the device to be controlled, according to the ON or OFF state of each device acquired by the state acquisition unit 357 and the ON or OFF state represented by the target control signal. The signal transmission control unit 166 of the fourth embodiment then transmits the OFF control signal to television A.
 This enables appropriate control, for example, when the device to be controlled is an alternate-action (toggle) device. With an alternate-action device, sending an OFF control signal while the device is already OFF turns it ON, which may deviate from the user's intention. According to the fourth embodiment, control that deviates from the user's intention can be avoided.
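The state-aware selection above can be sketched as follows. The state map, supplied by something like the state acquisition unit 357, and the device IDs are illustrative assumptions; only the rule (prefer a candidate whose current state would actually change under the requested control content) follows the text.

```python
# Hedged sketch of selecting the control target by device state.

def select_by_state(candidates: list, states: dict, content: str) -> str:
    """candidates: candidate device IDs; states: device ID -> 'ON'/'OFF';
    content: requested control content, 'ON' or 'OFF'."""
    for device in candidates:
        # an 'OFF' request only makes sense for a device currently ON, and
        # vice versa -- important for alternate-action (toggle) devices,
        # where a redundant signal would flip the state the wrong way
        if states.get(device) != content:
            return device
    return candidates[0]  # all candidates already in the requested state
```

In the example above, for the request "OFF" with television A ON and television B OFF, television A is selected even if television B was the first candidate.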
 状態取得部357は、内部センサ128であるカメラ又はサーモセンサ等によって得られた情報に基づいて、人の存在の有無を認識する。そして、信号送信制御部166は、状態取得部357による人の存在の認識結果に応じて、対象の制御信号を送信させる。 The state acquisition unit 357 recognizes the presence or absence of a person based on information obtained by the camera, thermosensor, or the like serving as the internal sensor 128. The signal transmission control unit 166 then causes the target control signal to be transmitted according to the result of the person-presence recognition by the state acquisition unit 357.
 例えば、信号送信制御部166は、状態取得部357によって人がいないと認識された場合、TV、照明、及びエアコン等の所定の機器については、OFFの制御信号を送信するように通信機124を制御する。なお、この制御については、例えば、スマートフォン端末からアプリケーションの設定により、ロボット100による自動制御の有無を決定できるようにしてもよい。 For example, when the state acquisition unit 357 recognizes that no one is present, the signal transmission control unit 166 controls the communication device 124 so as to transmit OFF control signals to predetermined devices such as the TV, the lighting, and the air conditioner. Whether the robot 100 performs this automatic control may be made configurable, for example, through an application setting on a smartphone terminal.
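As a hedged sketch of this presence-based auto-OFF rule (the function, flag names, and device list are illustrative assumptions, not from the specification):

```python
# Hypothetical sketch: when no person is recognized and the user has enabled
# automatic control in the smartphone application, emit OFF signals for the
# predetermined devices that are still ON.
def devices_to_switch_off(person_present, auto_control_enabled, device_states):
    """device_states: dict of device name -> 'ON'/'OFF'.
    Returns the names of devices that should receive an OFF control signal."""
    if person_present or not auto_control_enabled:
        return []
    return [name for name, state in device_states.items() if state == "ON"]

states = {"tv": "ON", "lighting": "OFF", "aircon": "ON"}
print(devices_to_switch_off(person_present=False,
                            auto_control_enabled=True,
                            device_states=states))  # prints ['tv', 'aircon']
```

Gating on the application setting first mirrors the note above that the user decides whether the robot 100 performs this automatic control at all.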
 また、状態取得部357は、内部センサ128であるカメラ又はサーモセンサ等によって得られた情報に基づいて、人の状態を更に認識するようにしてもよい。この場合、信号送信制御部166は、状態取得部357による人の状態の認識結果に応じて、対象の制御信号を通信機124に送信させる。例えば、内部センサ128であるサーモセンサによって人の体温が得られた場合には、人の体温が所定温度以上となった場合に、エアコンをONする制御信号をエアコンに対して送信する。 Further, the state acquisition unit 357 may further recognize the state of a person based on information obtained by the camera, thermosensor, or the like serving as the internal sensor 128. In this case, the signal transmission control unit 166 causes the communication device 124 to transmit the target control signal according to the result of the person-state recognition by the state acquisition unit 357. For example, when a person's body temperature is obtained by the thermosensor serving as the internal sensor 128 and the body temperature reaches or exceeds a predetermined temperature, a control signal that turns the air conditioner ON is transmitted to the air conditioner.
 または、信号送信制御部166は、状態取得部357によって認識された、各機器がON状態であるか否かに基づいて、ON状態である機器に対して、OFFの制御信号を送信するようにしてもよい。この場合、TVについては画像又は音声による判定が行われ、照明については画像による判定が行われ、エアコンについては画像又はサーマルセンサによる判定が行われるようにしてもよい。 Alternatively, the signal transmission control unit 166 may transmit an OFF control signal to each device that the state acquisition unit 357 has recognized as being in the ON state. In this case, the TV may be judged from images or sound, the lighting may be judged from images, and the air conditioner may be judged from images or a thermal sensor.
 また、信号送信制御部166は、ユーザのスマートフォン端末へアプリケーションを介して、各機器をOFF状態にするか否かを通知するようにしてもよい。また、ユーザはスマートフォン端末からロボット100を遠隔操作できるようにしてもよい。 Further, the signal transmission control unit 166 may notify the user's smartphone terminal whether or not to turn off each device via the application. Further, the user may be able to remotely control the robot 100 from the smartphone terminal.
 また、状態取得部357は、内部センサ128又は外部センサ114によって検知された情報に基づいて、ユーザが外出しているか否かを判定するようにしてもよい。例えば、一定時間以上、カメラ及びマイクの検出結果に基づき、予め指定された部屋において、誰も検出されなかった場合には、ユーザのスマートフォン端末へ、ON状態の機器をOFF状態へしてもよいかに関する信号を送信する。 Further, the state acquisition unit 357 may determine whether the user is out, based on information detected by the internal sensor 128 or the external sensor 114. For example, if no one is detected in a predesignated room for a certain period of time or longer based on the detection results of the camera and the microphone, a signal asking whether the devices in the ON state may be turned OFF is transmitted to the user's smartphone terminal.
 また、状態取得部357は、ユーザのスマートフォン端末のアプリケーションを介して、ユーザが外出しているか否かを判定する。例えば、状態取得部357は、スマートフォン端末のGPS信号が、家から所定距離以上離れた場合には、ユーザは外出していると判定する。そして、状態取得部357は、例えば、TV、照明、及びエアコン等の所定の機器については、対象となるユーザの全員(例えば、家族)が外出していれば、各機器に対してOFFの制御信号を送信する。 The state acquisition unit 357 may also determine whether the user is out via an application on the user's smartphone terminal. For example, the state acquisition unit 357 determines that the user is out when the GPS signal of the smartphone terminal indicates a position a predetermined distance or more away from the house. Then, for predetermined devices such as the TV, the lighting, and the air conditioner, the state acquisition unit 357 transmits an OFF control signal to each device if all of the target users (for example, the family members) are out.
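The "predetermined distance" check above could be sketched as below. The threshold value, the coordinate handling, and the equirectangular distance approximation are all assumptions made for illustration; the specification does not fix any of them.

```python
import math

AWAY_DISTANCE_M = 100.0  # assumed value for the "predetermined distance"
EARTH_RADIUS_M = 6_371_000.0

def distance_m(p1, p2):
    """Approximate ground distance between two (lat, lon) points given in
    degrees, using an equirectangular projection (adequate at these scales)."""
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0)
    y = lat2 - lat1
    return math.hypot(x, y) * EARTH_RADIUS_M

def everyone_is_out(home, phone_positions):
    """True only when every target user's phone is beyond the threshold,
    matching the 'all family members are out' condition above."""
    return all(distance_m(home, p) > AWAY_DISTANCE_M for p in phone_positions)
```

The `all(...)` condition is the key design point: a single phone still near the house keeps the OFF signals from being sent.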
 また、状態取得部357は、内部センサ128又は外部センサ114によって検知された情報に基づいて、家屋内に存在する幼児又はペットの状態を取得し、信号送信制御部166は、幼児又はペットの状態に応じて、各種機器の制御信号を送信するようにしてもよい。 Further, the state acquisition unit 357 may acquire the state of an infant or a pet present in the house based on information detected by the internal sensor 128 or the external sensor 114, and the signal transmission control unit 166 may transmit control signals to various devices according to the state of the infant or the pet.
 例えば、状態取得部357によって、幼児の泣き声を検知した場合には、音声を再生するように、対象の機器へ制御信号を送信するようにしてもよい。また、例えば、状態取得部357によって、家屋内の温度が所定温度以上又は以下である場合には、エアコンに対して所定の制御信号を送信するようにしてもよい。また、例えば、状態取得部357によって、幼児の睡眠状態が検知された場合には、音声機器の音量を低下させる、又はテレビの音量を低下させる等の制御信号を送信するようにしてもよい。 For example, when the state acquisition unit 357 detects the crying of an infant, a control signal may be transmitted to the target device so that it plays a sound. Further, for example, when the state acquisition unit 357 detects that the temperature inside the house is at or above, or at or below, a predetermined temperature, a predetermined control signal may be transmitted to the air conditioner. Further, for example, when the state acquisition unit 357 detects that the infant is asleep, a control signal such as one that lowers the volume of an audio device or of the television may be transmitted.
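The examples above amount to a small rule table mapping a detected state to control signals; a hedged sketch follows, in which the event names and signal names are illustrative inventions, not terms from the specification:

```python
# Hypothetical rule table for the infant/pet examples above: each detected
# state maps to a list of (device, signal) pairs to transmit.
INFANT_RULES = {
    "crying":   [("audio", "PLAY_SOUND")],
    "sleeping": [("audio", "VOLUME_DOWN"), ("tv", "VOLUME_DOWN")],
}

def control_signals_for(event):
    """Return the control signals to transmit for a detected state
    (an unknown state triggers nothing)."""
    return INFANT_RULES.get(event, [])
```

Keeping the mapping in data rather than branching code makes it easy to add further states (the temperature rule above would live in a separate numeric check rather than this table).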
 また、状態取得部357は、スマートフォン端末からアプリケーションを介して停止信号を受信した場合、信号送信制御部166は、各機器の作動を停止する制御信号を出力してもよい。 Further, when the state acquisition unit 357 receives a stop signal from the smartphone terminal via the application, the signal transmission control unit 166 may output a control signal for stopping the operation of each device.
 また、状態取得部357は、内部センサ128又は外部センサ114によって検知された情報に基づいて、周辺の温度が所定値以上又は所定値以下となった場合には、信号送信制御部166は、ロボット100の故障回避をするために、エアコンへ所定の制御信号を送信するようにしてもよい。 Further, when the state acquisition unit 357 finds, based on information detected by the internal sensor 128 or the external sensor 114, that the ambient temperature has become equal to or higher than a predetermined value, or equal to or lower than a predetermined value, the signal transmission control unit 166 may transmit a predetermined control signal to the air conditioner in order to prevent a failure of the robot 100.
 また、例えば、状態取得部357は、ユーザのスマートフォン端末から外部サーバを介してGPS情報を受信しユーザの帰宅を検知した場合、家屋内の所定位置である玄関又はユーザが向かうところに対応する位置の照明をつける等の制御信号を、各機器に対して出力してもよい。 Further, for example, when the state acquisition unit 357 receives GPS information from the user's smartphone terminal via an external server and detects that the user is returning home, control signals such as one that turns on the lighting at a predetermined position in the house, such as the entrance or the place toward which the user is heading, may be output to each device.
 以上説明したように、第4実施形態のロボット100は、センサによって検知された情報に応じて、対象の機器へ対象の制御信号を送信させる。これにより、制御対象の機器を適切に制御することができる。 As described above, the robot 100 of the fourth embodiment causes the target device to transmit the target control signal according to the information detected by the sensor. As a result, the device to be controlled can be appropriately controlled.
[第5実施形態] [Fifth Embodiment]
 次に、第5実施形態について説明する。第5実施形態では、制御信号の発信元の方向を推定することにより対象の機器の位置を推定する点が、第1実施形態~第4実施形態と異なる。 Next, the fifth embodiment will be described. The fifth embodiment is different from the first to fourth embodiments in that the position of the target device is estimated by estimating the direction of the source of the control signal.
 第5実施形態のロボット100の内部センサ128は、図16に示されるような方向推定機器30を備えている。方向推定機器30は、開示の技術の「方向推定部」に相当する。方向推定機器30は、例えば、ロボット100の角の部分に備えられている。図16に示されるように、方向推定機器30は、複数の赤外線発光素子11a~11dと、複数の受光素子21a~21d(開示の技術の「複数の受信素子」に相当する。)とを備えている。複数の赤外線発光素子11a~11dを総称する場合、又は特定の赤外線発光素子を特定しない場合には、単に「赤外線発光素子11」という。赤外線発光素子11は、開示の技術の「制御信号送信部」にも相当する。また、複数の受光素子21a~21dを総称する場合、又は特定の受光素子を特定しない場合には、単に「受光素子21」という。受光素子21は、開示の技術の「制御信号受信部」にも相当する。 The internal sensor 128 of the robot 100 of the fifth embodiment includes a direction estimation device 30 as shown in FIG. 16. The direction estimation device 30 corresponds to the "direction estimation unit" of the disclosed technology. The direction estimation device 30 is provided, for example, at a corner portion of the robot 100. As shown in FIG. 16, the direction estimation device 30 includes a plurality of infrared light emitting elements 11a to 11d and a plurality of light receiving elements 21a to 21d (corresponding to the "plurality of receiving elements" of the disclosed technology). When the plurality of infrared light emitting elements 11a to 11d are referred to collectively, or when no specific infrared light emitting element is specified, they are simply referred to as the "infrared light emitting element 11". The infrared light emitting element 11 also corresponds to the "control signal transmission unit" of the disclosed technology. Likewise, when the plurality of light receiving elements 21a to 21d are referred to collectively, or when no specific light receiving element is specified, they are simply referred to as the "light receiving element 21". The light receiving element 21 also corresponds to the "control signal receiving unit" of the disclosed technology.
 例えば、図17に示されるように、ユーザUが遠隔操作器142を操作し、対象の機器であるテレビ144に対して制御信号を出力すると、ロボット100のプロセッサ120は、記憶装置122からプログラムを読み出し、図18に示す位置学習処理ルーチンを実行する。なお、図17は、ユーザUとロボット100と対象の機器であるテレビ144との間の位置関係を、上空から見た図である。 For example, as shown in FIG. 17, when the user U operates the remote controller 142 and outputs a control signal to the television 144, which is the target device, the processor 120 of the robot 100 reads a program from the storage device 122 and executes the position learning processing routine shown in FIG. 18. Note that FIG. 17 shows the positional relationship among the user U, the robot 100, and the television 144, which is the target device, as viewed from above.
 まず、方向推定機器30の受光素子21は、遠隔操作器142から出力された制御信号である赤外線データを受信する。 First, the light receiving element 21 of the direction estimation device 30 receives infrared data which is a control signal output from the remote controller 142.
 ステップS302において、ロボット100のプロセッサ120は、方向推定機器30の受光素子21が受光した受光量の情報を取得する。具体的には、ロボット100のプロセッサ120は、複数の受光素子21a~21dの各々の受光量の情報を取得する。 In step S302, the processor 120 of the robot 100 acquires information on the amount of light received by the light receiving element 21 of the direction estimation device 30. Specifically, the processor 120 of the robot 100 acquires information on the amount of light received by each of the plurality of light receiving elements 21a to 21d.
 ステップS304において、ロボット100のプロセッサ120は、上記ステップS302で取得された受光量の情報に基づいて、赤外線データの発信元の方向を推定する。例えば、ロボット100のプロセッサ120は、それぞれの受光素子21が受光した受光量の多寡に基づいて図17に示される遠隔操作器142の方向を、赤外線データの発信元の方向として推定する。 In step S304, the processor 120 of the robot 100 estimates the direction of the source of the infrared data based on the light receiving amount information acquired in step S302. For example, the processor 120 of the robot 100 estimates the direction of the remote controller 142 shown in FIG. 17 as the direction of the source of infrared data based on the amount of light received by each light receiving element 21.
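The specification leaves the estimation method of step S304 open; one plausible way to turn the per-element received-light amounts into a source direction is a weighted circular mean. The receiver geometry assumed below (elements 21a–21d facing 0°, 90°, 180°, and 270°) is an illustrative assumption, not part of the disclosure:

```python
import math

# Assumed geometry: the four light receiving elements 21a-21d face
# 0, 90, 180 and 270 degrees around the robot (hypothetical layout).
RECEIVER_BEARINGS_DEG = (0.0, 90.0, 180.0, 270.0)

def estimate_source_bearing(received_amounts):
    """Weighted circular mean of the receiver facings, weighted by the
    amount of light each element received (cf. step S304)."""
    x = sum(a * math.cos(math.radians(b))
            for a, b in zip(received_amounts, RECEIVER_BEARINGS_DEG))
    y = sum(a * math.sin(math.radians(b))
            for a, b in zip(received_amounts, RECEIVER_BEARINGS_DEG))
    return math.degrees(math.atan2(y, x)) % 360.0

def opposite_bearing(bearing_deg):
    """Cf. step S306: the target device is assumed to lie in the
    direction opposite the remote controller."""
    return (bearing_deg + 180.0) % 360.0
```

The circular mean avoids the wrap-around problem at 360°/0° that a plain arithmetic average of bearings would have.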
 ステップS306において、ロボット100のプロセッサ120は、上記ステップS304での赤外線データの発信元の方向の推定結果に基づいて、発信元の方向とは反対方向の領域Rに、対象の機器が存在していることを推定する。例えば、図17に示されるように、ロボット100のプロセッサ120は、遠隔操作器142の方向とは反対方向であるRの領域に、テレビ144が存在していることを推定する。ロボット100のプロセッサ120が、開示の技術の「存在位置推定部」に相当する。 In step S306, based on the result of estimating the direction of the source of the infrared data in step S304, the processor 120 of the robot 100 estimates that the target device exists in a region R in the direction opposite to the direction of the source. For example, as shown in FIG. 17, the processor 120 of the robot 100 estimates that the television 144 exists in the region R, which lies in the direction opposite to the remote controller 142. The processor 120 of the robot 100 corresponds to the "existence position estimation unit" of the disclosed technology.
 ステップS308において、ロボット100のプロセッサ120は、駆動機構118を駆動させ、内部センサ128のうちのカメラによってRの範囲を撮像するように制御する。 In step S308, the processor 120 of the robot 100 drives the drive mechanism 118 and controls the camera of the internal sensors 128 to image the range of R.
 ステップS310において、ロボット100のプロセッサ120は、上記ステップS308で撮像された画像に対して、既知の画像処理を適用し対象の機器を認識する。なお、対象の機器を認識する際、プロセッサ120は、予め設定されていた対象の機器の特徴量と、カメラによって撮像された画像から抽出される特徴量とを比較することにより、対象の機器を識別する。または、プロセッサ120は、画像が入力されると当該画像に写る対象の機器の識別情報が出力される学習済みモデルを用いて、画像に写る対象の機器を識別するようにしてもよい。 In step S310, the processor 120 of the robot 100 applies known image processing to the image captured in step S308 to recognize the target device. When recognizing the target device, the processor 120 identifies the target device by comparing a preset feature amount of the target device with a feature amount extracted from the image captured by the camera. Alternatively, the processor 120 may identify the target device appearing in the image by using a trained model that, given an image as input, outputs the identification information of the target device appearing in that image.
 ステップS311において、ロボット100のプロセッサ120は、上記ステップS310で得られた画像認識結果に基づいて、既知の位置推定アルゴリズム等を用いて、対象の機器の位置を推定する。例えば、ロボット100のプロセッサ120は、画像認識結果に基づいて、対象の機器の3次元位置を推定する。 In step S311, the processor 120 of the robot 100 estimates the position of the target device by using a known position estimation algorithm or the like based on the image recognition result obtained in step S310. For example, the processor 120 of the robot 100 estimates the three-dimensional position of the target device based on the image recognition result.
 そして、ステップS312において、ロボット100のプロセッサ120は、上記ステップS311で得られた対象の機器の位置情報を、機器種別記憶部172へ記憶させる。 Then, in step S312, the processor 120 of the robot 100 stores the position information of the target device obtained in step S311 in the device type storage unit 172.
 なお、上記ステップS308及び上記ステップS310においては、内部センサ128のうちのカメラによって撮像された画像に基づいて、対象の機器の位置を認識する場合を例に説明したが、これに限定されるものではない。例えば、ロボット100は、屋内の地図を表す地図情報、内部センサ128のうちのサーモセンサによって検知された情報、及び内部センサ128のうちのマイクによって検知された情報の少なくとも一種の情報に基づいて、制御信号の送信対象の機器の位置を認識するようにしてもよい。 In steps S308 and S310 above, the case where the position of the target device is recognized based on the image captured by the camera of the internal sensor 128 has been described as an example, but the present invention is not limited to this. For example, the robot 100 may recognize the position of the device to which the control signal is to be transmitted based on at least one of map information representing an indoor map, information detected by the thermosensor of the internal sensor 128, and information detected by the microphone of the internal sensor 128.
 機器種別記憶部172へ機器の位置情報が蓄積されると、ロボット100は、指示信号である音声情報を受け付けたときに、その音声情報に応じた制御信号を制御対象の機器の方向へ出力する。まず、ユーザが音声情報を発すると、ロボット100の内部センサ128に含まれるマイクは、ユーザから発せられた音声情報を検知する。ロボット100のマイクが音声情報を検知すると、ロボット100のプロセッサ120は、記憶装置122からプログラムを読み出し、図19に示す機器制御処理ルーチンを実行する。 Once the position information of the devices has been accumulated in the device type storage unit 172, the robot 100, upon receiving voice information as an instruction signal, outputs a control signal corresponding to that voice information in the direction of the device to be controlled. First, when the user utters voice information, the microphone included in the internal sensor 128 of the robot 100 detects the voice information uttered by the user. When the microphone of the robot 100 detects the voice information, the processor 120 of the robot 100 reads a program from the storage device 122 and executes the device control processing routine shown in FIG. 19.
 ステップS200~ステップS208の各処理は、第1実施形態と同様に実行される。 Each process of steps S200 to S208 is executed in the same manner as in the first embodiment.
 ステップS410において、ロボット100のプロセッサ120は、上記ステップS206での機器の認識結果に基づいて、当該機器に対応付けられている位置情報を読み出し、対象の機器の位置を認識する。 In step S410, the processor 120 of the robot 100 reads the position information associated with the device based on the recognition result of the device in step S206, and recognizes the position of the target device.
 ステップS412において、ロボット100のプロセッサ120は、上記ステップS410で認識された対象の機器の位置の方向へ、制御信号である赤外線信号を出力するように方向推定機器30の赤外線発光素子11を制御する。なお、この場合、ロボット100は自己の位置を認識し、自己の位置と対象の機器の位置との関係に応じて、方向推定機器30の赤外線発光素子11を制御する。 In step S412, the processor 120 of the robot 100 controls the infrared light emitting element 11 of the direction estimation device 30 so as to output an infrared signal, which is the control signal, in the direction of the position of the target device recognized in step S410. In this case, the robot 100 recognizes its own position and controls the infrared light emitting element 11 of the direction estimation device 30 according to the relationship between its own position and the position of the target device.
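The aiming computation implied by step S412 — relating the robot's own pose to the stored device position — could be sketched as follows. The 2-D pose representation and the function name are assumptions made for illustration:

```python
import math

def relative_bearing_deg(robot_xy, robot_heading_deg, device_xy):
    """Angle, in degrees counterclockwise, by which the robot must rotate
    its infrared emitter from its current heading so that the emitter points
    at the stored device position."""
    dx = device_xy[0] - robot_xy[0]
    dy = device_xy[1] - robot_xy[1]
    world_bearing = math.degrees(math.atan2(dy, dx))
    return (world_bearing - robot_heading_deg) % 360.0
```

For example, a robot at the origin facing along the +x axis would need to turn 90° to aim at a device at (0, 1); if it is already facing +y, no rotation is needed.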
 以上説明したように、第5実施形態のロボット100は、対象の機器の位置を認識し、その方向へ制御信号を出力する。これにより、ユーザが意図する機器を適切に制御することができる。 As described above, the robot 100 of the fifth embodiment recognizes the position of the target device and outputs a control signal in that direction. As a result, the device intended by the user can be appropriately controlled.
 なお、上述した実施形態に限定されるものではなく、各実施形態の要旨を逸脱しない範囲内で様々な変形や応用が可能である。 It should be noted that the present invention is not limited to the above-described embodiment, and various modifications and applications are possible within a range that does not deviate from the gist of each embodiment.
 例えば、指示信号に場所を表す情報が含まれていた場合には、ロボット100のプロセッサ120は、駆動機構118に対して当該場所を表す位置へ移動するような制御信号を出力する。そして、ロボット100のプロセッサ120は、当該位置に存在する制御対象の機器へ制御信号を送信するようにしてもよい。 For example, when the instruction signal contains information indicating the location, the processor 120 of the robot 100 outputs a control signal to the drive mechanism 118 to move to the position representing the location. Then, the processor 120 of the robot 100 may transmit the control signal to the device to be controlled existing at the position.
 例えば、指示信号としての音声情報「寝室の明かりをつけて」を取得した場合、制御対象機器種別認識部162は、地図情報記憶部176に格納されているテーブルを参照して、寝室の位置を特定する。そして、ロボット100のプロセッサ120は、車輪102へ寝室の位置へ移動するような信号を出力して、自身を寝室へ移動させる。そして、制御対象機器種別認識部162は、寝室に存在する制御対象の照明に対して、ONを表す制御信号を送信する。 For example, when the voice information "turn on the bedroom light" is acquired as an instruction signal, the control target device type recognition unit 162 refers to the table stored in the map information storage unit 176 and identifies the position of the bedroom. The processor 120 of the robot 100 then outputs to the wheels 102 a signal for moving to the position of the bedroom, and moves the robot to the bedroom. The control target device type recognition unit 162 then transmits a control signal representing ON to the control target lighting present in the bedroom.
 また、ロボット100は、各機器に対する好みを表す選好性に関する情報に基づいて、制御を行ってもよい。例えば、図20に示されるように、テレビ144に付与された選好性のスコアが、エアコン146に付与された選好性のスコアよりも高い場合、ロボット100は、選好性が高い機器であるテレビ144に対しては、そのそばに近づきやすくなる、テレビ144のほうを向くとともに、テレビ144に対する制御信号を早期に学習しやすくなる、といった設定がされていてもよい。 Further, the robot 100 may perform control based on preference information representing its liking for each device. For example, as shown in FIG. 20, when the preference score given to the television 144 is higher than the preference score given to the air conditioner 146, the robot 100 may be configured so that, with respect to the highly preferred television 144, it more readily approaches and faces the television 144 and more quickly learns the control signals for the television 144.
 また、ロボット100は、対象の機器へ制御信号を送信する際には、予め定められた音情報又はモーションを行うようにしてもよい。また、ロボット100は、周囲の状況に応じて、対象の機器へ制御信号を送信するようにしてもよい。例えば、ロボット100は、マイクを介して周囲の人間の音声を所定以上の頻度で検出した場合又は周囲の人間の顔が向き合っていることが撮像から検出された場合など、周囲の人間のコミュニケーションが活発になってきたときには、テレビの音声を下げるなどの制御信号を送信するようにしてもよい。 Further, the robot 100 may produce predetermined sound information or a predetermined motion when transmitting a control signal to the target device. The robot 100 may also transmit a control signal to the target device according to the surrounding situation. For example, when communication among the surrounding people has become active, such as when the voices of surrounding people are detected through the microphone at a frequency equal to or higher than a predetermined frequency, or when it is detected from captured images that the faces of surrounding people are turned toward each other, the robot 100 may transmit a control signal such as one that lowers the volume of the television.
 また、ロボット100が電話機能を有する場合には、ロボット100は、ユーザがロボット100の電話機能を利用する際には、テレビの音量を下げる、明かりなどを制御するといった制御を行ってもよい。 Further, when the robot 100 has a telephone function, the robot 100 may perform controls such as lowering the volume of the television and controlling the light when the user uses the telephone function of the robot 100.
 また、ロボット100は、対象の機器が起動されたスケジュールに応じて、対象の機器に対して制御信号を出力するようにしてもよい。例えば、ロボット100は、遠隔操作から送信されている制御信号の時刻情報に応じて、自動的に対象の機器に対して制御信号を送信するようにしてもよい。 Further, the robot 100 may output a control signal to the target device according to the schedule on which the target device has been activated. For example, the robot 100 may automatically transmit a control signal to the target device according to the time information of control signals transmitted by remote operation.
 また、ロボット100は、制御信号を送信した後のユーザの反応に応じて、評価値を更新するようにしてもよい。例えば、制御信号を送信した後に、抱きかかえられることにより、上下方向の加速度を検出した場合には、当該機器への制御信号の送信が適切であったものとみなして、当該機器の評価値を上げるようにしてもよい。また、所定の音声情報(例えば、「ありがとう」)を、制御信号の送信後に検知した場合には、当該機器への制御信号の送信が適切であったものとみなして、当該機器の評価値を上げるようにしてもよい。 Further, the robot 100 may update the evaluation value according to the user's reaction after a control signal is transmitted. For example, if vertical acceleration is detected because the robot is picked up after transmitting a control signal, the transmission of the control signal to that device may be regarded as appropriate and the evaluation value of that device may be raised. Likewise, if predetermined voice information (for example, "thank you") is detected after the control signal is transmitted, the transmission of the control signal to that device may be regarded as appropriate and the evaluation value of that device may be raised.
 また、ロボット100のプロセッサ120は、上記ステップS304での赤外線データの発信元の方向の推定結果に基づいて、発信元の方向とは反対方向の領域Rに、対象の機器が存在していることを推定した。これに代えて又は加えて、赤外線の指向性が高いことに鑑みて、ロボット100のプロセッサ120は、赤外線データの発信元の方向に対象の機器が存在していると推定してもよい。 Further, the processor 120 of the robot 100 estimated, based on the result of estimating the direction of the source of the infrared data in step S304 above, that the target device exists in the region R in the direction opposite to the direction of the source. Instead of, or in addition to, this, in view of the high directivity of infrared rays, the processor 120 of the robot 100 may estimate that the target device exists in the direction of the source of the infrared data.
 また、ロボット100の機能の一部はサーバ200により実現されてもよいし、サーバ200の機能の一部または全部はロボット100により実現されてもよい。 Further, a part of the functions of the robot 100 may be realized by the server 200, and a part or all of the functions of the server 200 may be realized by the robot 100.
 また、本開示の機器制御装置は1台のロボット100である場合を例に説明したが、これに限定されるものではなく、複数のロボットによって構成されていてもよい。この場合には、複数のロボット同士が情報の送受信を行い、複数のロボットによって機器の種別の学習及び機器の制御が実行されるようにしてもよい。 Further, the device control device of the present disclosure has been described by taking the case of one robot 100 as an example, but the present invention is not limited to this, and may be composed of a plurality of robots. In this case, a plurality of robots may send and receive information to each other, and the plurality of robots may perform learning of the type of device and control of the device.
 また、上記第5実施形態のロボット100は、対象の機器の方向へ制御信号である赤外線を出力する場合を例に説明したが、これに限定されるものではなく、例えば、方向推定機器30が備える複数の赤外線発光素子11a~11dの全てから、制御信号である赤外線を出力するようにしてもよい。 Further, the robot 100 of the fifth embodiment has been described, as an example, as outputting infrared rays serving as control signals in the direction of the target device, but the present invention is not limited to this; for example, infrared rays serving as control signals may be output from all of the plurality of infrared light emitting elements 11a to 11d included in the direction estimation device 30.
 また、上記各実施形態では、制御信号が赤外線である場合を例に説明したが、これに限定されるものではなく、例えば、制御信号は高周波(RF:Radio frequency)の電磁波信号であってもよい。 Further, in each of the above embodiments, the case where the control signal is infrared has been described as an example, but the present invention is not limited to this; for example, the control signal may be a radio frequency (RF) electromagnetic wave signal.
 なお、コンピュータ読み取り可能な記録媒体とは、フレキシブルディスク、光磁気ディスク、ROM、CD-ROM等の可搬媒体、コンピュータシステムに内蔵されるハードディスク等の記憶装置のことをいう。 Note that computer-readable recording media refer to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and storage devices such as hard disks built into computer systems.
 このように、各実施形態を、図面を参照して詳述してきたが、具体的な構成は、これらの各実施形態に限られるものではなく、要旨を逸脱しない範囲の設計等も含まれる。 As described above, each embodiment has been described in detail with reference to the drawings, but the specific configuration is not limited to each of these embodiments, and includes a design within a range that does not deviate from the gist.
 また、実施形態のプログラムは、記憶媒体に格納して提供するようにしてもよい。 Further, the program of the embodiment may be stored in a storage medium and provided.
 本明細書に記載された全ての文献、特許出願、及び技術規格は、個々の文献、特許出願、及び技術規格が参照により取り込まれることが具体的かつ個々に記載された場合と同程度に、本明細書中に参照により取り込まれる。 All documents, patent applications, and technical standards described herein are to the same extent as if the individual documents, patent applications, and technical standards were specifically and individually stated to be incorporated by reference. Incorporated herein by reference.

Claims (23)

  1.  機器への制御内容を表す制御信号を受信する制御信号受信部と、
     機器への制御内容を表す制御信号を送信する制御信号送信部と、
     前記制御信号受信部により受信された前記制御信号に基づいて、前記制御信号が表す機器の種別を認識する機器種別認識部と、
     前記機器種別認識部により認識された前記機器の種別を、第1記憶部へ登録する登録部と、
     前記制御信号とは異なる種類の信号であって、機器への制御内容を表す信号である指示信号を取得する指示信号取得部と、
     前記指示信号取得部により取得された前記指示信号と、前記第1記憶部に記憶されている前記機器の種別とに基づいて、前記指示信号が表す制御対象の機器の種別を認識する制御対象機器種別認識部と、
     機器の種別と制御内容とに対応する制御信号が記憶されている第2記憶部を参照して、前記制御対象機器種別認識部により認識された制御対象の機器の種別と前記指示信号に含まれる当該制御対象の機器への制御内容とに対応する対象の制御信号を認識する制御信号認識部と、
     前記制御信号送信部に、前記制御対象の機器に対して前記対象の制御信号を送信させる信号送信制御部と、
     を備える機器制御装置。
    A control signal receiver that receives control signals that represent the control content of the device,
    A control signal transmitter that transmits a control signal that represents the control content to the device,
    A device type recognition unit that recognizes the type of device represented by the control signal based on the control signal received by the control signal reception unit.
    A registration unit that registers the type of the device recognized by the device type recognition unit in the first storage unit, and a registration unit.
    An instruction signal acquisition unit that acquires an instruction signal that is a signal of a type different from the control signal and is a signal indicating the control content to the device.
    A control target device type recognition unit that recognizes the type of the control target device represented by the instruction signal, based on the instruction signal acquired by the instruction signal acquisition unit and the types of devices stored in the first storage unit, and
    A control signal recognition unit that refers to a second storage unit, in which control signals corresponding to device types and control contents are stored, and recognizes a target control signal corresponding to the type of the control target device recognized by the control target device type recognition unit and to the control content, included in the instruction signal, for that control target device,
    A signal transmission control unit that causes the control signal transmission unit to transmit the control signal of the target to the device to be controlled.
    A device control device equipped with.
  2.  移動機構と、
     前記制御信号受信部により前記制御信号が受信されたときの位置情報を取得する位置取得部と、を更に備え、
     前記登録部は、前記機器種別認識部により認識された前記機器の種別と、前記位置取得部により取得された前記位置情報との組み合わせを、前記第1記憶部へ登録し、
     前記制御対象機器種別認識部は、前記指示信号に含まれる機器の種別及び前記指示信号を受信した位置と、前記第1記憶部に記憶されている前記機器の種別及び位置情報とに基づいて、前記指示信号が表す制御対象の機器の種別を認識する、
     請求項1に記載の機器制御装置。
    Movement mechanism and
    Further, a position acquisition unit for acquiring position information when the control signal is received by the control signal reception unit is provided.
    The registration unit registers the combination of the type of the device recognized by the device type recognition unit and the position information acquired by the position acquisition unit in the first storage unit.
    The control target device type recognition unit recognizes the type of the control target device represented by the instruction signal, based on the type of device included in the instruction signal and the position where the instruction signal was received, and on the device types and position information stored in the first storage unit.
    The device control device according to claim 1.
  3.  前記制御信号受信部により前記制御信号が受信されたときの時刻情報を取得する時刻取得部を更に備え、
     前記登録部は、前記機器種別認識部により認識された前記機器の種別と、前記時刻取得部により取得された前記時刻情報との組み合わせを、前記第1記憶部へ登録し、
     前記制御対象機器種別認識部は、前記指示信号に含まれる機器の種別及び前記指示信号を受信した時刻と、前記第1記憶部に記憶されている前記機器の種別及び時刻情報とに基づいて、前記指示信号が表す制御対象の機器の種別を認識する、
     請求項1又は請求項2に記載の機器制御装置。
    Further, a time acquisition unit for acquiring time information when the control signal is received by the control signal reception unit is provided.
    The registration unit registers the combination of the type of the device recognized by the device type recognition unit and the time information acquired by the time acquisition unit in the first storage unit.
    The control target device type recognition unit recognizes the type of the control target device represented by the instruction signal, based on the type of device included in the instruction signal and the time when the instruction signal was received, and on the device types and time information stored in the first storage unit.
    The device control device according to claim 1 or 2.
  4.  音声解析部を更に備え、
     前記指示信号取得部は、音声情報を前記指示信号として取得し、
     前記音声解析部は、前記指示信号取得部により取得された前記指示信号を音声解析し、前記指示信号に含まれる制御内容を示す文言を取得し、
     前記制御信号認識部は、前記第2記憶部に格納された情報と、前記制御内容を示す文言とに基づいて、前記対象の制御信号を認識する、
     請求項1~請求項3の何れか1項に記載の機器制御装置。
    Equipped with a voice analysis unit
    The instruction signal acquisition unit acquires voice information as the instruction signal.
    The voice analysis unit voice-analyzes the instruction signal acquired by the instruction signal acquisition unit, and acquires a wording indicating a control content included in the instruction signal.
    The control signal recognition unit recognizes the target control signal based on the information stored in the second storage unit and the wording indicating the control content.
    The device control device according to any one of claims 1 to 3.
  5.  前記音声解析部は、前記指示信号取得部により取得された前記指示信号を音声解析し、前記制御内容を示す文言と、前記指示信号に含まれる機器の種別を示す文言と、を取得し、
     前記制御対象機器種別認識部は、前記機器の種別を示す文言と、前記第1記憶部に記憶されている前記機器の種別とに基づいて、前記指示信号が表す制御対象の機器の種別を認識する、
     請求項4に記載の機器制御装置。
    The voice analysis unit voice-analyzes the instruction signal acquired by the instruction signal acquisition unit, and acquires a word indicating the control content and a word indicating the type of the device included in the instruction signal.
    The control target device type recognition unit recognizes the type of the control target device represented by the instruction signal, based on the wording indicating the type of the device and on the device types stored in the first storage unit.
    The device control device according to claim 4.
  6.  Further comprising an evaluation unit that determines an evaluation value for each device stored in the first storage unit,
     wherein, when there are a plurality of candidates for the type of the control target device, the control target device type recognition unit recognizes the type of the control target device according to the evaluation values determined by the evaluation unit,
     The device control device according to any one of claims 1 to 5.
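The disambiguation in claim 6 — choosing among several candidate device types by evaluation value — can be sketched as a simple argmax. The default value for unknown devices and the example scores are assumptions for illustration.

```python
def recognize_target_type(candidates, evaluation_values):
    """When multiple device types match the instruction signal, pick the
    candidate with the highest evaluation value (claim 6). Devices without
    a stored value default to 0.0, an assumption made here."""
    return max(candidates, key=lambda device: evaluation_values.get(device, 0.0))

# Example: an ambiguous "turn it on" matches both the TV and the air
# conditioner; the evaluation values break the tie.
evals = {"tv": 0.8, "aircon": 0.3}
print(recognize_target_type(["tv", "aircon"], evals))  # prints tv
```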
  7.  Further comprising a position acquisition unit that acquires position information at the time when the control signal is received by the control signal reception unit,
     wherein the evaluation unit determines the evaluation value for each device according to the area corresponding to the position information acquired by the position acquisition unit,
     The device control device according to claim 6.
  8.  Further comprising a time acquisition unit that acquires time information at the time when the control signal is received by the control signal reception unit,
     wherein the evaluation unit determines the evaluation value for each device according to the time slot corresponding to the time information acquired by the time acquisition unit,
     The device control device according to claim 6.
  9.  Further comprising a time acquisition unit that acquires time information at the time when the control signal is received by the control signal reception unit,
     wherein the evaluation unit performs control so as to lower the evaluation value of any device, among the devices stored in the first storage unit, for which a predetermined time has elapsed since the reception time of the control signal acquired by the time acquisition unit,
     The device control device according to claim 6.
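The time-based decay of claim 9 — lowering the evaluation value of devices whose last received control signal is older than a predetermined time — can be sketched as follows. The time-to-live and penalty magnitude are assumptions, not values from the patent.

```python
def decay_stale_evaluations(devices, now, ttl, penalty=0.1):
    """Lower the evaluation value of every device whose last control signal
    was received more than `ttl` seconds before `now` (claim 9 sketch).
    `devices` maps device name -> (evaluation_value, last_received_time)."""
    updated = {}
    for name, (value, last_received) in devices.items():
        if now - last_received > ttl:
            value -= penalty  # the predetermined time has elapsed
        updated[name] = (value, last_received)
    return updated

devices = {"tv": (0.5, 100.0), "aircon": (0.5, 990.0)}
# At t=1000 with a 600-second window, only the tv entry is penalized.
print(decay_stale_evaluations(devices, now=1000.0, ttl=600.0))
```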
  10.  wherein the evaluation unit updates the evaluation value for each device stored in the first storage unit according to the time-series relationship between transmission of the control signal by the control signal transmission unit and reception of the control signal by the control signal reception unit,
     The device control device according to claim 6.
  11.  wherein, when the control signal reception unit receives a control signal for a second device within a predetermined time after the control signal transmission unit transmits the control signal to a first device, the evaluation unit updates the evaluation value of the first device so as to lower it,
     The device control device according to claim 10.
  12.  wherein, when the control signal for a second device is received within a predetermined time after transmission of the control signal to a first device, and the control content of the control signal for the first device is the same as the control content of the control signal for the second device, the evaluation unit updates the evaluation value of the second device so as to raise it,
     The device control device according to claim 10 or 11.
  13.  wherein, when the control signal for a first device is received within a predetermined time after transmission of the control signal to the first device, and the control content represented by the transmitted control signal differs from the control content represented by the received control signal, the evaluation unit updates the evaluation value of the first device so as to lower it,
     The device control device according to any one of claims 10 to 12.
  14.  wherein, when an instruction signal for a second device is acquired within a predetermined time after transmission of the control signal to a first device, the evaluation unit updates the evaluation value of the first device so as to lower it,
     The device control device according to any one of claims 10 to 13.
  15.  wherein, when an instruction signal for a first device is acquired within a predetermined time after transmission of the control signal to the first device, the device represented by the instruction signal is the first device, and the control content represented by the control signal differs from the control content represented by the instruction signal, the evaluation unit updates the evaluation value of the first device so as to lower it,
     The device control device according to any one of claims 10 to 14.
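The time-series update rules of claims 11 to 13 — adjusting evaluation values when a further control signal is observed shortly after one was transmitted — can be sketched in a single update function. The step size and device names are assumptions for illustration; the predetermined-time check is assumed to have been performed by the caller.

```python
def update_on_received_signal(evals, sent_device, sent_content,
                              received_device, received_content, step=0.1):
    """Sketch of the update rules of claims 11-13, applied when a control
    signal is received within the predetermined time after transmission."""
    evals = dict(evals)
    if received_device != sent_device:
        # Claim 11: a different device was controlled right after ours,
        # so lower the first device's evaluation value.
        evals[sent_device] = evals.get(sent_device, 0.0) - step
        if received_content == sent_content:
            # Claim 12: same control content, so the second device was
            # likely the intended target; raise its evaluation value.
            evals[received_device] = evals.get(received_device, 0.0) + step
    elif received_content != sent_content:
        # Claim 13: the same device was immediately re-controlled with
        # different content, suggesting the transmitted signal was wrong.
        evals[sent_device] = evals.get(sent_device, 0.0) - step
    return evals

evals = {"tv": 0.5, "aircon": 0.5}
print(update_on_received_signal(evals, "tv", "power_on", "aircon", "power_on"))
```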
  16.  Further comprising a sensor, and
     a state acquisition unit that acquires the state of a control target device based on information detected by the sensor,
     wherein the signal transmission control unit selects a control target device according to the device state acquired by the state acquisition unit and the control content represented by the target control signal, and causes the target control signal to be transmitted to the selected device,
     The device control device according to any one of claims 1 to 15.
  17.  Further comprising a sensor, and
     a state acquisition unit that recognizes the presence or absence of a person based on information detected by the sensor,
     wherein the signal transmission control unit causes the target control signal to be transmitted according to the result of recognition of the presence of a person by the state acquisition unit,
     The device control device according to any one of claims 1 to 15.
  18.  wherein the state acquisition unit further recognizes the state of the person, and
     the signal transmission control unit causes the target control signal to be transmitted according to the result of recognition of the person's state by the state acquisition unit,
     The device control device according to claim 17.
  19.  Further comprising a sensor,
     wherein the signal transmission control unit causes the target control signal to be transmitted according to information detected by the sensor,
     The device control device according to any one of claims 1 to 15.
  20.  wherein the control signal reception unit includes a plurality of receiving elements,
     the device control device further comprises a direction estimation unit that estimates the transmission direction of a signal based on the amount of light received by each of the plurality of receiving elements, and a position estimation unit that estimates the position of the control target device based on the signal transmission direction estimated by the direction estimation unit, and
     the signal transmission control unit transmits the signal toward the position of the control target device estimated by the position estimation unit,
     The device control device according to any one of claims 1 to 19.
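One plausible reading of the direction estimation in claim 20 is a circular mean of the receiving elements' facing angles, weighted by each element's received light amount. This weighting scheme is an assumption for illustration, not a method stated in the patent.

```python
import math

def estimate_direction(element_angles_deg, light_amounts):
    """Estimate the arrival direction of a signal (in degrees) as the
    light-amount-weighted circular mean of the receiving elements' facing
    angles: a sketch of the direction estimation unit of claim 20."""
    x = sum(a * math.cos(math.radians(t))
            for t, a in zip(element_angles_deg, light_amounts))
    y = sum(a * math.sin(math.radians(t))
            for t, a in zip(element_angles_deg, light_amounts))
    return math.degrees(math.atan2(y, x)) % 360.0

# Four elements facing 0/90/180/270 degrees; only the 90-degree element
# receives light, so the signal is estimated to come from about 90 degrees.
print(estimate_direction([0, 90, 180, 270], [0.0, 1.0, 0.0, 0.0]))
```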
  21.  wherein the first storage unit further stores information on the preference for each device, and
     the signal transmission control unit causes the target control signal to be transmitted to the control target device according to the information on the preference for that device,
     The device control device according to any one of claims 1 to 20.
  22.  A control signal reception unit receives a control signal representing control content for a device,
     a control signal transmission unit transmits a control signal representing control content for a device,
     a device type recognition unit recognizes the type of device represented by the control signal, based on the control signal received by the control signal reception unit,
     a registration unit registers the device type recognized by the device type recognition unit in a first storage unit,
     an instruction signal acquisition unit acquires an instruction signal, which is a signal of a different kind from the control signal and represents control content for a device,
     a control target device type recognition unit recognizes the type of the control target device represented by the instruction signal, based on the instruction signal acquired by the instruction signal acquisition unit and the device types stored in the first storage unit,
     a control signal recognition unit refers to a second storage unit, in which control signals corresponding to device types and control contents are stored, and recognizes the target control signal corresponding to the type of the control target device recognized by the control target device type recognition unit and the control content for that control target device included in the instruction signal, and
     a signal transmission control unit causes the control signal transmission unit to transmit the target control signal to the control target device,
     A device control method.
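The end-to-end flow of the method claim — learning device types from observed control signals, then resolving an instruction into a stored control signal and transmitting it — can be sketched in a minimal class. The "type:content" signal format and the lookup tables are assumptions made purely for illustration.

```python
class DeviceController:
    """Minimal end-to-end sketch of the claimed device control method."""

    def __init__(self, second_storage):
        self.first_storage = set()            # learned device types
        self.second_storage = second_storage  # (type, content) -> control signal
        self.transmitted = []                 # log of sent signals

    def on_control_signal(self, signal):
        # Device type recognition and registration into the first storage
        # unit, triggered by an observed control signal (e.g. from a remote).
        device_type = signal.split(":", 1)[0]
        self.first_storage.add(device_type)

    def on_instruction_signal(self, device_type, control_content):
        # Control target recognition, target control signal recognition via
        # the second storage unit, then transmission to the target device.
        if device_type not in self.first_storage:
            return None  # no such device has been learned
        target = self.second_storage.get((device_type, control_content))
        if target is not None:
            self.transmitted.append((device_type, target))
        return target

controller = DeviceController({("tv", "power_on"): "IR_TV_ON"})
controller.on_control_signal("tv:power_on")  # learn that a TV exists
print(controller.on_instruction_signal("tv", "power_on"))  # prints IR_TV_ON
```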
  23.  A device control program for causing a computer to function as:
     a device type recognition unit that recognizes the type of device represented by a control signal, based on the control signal received by a control signal reception unit that receives control signals representing control content for a device,
     a registration unit that registers the device type recognized by the device type recognition unit in a first storage unit,
     an instruction signal acquisition unit that acquires an instruction signal, which is a signal of a different kind from the control signal and represents control content for a device,
     a control target device type recognition unit that recognizes the type of the control target device represented by the instruction signal, based on the instruction signal acquired by the instruction signal acquisition unit and the device types stored in the first storage unit,
     a control signal recognition unit that refers to a second storage unit, in which control signals corresponding to device types and control contents are stored, and recognizes the target control signal corresponding to the type of the control target device recognized by the control target device type recognition unit and the control content for that control target device included in the instruction signal, and
     a signal transmission control unit that causes a control signal transmission unit, which transmits control signals representing control content for a device, to transmit the target control signal to the control target device.
PCT/JP2019/031750 2019-08-09 2019-08-09 Device controller, device control method, and device control program WO2021028994A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021539727A JPWO2021028994A1 (en) 2019-08-09 2019-08-09
PCT/JP2019/031750 WO2021028994A1 (en) 2019-08-09 2019-08-09 Device controller, device control method, and device control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/031750 WO2021028994A1 (en) 2019-08-09 2019-08-09 Device controller, device control method, and device control program

Publications (1)

Publication Number Publication Date
WO2021028994A1 2021-02-18

Family

ID=74569455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/031750 WO2021028994A1 (en) 2019-08-09 2019-08-09 Device controller, device control method, and device control program

Country Status (2)

Country Link
JP (1) JPWO2021028994A1 (en)
WO (1) WO2021028994A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0230528B2 (en) * 1982-09-30 1990-07-06 Mutoh Ind Ltd
JPH10304473A (en) * 1997-04-30 1998-11-13 Kenwood Corp Remote controller
JP2002224979A (en) * 2001-01-30 2002-08-13 Nec Corp Remote control method for personal robot
JP2011040938A (en) * 2009-08-10 2011-02-24 Revsonic Kk Apparatus interlocking control device and electronic apparatus
WO2014030377A1 (en) * 2012-08-21 2014-02-27 日本電気通信システム株式会社 Wireless device, apparatus to be controlled which is controlled thereby, control system comprising wireless device and apparatus to be controlled, and program for executing by computer control of apparatus to be controlled in wireless device
JP2018142776A (en) * 2017-02-27 2018-09-13 シャープ株式会社 Network system, information processing method, server, and terminal
JP2018152757A (en) * 2017-03-14 2018-09-27 シャープ株式会社 Network system, information processing method, server, and terminal


Also Published As

Publication number Publication date
JPWO2021028994A1 (en) 2021-02-18

Similar Documents

Publication Publication Date Title
US11516040B2 (en) Electronic device and method for controlling thereof
US11457788B2 (en) Method and apparatus for executing cleaning operation
US8972054B2 (en) Robot apparatus, information providing method carried out by the robot apparatus and computer storage media
US11862010B2 (en) Apparatus, system and method for using a universal controlling device for displaying a graphical user element in a display device
JP2010181064A (en) Air conditioner
US20190360717A1 (en) Artificial intelligence device capable of automatically checking ventilation situation and method of operating the same
US20180165951A1 (en) Remote control apparatus capable of remotely controlling multiple devices
KR102331672B1 (en) Artificial intelligence device and method for determining user&#39;s location
JP2007078270A (en) Air conditioning system
KR20230002021A (en) Smart home device and method
US20130124210A1 (en) Information terminal, consumer electronics apparatus, information processing method and information processing program
WO2021028994A1 (en) Device controller, device control method, and device control program
JP6489793B2 (en) Electronic device, control system, control method, and control program
EP2852175A1 (en) Device control method
JP2005250233A (en) Robot device
CN109960152B (en) Household appliance system and household appliance
JP6890451B2 (en) Remote control system, remote control method and program
JPWO2019239738A1 (en) Information processing device, information processing method
US20210188320A1 (en) Method for estimating location of object, and apparatus therefor
KR20170002048A (en) Apparatus and Method for controlling object moving
US20220122604A1 (en) Information equipment, information processing method, information processing program, control device, control method, and control program
CN112887766B (en) Electronic device and control method thereof
KR102023161B1 (en) Method and apparatus for providing appropriate information for location and space of user using moving device
WO2021065558A1 (en) Information processing device, information processing method, electrical appliance, and electrical appliance processing method
KR100914069B1 (en) General-Purpose Apparatus For Interactive Communication Between Various Objects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19941102

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021539727

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19941102

Country of ref document: EP

Kind code of ref document: A1