WO2019198188A1 - Mobile assist device and mobile assist method (Dispositif et procédé d'assistance mobile) - Google Patents

Mobile assist device and mobile assist method (Dispositif et procédé d'assistance mobile)

Info

Publication number
WO2019198188A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
mobile
event
hgw
illuminance
Prior art date
Application number
PCT/JP2018/015308
Other languages
English (en)
Japanese (ja)
Inventor
弘之 野本
駿 平尾
秀人 井澤
玲子 嘉和知
邦朗 本沢
Original Assignee
東芝映像ソリューション株式会社 (Toshiba Visual Solutions Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 東芝映像ソリューション株式会社 (Toshiba Visual Solutions Corporation)
Priority to CN201880077501.6A (patent CN111418269B)
Priority to PCT/JP2018/015308 (publication WO2019198188A1)
Publication of WO2019198188A1


Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21VFUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V23/00Arrangement of electric circuit elements in or on lighting devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/04Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • This embodiment relates to a mobile assist device and a mobile assist method.
  • In such a monitoring device, a camera, a microphone, or the like is used, and video data and/or audio data are acquired and analyzed. Various determinations, such as the intrusion of a suspicious person, are made according to the analysis result.
  • An object of the present embodiment is to provide a mobile assist device that can comprehensively evaluate data from various sensors, determine what should be controlled next, and improve the monitoring effect.
  • Still another object of the present invention is to provide a mobile assist device and a mobile assist method that can realize cooperation of a plurality of monitoring sensors and further improve the assist function.
  • According to one embodiment, there is provided an assist device comprising a moving unit, a sensor mounted on the moving unit, and a control unit that determines the brightness of the surrounding space from the output of the sensor and outputs a control signal for controlling the operation of another device based on the determination result.
  • FIG. 1 is a diagram showing an outline of a mobile assist device.
  • FIG. 2 is a diagram illustrating the relationship between the internal configuration of the HGW 600 and the network.
  • FIG. 3A is a diagram illustrating a case where the mobile HGW 600 enters the bright room 800a.
  • FIG. 3B is a diagram illustrating a case where the mobile HGW 600 enters the dark room 800b.
  • FIG. 4 is a flowchart illustrating an example of a control operation of the HGW 600.
  • FIG. 5 is a diagram illustrating an example of the relationship between the HGW 600 and the smartphone.
  • FIG. 6 is an explanatory diagram showing a schematic interrelationship between a sensor group and a controlled device in the home and the HGW 600.
  • FIG. 7 is a flowchart showing an example of system operation when the full security mode is set in the system to which the present embodiment is applied.
  • FIG. 8 is a flowchart showing another system operation example when the full security mode is set in the system to which the present embodiment is applied.
  • FIG. 9 is a flowchart showing an example of the system operation when the operation of the air conditioner is monitored in the system to which the present embodiment is applied.
  • FIG. 10 is a reference diagram for explaining the operation when the sensor and the controlled device are inspected in the system to which the present embodiment is applied.
  • FIG. 11 is a diagram showing an example of the overall configuration of another network system to which this embodiment is applied.
  • FIG. 12 is an explanatory diagram showing an example in which a stream of event-related data and monitoring data is recorded in the timeline in the embodiment shown in FIG.
  • FIG. 13 is a block diagram showing the essential parts in the embodiment of FIG.
  • FIG. 14 is a diagram showing an example of a smartphone menu as a user interface capable of accessing event-related data and / or monitoring data.
  • FIG. 15A is an explanatory diagram illustrating a procedure for accessing monitoring data with a smartphone.
  • FIG. 15B is an explanatory diagram showing another procedure for accessing monitoring data with a smartphone.
  • FIG. 16A is a diagram illustrating an example of an operation screen displayed on the smartphone.
  • FIG. 16B is a diagram illustrating another example of the operation screen displayed on the smartphone.
  • FIG. 17 is a diagram illustrating an example of an image when monitoring data (video data) related to a certain event is reproduced.
  • FIG. 18A is a diagram for explaining an example of a relationship between a smart phone and event-related data displayed on the smart phone and an operation method.
  • FIG. 18B is a diagram for explaining an example of another relationship between the smart phone and the event-related data displayed on the smart phone and another operation method.
  • FIG. 19 is a diagram for explaining an example of still another relationship between the smart phone and the event-related data displayed on the smart phone and another operation method.
  • FIG. 20 is a hierarchy diagram for explaining an example of the relationship between the event-related data and the recording position of the monitoring data.
  • Reference numeral 600 denotes a home gateway (described as HGW), which can be connected to a network as will be described later.
  • HGW 600 is a moving type integrated with a moving device (also referred to as a carrying device) 618.
  • Alternatively, the HGW 600 may be a stationary type (also referred to as a fixed type); the stationary type and the movable type may be collectively referred to as an assist device 650.
  • the HGW 600 (assist device 650) includes at least a camera 611, a microphone 613, and a speaker 615.
  • the camera 611, the microphone 613, and the like may be referred to as sensors in a large sense. A plurality of cameras may be provided.
  • the HGW 600 can control the moving device 618 based on the pickup data from the sensor and the internal control application. Based on this self-control, location change, object tracking, and the like can be performed.
  • FIG. 2 shows the relationship between the internal configuration of the HGW 600 and the network.
  • the server 1000 can be connected to the HGW 600 via the Internet 300.
  • The HGW 600 includes a memory 601, a control unit (which may be referred to as a system controller) 602, a device manager 603, a network interface (hereinafter, network I/F) 605, a sensor control table 609, a camera 611, a microphone 613, a speaker 615, and the like.
  • the HGW 600 can support various communication methods via a network function (605) that is a communication function (may be referred to as a communication device).
  • The communication method of a sensor may differ depending on the manufacturer. For example, there are sensors that employ IEEE 802.15.4 as a communication method, sensors that employ IEEE 802.15.1, and sensors that employ IEEE 802.15.3a. Furthermore, some sensors employ IEEE 802.11b, IEEE 802.11a, or IEEE 802.11g.
  • the HGW 600 of the present embodiment can be equipped with an interface that can support each method as a network interface.
  • The HGW 600 includes a drive unit 618a that drives and controls the aforementioned moving device 618. Furthermore, the HGW 600 includes a mapping unit 619 that can store movement positions and create a map.
  • The HGW 600 includes an illuminance sensor 622 that detects ambient brightness and a human sensor 623 that can detect whether a person is present nearby. Note that when the ambient illuminance is sufficient, the camera 611 may also serve as the human sensor 623.
  • the memory 601 (may be referred to as a control data management unit) includes an application manager (hereinafter APP-Mg), an event manager (hereinafter EVT-Mg), and a configuration manager (hereinafter CONFIG-Mg).
  • APP-Mg manages a plurality of applications for controlling various operations of the HGW 600.
  • the EVT-Mg manages an event application for controlling various operations resulting from the occurrence of various events.
  • CONFIG-Mg recognizes functions in the HGW 600 and various functions related to the HGW 600, and manages configuration applications that set, for example, the operation order and operation restrictions.
  • the system controller 602 can control each block in the HGW 600 and perform sequence control. Further, the HGW operation (determination, data processing, analysis operation, communication) and the like described later with reference to a flowchart are executed based on the system controller 602 and the applications stored in the memory 601.
  • EVT-Mg can control the camera 611, the microphone 613, the speaker 615, a recording manager (not shown), and the like. Further, EVT-Mg can evaluate the detection data taken in from external sensors via the network I/F 605 and/or the data from the camera 611 and the microphone 613, and control the next action and behavior. CONFIG-Mg can perform settings such as the initial setting, function restriction, function expansion, priority order, and operation time of each function block in the HGW 600.
  • the device manager 603 can authenticate other devices that operate in association with the HGW 600 and register them in the memory 601. Therefore, the device manager 603 can manage a plurality of other sensors, the lighting fixture 120, and so on connected via the network I / F 605.
  • the device manager 603 also registers identification data of the server 1000 connected via the Internet 300, and can recognize the server 1000. Furthermore, the device manager 603 also registers identification data such as a smart phone connected via the Internet 300, and can recognize the smart phone.
  • the sensor and instrument control table 609 stores names of various sensors and instruments, positional information of the various sensors and instruments, and data for controlling and / or restricting the controls of the sensors and instruments.
  • the name and position information of each sensor can be displayed on a smart phone or the like, so that the user can check the type and mounting position of the sensor.
  • The network I/F 605 is connected, for example, to other sensors in the home or to control targets (the lighting fixture 120, ...) via short-range wireless communication.
  • The lighting fixture 120 is shown as a representative control target.
  • the lighting device 120 includes an I / F unit 122 connected to a network, a light control unit 123, and a light emitting unit 124.
  • The dimming unit 123 can control the current of the light emitting unit 124 based on a command given from the outside via the I/F unit 122; by this control, the lighting can be made brighter or darker.
  • various sensors exist as information acquisition sensors, control sensors, and controlled sensors.
  • mapping unit 619 that can store the movement position and create a map will be further described.
  • The mapping unit 619 can use, for example, images from the mounted camera 611.
  • As a mapping function of the mapping unit 619, there is a SLAM (Simultaneous Localization and Mapping) function, which performs self-position estimation and environment map generation at the same time.
  • Whereas a conventional mobile object travels randomly on the floor, the SLAM function operates while creating a map of the target area and constructing an operation (movement) path according to the map.
  • the SLAM function operates by referring to imaging data from the camera, internally generates a surrounding environment map, and can output current position information.
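  • The patent does not give an implementation of this SLAM function, but its role can be pictured as a small interface that consumes camera frames (plus odometry) and yields a pose estimate and an incrementally built map. The class, method names, and the trivial pose update below are hypothetical illustrations only; real SLAM (feature matching, loop closure) is not shown.

      from dataclasses import dataclass, field

      @dataclass
      class Pose:
          x: float = 0.0
          y: float = 0.0
          heading_deg: float = 0.0

      @dataclass
      class MappingUnit:
          """Hypothetical stand-in for the mapping unit 619 with its SLAM function."""
          pose: Pose = field(default_factory=Pose)
          occupancy: dict = field(default_factory=dict)   # (x, y) grid cell -> "free" / "obstacle"

          def update(self, frame, odometry_delta):
              """Fuse a camera frame with an odometry delta: estimate the new pose and
              extend the environment map.  The frame is unused in this stub."""
              dx, dy, dtheta = odometry_delta
              self.pose = Pose(self.pose.x + dx, self.pose.y + dy,
                               (self.pose.heading_deg + dtheta) % 360)
              self.occupancy[(round(self.pose.x), round(self.pose.y))] = "free"
              return self.pose

      # The HGW would feed frames while it moves and keep track of where it is.
      mapper = MappingUnit()
      for _ in range(3):
          print(mapper.update(frame=None, odometry_delta=(1.0, 0.0, 0.0)))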
  • <Lighting control and imaging> FIGS. 3A and 3B show a case where the mobile HGW 600 has entered the bright room 800a and a case where it has entered the dark room 800b.
  • The HGW 600 checks the measurement value of the illuminance sensor 622 when entering the room 800a in order to photograph the room, for example for a periodic inspection. Since the room 800a is bright, photographing is performed. The room 800a may be bright because the lighting fixture 120 is lit with sufficient brightness, or because external light from the window 802 makes the room sufficiently bright.
  • When entering the dark room 800b, the HGW 600 controls the lighting fixture 120, brightens the room by lighting or dimming control, and then performs shooting.
  • The above is a simple example of the lighting control by the HGW 600; the HGW 600 can perform more complicated control depending on the environment.
  • control operation of the HGW 600 is performed in an integrated manner by the system controller 602 in the HGW 600.
  • the HGW 600 starts the operation based on, for example, an operation setting by a timer set by the user, a direct operation by the user, or some detection information from another sensor.
  • The direct operation by the user may be, for example, an operation command sent from the smartphone via the Internet.
  • the moving body of the HGW 600 may start moving based on a command from a stationary assist device described later (this is a cooperative operation of the assist device).
  • the HGW 600 grasps its own movement position based on the combination of the home map data and other sensors (camera, GPS, etc.).
  • First, a list of areas to be visited is acquired (step SB1).
  • the system controller 602 refers to a list of areas to be operated and determines whether or not an unreached area remains (step SB2). If no unreached area remains, the operation ends. If an unreached area remains, for example, an unreached area with a high priority is determined.
  • it is determined whether the area determined to be an unreached area is an area that the HGW 600 can reach (step SB3).
  • If it is determined in step SB3 that the area is reachable, the process proceeds to step SB4; if it is determined that the area is unreachable, the process proceeds to step SB21.
  • Examples of cases in which an area is determined to be unreachable include a room whose door is locked with a sensor-equipped lock, or an obstacle that cannot be avoided, as confirmed by checking with the camera.
  • After reaching the reachable area (step SB4), the HGW 600 measures the ambient illuminance (step SB5). After measuring the illuminance, it is determined whether the surrounding space is bright enough for the SLAM function to operate normally (step SB6).
  • If it is bright enough, a predetermined operation is executed (step SB7).
  • the predetermined operation is an operation according to the purpose for which the HGW 600 has moved to this position.
  • For example, when the HGW 600 has moved to measure a room temperature, it acquires temperature information from the room temperature sensor. When it has moved to photograph a flower displayed in the room, it photographs the flower using the camera 611. When it has moved to monitor the open/closed state of a window, it acquires open/close information from the window sensor or photographs the window using the camera. If such an operation is not completed, the process returns to step SB5. In addition, when the HGW 600 determines that a flower is short of water, it can turn on an automatic water feeder and supply a preset amount of water to the flower. The HGW 600 can also control the illumination state when shooting flowers.
  • When the operation is completed, the HGW 600 determines whether the illumination was controlled before the operation (step SB9). If so, the HGW 600 performs control to return the illumination to its state before the control (step SB10), and the process returns to step SB2.
  • If it is determined in step SB3 that an unreached area cannot be reached, the area is excluded from the operation (monitoring) target areas (step SB21), and the process returns to step SB2.
  • If the HGW 600 determines in step SB6 that the surrounding space is not bright enough for the SLAM function to operate normally, the process proceeds to step SB22, where it is determined whether the lighting fixture in the area is a network control target.
  • In step SB23, it is determined whether control of that lighting fixture is prohibited.
  • For these determinations, data in the device manager 603 and/or the sensor control table 609 is used.
  • If the lighting fixture is not a network control target, or if its control is prohibited, the area is excluded from the movement target areas in step SB21.
  • If the human sensor 623 detects a person (step SB24), the area is basically excluded from the operation target areas, except for specific operation purposes (step SB21).
  • A specific operation purpose is one for which the area is to be monitored even when a person is present in the room. For example, even when a person is present in a hospital bed, the open/closed state of windows and curtains may need to be monitored. Such a room can therefore be registered in advance as an exceptional room (area), that is, registered in the mapping unit 619 as a check-essential area.
  • If the human sensor does not detect a person in step SB24, it is determined whether the upper limit of the number of dimming trials has been reached (step SB25). This check refers to the number of lighting-control attempts made so far in the current area, so that lighting control is not performed more than the prescribed number of times.
  • If the upper limit has not been reached, the initial state of the illumination is recorded (stored) (step SB26), and illumination control is started (step SB27). If the upper limit has been exceeded, the process proceeds to step SB10.
  • In the lighting control, the brightness of all the lights and their color can be adjusted.
  • After the lighting control, the process returns to step SB5.
  • The operation is terminated when no unreached area remains. Then, for example, the HGW 600 automatically returns to its charging stand and starts charging.
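  • Read as pseudocode, the flow of steps SB1 to SB27 could be sketched roughly as follows. This is only an illustrative sketch: the hgw object, its methods (get_area_list, measure_illuminance, adjust_light, and so on), and the numeric thresholds are hypothetical placeholders for the sensors and actuators described above, not part of the patent.

      SLAM_MIN_LUX = 50          # assumed brightness threshold for normal SLAM operation
      MAX_DIMMING_TRIALS = 3     # assumed upper limit of lighting-control attempts per area

      def patrol(hgw):
          areas = hgw.get_area_list()                          # SB1: areas to visit
          while True:
              pending = [a for a in areas if not a.visited]    # SB2
              if not pending:
                  break
              area = max(pending, key=lambda a: a.priority)
              if not hgw.can_reach(area):                      # SB3
                  area.visited = True                          # SB21: drop unreachable area
                  continue
              hgw.move_to(area)                                # SB4
              light, saved_state, touched = None, None, False
              while True:
                  lux = hgw.measure_illuminance()              # SB5
                  if lux >= SLAM_MIN_LUX:                      # SB6
                      hgw.do_task(area)                        # SB7: photograph, read sensors, ...
                      if hgw.task_done(area):                  # SB8
                          break
                      continue
                  light = hgw.find_controllable_light(area)    # SB22
                  if light is None or hgw.control_prohibited(light):    # SB23
                      break
                  if hgw.person_detected(area):                # SB24: do not disturb occupants
                      break
                  if area.dimming_trials >= MAX_DIMMING_TRIALS:         # SB25
                      break
                  if saved_state is None:
                      saved_state = hgw.read_light_state(light)         # SB26
                  hgw.adjust_light(light)                      # SB27, then re-measure at SB5
                  area.dimming_trials += 1
                  touched = True
              if touched and saved_state is not None:          # SB9
                  hgw.restore_light_state(light, saved_state)  # SB10: restore original lighting
              area.visited = True                              # back to SB2
          hgw.return_to_charger()                              # all areas done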
  • The lighting control may also be linked with photographing by the camera 611.
  • Dimming is not performed when the human sensor detects a person.
  • In that case, the microphone 613 and/or the speaker 615 may be turned on instead.
  • FIG. 5 shows an example in which the lighting control operation of the HGW 600 is set from the external smart phone GUI-1.
  • The HGW application on the smart phone GUI-1 is started, the HGW 600 is accessed, and a list of lighting fixtures can be displayed. A control prohibition time slot can then be set for each lighting fixture.
  • control is prohibited from 19:00 to 21:00 for the lighting G (for example, a front lighting fixture).
  • Various methods for inputting a numerical value are possible, such as a numerical selection method or a method of selecting and inputting a numeric keypad displayed on the screen.
  • For another lighting fixture, control is prohibited from 17:00 to 23:00.
  • the above shows an example of setting the prohibition of lighting control by the HGW 600 using the smart phone GUI-1.
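  • As a rough sketch, such prohibition settings could be held by the HGW 600 as a small per-fixture table; the fixture names and the data layout below are only an illustration, not a format defined in the patent.

      from datetime import time

      # Hypothetical prohibition table: fixture name -> (start, end) of its no-control window.
      PROHIBITED = {
          "lighting G": (time(19, 0), time(21, 0)),   # e.g. the front lighting fixture of FIG. 5
          "lighting R": (time(17, 0), time(23, 0)),   # a second, illustrative fixture
      }

      def control_allowed(fixture: str, now: time) -> bool:
          """Return False while the fixture is inside its prohibition window."""
          slot = PROHIBITED.get(fixture)
          if slot is None:
              return True
          start, end = slot
          return not (start <= now <= end)

      print(control_allowed("lighting G", time(20, 30)))  # False: inside 19:00-21:00
      print(control_allowed("lighting G", time(22, 0)))   # True: outside the window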
  • other commands can be sent from the smart phone GUI-1 to the HGW 600.
  • the HGW 600 can also transmit, for example, a video image of the destination area to the smart phone GUI-1 according to a command from the smart phone GUI-1. For example, it is possible to send a list indicating what kind of control is performed during the day to the smart phone GUI-1.
  • For example, the assist system can be set to a full security mode when the resident goes out for a while (for example, one to two weeks) and nobody is expected to come home during that period.
  • the full security mode can be set to ON by selecting the full security mode from the menu screen of the smart phone GUI-1 and operating the button AraS101.
  • (A1) An assist device comprising a moving body, a sensor mounted on the moving body, and a control unit that determines the brightness of the surrounding space from the output of the sensor and outputs a control signal for controlling the operation of another instrument based on the determination result.
  • (A2) The device of (A1), further comprising a mapping unit storing a plurality of areas, wherein the control unit determines a destination area among the plurality of areas and measures the illuminance in the determined destination area.
  • (A3) The device of (A1), further comprising a mapping unit storing a plurality of areas, wherein the control unit determines a destination area among the plurality of areas, measures the illuminance in the determined destination area, and, if the measured illuminance value is less than a predetermined value, adjusts the illuminance of a lighting fixture in the destination area.
  • A camera is further provided, and when the illuminance of the lighting fixture in the determined destination area is greatly adjusted, shooting is performed with the camera.
  • The control unit includes a human sensor and stops the adjustment of the illuminance of the lighting fixture when the human sensor detects a person.
  • The control unit includes a human sensor, and when the human sensor detects a person, the adjustment of the illuminance of the lighting fixture is stopped and a microphone and/or a speaker is activated.
  • A camera and a mapping unit storing a plurality of areas are further provided. When the measured illuminance value is less than a predetermined value, the control unit temporarily stores the measured value; when the illuminance of the lighting fixture in the determined destination area is greatly adjusted, shooting is performed with the camera; and after the photographing by the camera is completed, the lighting fixture is adjusted so that its illuminance returns to the temporarily stored measurement value.
  • The control unit also stores the color of the illumination before adjusting the illuminance of the lighting fixture, and returns the illumination color of the lighting fixture to its state before the adjustment after the photographing by the camera is completed.
  • The control unit does not control the lighting fixture during a time zone in which illuminance control of that lighting fixture is prohibited.
  • The control unit may control the lighting device to perform a flash operation.
  • When the moving body moves to a destination based on a command from a stationary assist device, the control unit outputs a control signal for controlling the illumination.
  • the above-described mobile assist device 650 can start an operation based on some detection information from the first sensor. In this case, cooperation with the second and third sensors may be further performed.
  • FIG. 6 is an explanatory diagram showing a schematic interrelationship between the sensor group in a home, the controlled devices (including lighting fixtures, air conditioners, refrigerators, television devices, irons, automatic doors, fire extinguishing equipment, etc.), and the HGW 600.
  • For example, when the air conditioner is controlled to lower the room temperature, the temperature may not drop easily. In such a case the air conditioner may not be the cause; the cold air may be leaking outside because a window is open. Alternatively, although the air conditioner was controlled to lower the room temperature, the room temperature sensor may be broken, so that the temperature is not adjusted accurately.
  • the embodiment described below can provide a system capable of coordinating with other sensors and arranging a favorable indoor environment when the above-described problems occur.
  • the HGW 600 can communicate with a sensor group and a controlled device in the home via wired or wireless.
  • As the communication method, Bluetooth (registered trademark), ZigBee (registered trademark), Z-Wave (registered trademark), Wi-Fi (registered trademark), or the like is adopted.
  • An IoT element group 2000 indicates various sensor groups and controlled device groups. These may be referred to as a home network terminal group or an Internet of Things (IoT) element group. Hereinafter, each sensor and each controlled device included in the IoT element group 2000 will be described.
  • Sensor 2100 is an example of a sensor that detects an event.
  • a switch 2102 is provided on the substrate 2101.
  • One end of a movable piece 2103 is attached to one end of the substrate 2101 via a hinge.
  • When the door or window to which the sensor is attached is opened, the movable piece 2103 moves away from the substrate 2101 and turns on the switch 2102.
  • Then, power from the power source is supplied to the power supply circuit configured on the substrate 2101, the radio wave transmitter on the substrate 2101 is activated, and a radio wave including a predetermined sensor ID is output.
  • Thus, when the switch 2102 is turned on (that is, when the door or window is opened), this radio wave is received by the HGW 600, and the HGW 600 can recognize that the door or window is open.
  • Conversely, when the radio wave stops, the HGW 600 can recognize that the door or window is closed.
  • Sensor 2110 is an example of a sensor that detects other events.
  • a photoelectric converter (photoelectric conversion panel) 2112 is attached to the substrate 2111.
  • the output of the photoelectric converter 2112 drives the radio wave transmitter 2113.
  • the photoelectric converter 2112 is configured to discharge immediately and lose power when not irradiated with light. Therefore, for example, when the curtain is opened or when illumination is irradiated, a radio wave including the sensor ID is output from the radio wave transmitter 2113. Conversely, when the curtain is closed or the illumination is turned off, the radio wave transmitter 2113 stops and the radio wave output stops. Therefore, the sensor 2110 can be used as a sensor for detecting whether the curtain is opened or closed or lighting is turned on or off.
  • a color filter may be provided on the light receiving surface of the photoelectric converter 2112 so that it does not react to unnecessary light.
  • a second sensor similar to the sensor 2110 may be added to detect opening / closing of the curtain.
  • The second sensor is arranged so that, when the curtain is opened, it is shielded from the light and its switch is turned off, and when the curtain is closed, it receives the light and its switch is turned on, so that a radio wave containing its sensor ID is output. In this way, if one of the two sensors fails, the HGW 600 can easily detect the abnormality, and the reliability of the curtain open/close detection function can be enhanced.
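  • With two such complementary sensors, exactly one of them should be transmitting at any time, so a mismatch points to a failure. A minimal consistency check might look like the following sketch (the argument names are illustrative).

      def curtain_state(open_side_on: bool, closed_side_on: bool) -> str:
          """Complementary curtain sensors: the first transmits when the curtain is open,
          the second (added) sensor transmits when the curtain is closed."""
          if open_side_on and not closed_side_on:
              return "open"
          if closed_side_on and not open_side_on:
              return "closed"
          return "sensor fault suspected"   # both on or both off: one sensor has likely failed

      print(curtain_state(True, False))    # open
      print(curtain_state(False, False))   # sensor fault suspected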
  • the sensor 101 is an advanced sensor composed of an integrated circuit.
  • a memory 112 and a network I / F 115 are included. Further, functions 116 and 117 as sensing elements are included.
  • the types of sensors are not limited to such types, and various types can be used.
  • the memory 112 includes an application manager (APP-Mg), an event manager (EVT-Mg), and a configuration manager (CONFIG-Mg).
  • CONFIG-Mg manages various applications for controlling the entire operation of the sensor system.
  • the EVT-Mg manages an event application for executing the next operation of the sensor 101 based on the detection data from the functions 116 and 117.
  • the functions 116 and 117 have various elements depending on the sensing purpose. Examples of various elements include a camera and a microphone. Furthermore, various elements include a thermal sensor, a temperature sensor, a humidity sensor, an illumination sensor, a pressure sensor, a switch, and the like.
  • the sensor 101 may include one or more sensing elements for the purpose of use.
  • The sensors 101, 102, 103, ... can be used, for example, as a sensor that detects the opening/closing of a door, a sensor that detects a loud sound, a sensor that detects the movement of a person, a sensor that detects the opening/closing of a window, or a sensor for photographing, and are disposed at various locations in the home.
  • the mobile HGW 600 has been described. However, a stationary HGW 600a may be additionally provided. In this case, the stationary HGW 600a is set as, for example, a slave HGW 600a. Since this HGW 600a also has the same configuration and function as the HGW 600 described with reference to FIG. 2 except for the moving body, detailed description thereof will be omitted.
  • Reference numeral 2121 denotes a fixed camera provided, for example, at a parking lot, a front door, or a gate, and functions as a sensor.
  • The lighting fixtures 2131, 2132, and 2133, the fire extinguishing equipment 2126, and the like are controlled devices in the rooms of the home.
  • A temperature sensor 2122 installed in the kitchen or at an indoor temperature measurement location, a pressure sensor 2133 attached to the edge of a window glass or a door, a fire alarm sensor 2125, and a microphone (not shown) belong to the sensor group.
  • The mobile HGW 600 and the stationary HGW 600a can be used efficiently by combining the characteristics of the sensor group and the controlled device group described above, and can exhibit capabilities beyond those of any single sensor or controlled device.
  • FIG. 7 is a flowchart showing the system operation in the full security mode.
  • For example, the resident may go out for a week or two.
  • the system of the embodiment can set the full security mode.
  • When the full security mode is set (step SC1), a suspicious moving object may be detected by the camera of the stationary HGW 600a (step SC2).
  • the camera starts up and starts imaging periodically or based on detection signals from sounds, human sensors, and the like.
  • the captured video data is processed by the motion detection circuit, so that the HGW 600a can detect a suspicious moving body.
  • the HGW 600a images the subject as an event.
  • the imaging data is recorded on a recording medium (for example, a USB connection recording / reproducing apparatus may be used) connected to a home network. Alternatively, it may be recorded on a recording medium in the server via a network.
  • When the HGW 600a detects a suspicious moving body, it notifies the mobile HGW 600 of this (step SC3).
  • the HGW 600a continues to image the suspicious moving body, but the suspicious moving body may go out of the field of view (step SC4).
  • a suspicious moving body may move to another room or entrance.
  • the HGW 600a notifies the mobile HGW 600 that the suspicious moving body has moved to another room or entrance (step SC5).
  • the HGW 600a notifies the mobile HGW 600 of which room the suspicious moving body has moved to based on pre-registered map information.
  • the mobile HGW 600 can move to the room or entrance where the suspicious moving body has moved, photograph the suspicious moving body, and transmit video data for recording on the recording medium.
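  • The handoff in steps SC3 to SC6 amounts to the stationary HGW 600a sending the mobile HGW 600 a short notification containing the last known location of the suspicious moving body. A possible message shape and handler are sketched below; the class, field, and method names are assumptions for illustration only.

      from dataclasses import dataclass

      @dataclass
      class IntruderNotice:
          """Hypothetical notification from the stationary HGW 600a to the mobile HGW 600."""
          event: str        # e.g. "suspicious_motion_detected" or "target_left_field_of_view"
          room: str         # room name taken from the pre-registered map information
          timestamp: float  # time of the last sighting

      def handle_notice(notice: IntruderNotice, mobile_hgw):
          # SC6: move to the reported room, photograph the target and record the video.
          mobile_hgw.move_to(notice.room)
          video = mobile_hgw.capture_video()
          mobile_hgw.record(video, tag=notice.event, at=notice.timestamp)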
  • FIG. 8 is a flowchart showing another system operation in the full security mode.
  • For example, the resident may go out for a week or two.
  • the system of the embodiment can set the full security mode.
  • the mobile HGW 600 detects a suspicious sound with a microphone (steps SD1 and SD2).
  • the mobile HGW 600 may not be able to capture the suspicious object that causes the suspicious sound even if the surroundings are captured by the camera.
  • the mobile HGW 600 acquires data from sensors (window sensor, door sensor, pressure sensor, temperature sensor, etc.) installed in the home such as each room, and performs data analysis (step SD3).
  • The mobile HGW 600 determines, by data analysis, whether the suspicious sound is credible, that is, whether it is a sound that has not been customary until now (an abnormal sound). Sound learning results detected in the past are also used for this determination. For example, in an area where a train or car regularly passes nearby and produces a vibration sound, the mobile HGW 600 does not determine a similar detection to be a suspicious sound. Examples of abnormal sounds include the sound of a window glass being struck or broken, a collision sound, and a squeaking sound.
  • step SD5 If the credibility of the suspicious sound is high, the direction and location where the suspicious sound is generated is estimated (step SD5). Then, the mobile HGW 600 moves to the area where the suspicious sound is generated based on the map information, directs the camera in the direction of the sound, and performs shooting as an event (step SD6).
  • the mobile HGW 600 can take a picture of a moving path with a camera even during movement, and if there is an obstacle, it can move around the obstacle.
  • the mobile HGW 600 registers a captured image without an obstacle in the mapping unit 619 in advance. For this reason, if there is an obstacle on the moving route, it can be immediately determined by the image comparison that the obstacle exists.
  • the HGW 600 can turn on nearby illumination when the surrounding space is dark as described above. At this time, the illuminance data of the illumination is also acquired, and if the illuminance is insufficient, the administrator can be notified as a warning.
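  • Put together, steps SD1 to SD6 combine a learned whitelist of everyday sounds with corroborating sensor data before the mobile HGW 600 commits to moving. One way to express that logic is sketched below; every helper call on the hgw object and the sound labels are assumptions, not an API defined by the patent.

      KNOWN_ROUTINE_SOUNDS = {"train_vibration", "car_passing"}   # learned, habitual sound classes

      def on_suspicious_sound(hgw, sound):
          label = hgw.classify_sound(sound)                # compare against past learning results
          if label in KNOWN_ROUTINE_SOUNDS:
              return                                       # habitual sound: do nothing
          readings = hgw.poll_sensors(["window", "door", "pressure", "temperature"])   # SD3
          if not hgw.is_credible(label, readings):         # credibility check by data analysis
              return
          direction, area = hgw.estimate_sound_source(sound)    # SD5: direction and location
          hgw.move_to(area)                                # SD6: navigate using the map information
          if hgw.is_dark(area):
              hgw.turn_on_nearby_light(area)               # and warn the administrator if still too dark
          hgw.point_camera(direction)
          hgw.record_event_video()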
  • FIG. 9 shows an embodiment in which the mobile HGW 600 determines whether the air conditioner is functioning normally after the controlled device (for example, an air conditioner) is turned on.
  • the mobile HGW 600 turns on the cooling air conditioner in the room A1 by, for example, remote operation or a user's voice command (steps SE1 and SE2).
  • When the air conditioner starts the cooling operation and a certain time has passed (steps SE3 and SE4), it is checked whether the temperature of the room A1 has dropped to the vicinity of the set value (step SE5).
  • the mobile HGW 600 stores the temperature of the room A1 when the cooling air conditioner is started, and can compare the temperature with the temperature of the room A1 after a predetermined time has elapsed.
  • If the temperature of the room A1 is within the desired temperature range, the process ends (step SE6).
  • However, when the temperature of the room A1 does not fall within the predetermined range, data is collected from various sensors installed in the room A1, for example the window sensor, the door sensor, and sensors associated with heat sources (gas stove, heater, etc.).
  • the window is open, the door is open, or the heat source is turned on, there is a high probability that the temperature of the room A1 does not decrease. Therefore, the mobile HGW 600 moves to the site, performs shooting, and notifies the administrator (step SE15).
  • If, after analyzing the data from the sensors, the cause is still unknown, the mobile HGW 600 likewise moves to the site, takes a picture, and notifies the administrator (step SE14).
  • Even if the cooling air conditioner is replaced with a heating air conditioner, normal operation or an abnormal state of the heating air conditioner can be determined by the same process.
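  • Steps SE1 to SE15 are essentially a delayed comparison of the room temperature against the set value, followed by a search for an explanation among the room's other sensors. A compact sketch follows; the settling time, the tolerance, and all helper calls are assumed for illustration.

      import time

      TOLERANCE_C = 1.5        # assumed: how close to the set point counts as "reached"
      SETTLE_TIME_S = 20 * 60  # assumed settling time before the check (SE3/SE4)

      def check_air_conditioner(hgw, room, setpoint_c):
          start_temp = hgw.read_temperature(room)           # remember the starting temperature
          hgw.turn_on_air_conditioner(room, setpoint_c)     # SE1/SE2
          time.sleep(SETTLE_TIME_S)                         # wait a certain time
          now_temp = hgw.read_temperature(room)
          if abs(now_temp - setpoint_c) <= TOLERANCE_C:     # SE5
              return "ok"                                   # SE6
          # Temperature did not settle: look for a cause among the room's sensors.
          causes = []
          if hgw.window_open(room):
              causes.append("window open")
          if hgw.door_open(room):
              causes.append("door open")
          if hgw.heat_source_on(room):
              causes.append("heat source on")               # gas stove, heater, ...
          hgw.move_to(room)                                 # SE14/SE15: go to the site
          photo = hgw.take_photo()
          hgw.notify_administrator(causes or ["cause unknown"], photo, start_temp, now_temp)
          return "anomaly reported"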
  • FIG. 10 is an explanatory diagram for explaining still another embodiment of the mobile HGW 600 described above.
  • the mobile HGW 600 has a function of checking normal and abnormal states of various sensors and various controlled devices installed in the home periodically or according to a user command.
  • For example, the lighting fixture 120 is controlled on and off, and the output of the camera 611 and/or the illuminance sensor is checked. In this way it can be determined whether the lighting fixture 120 is operating normally. It is also possible to check whether the camera 611 and/or the illuminance sensor are operating normally by separately controlling a plurality of lighting fixtures on and off: if the camera 611 and/or the illuminance sensor does not react to each of them, the camera 611 and/or the illuminance sensor may be broken.
  • The output data from this illuminance sensor may be employed.
  • The HGW 600 can also check whether the air conditioner 2127 is operating normally by controlling the air conditioner 2127 on and off and referring to the detection data from the temperature sensor 2122.
  • If the air conditioner is normal, it is possible to determine whether any one of the temperature sensors has failed by acquiring and analyzing the detection outputs from a plurality of temperature sensors.
  • the opening / closing state of the window 2128 can be photographed by the camera 611, and it can be determined from the image data whether the window 2128 is open. At this time, it is also possible to determine whether the window sensor 2100 is normal or not by determining whether the window sensor 2100 is on or off.
  • the HGW 600 of this embodiment can inspect various sensors and various controlled devices.
  • the main functions are summarized below.
  • (1B) The control device is configured to obtain sound data of a sound detected by the microphone, to estimate the direction of the sound and the area where the sound was generated, and to provide means for controlling the moving device to move to that area and causing the camera to image the direction of the sound.
  • (2B) The device of (1B), comprising means for restricting the estimation, the control of the moving device, and the imaging according to the type of the sound.
  • (3B) The device of (1B), comprising learning means relating to sounds, and means for restricting the estimation, the control of the moving device, and the imaging when the sound is a sound stored in the learning means.
  • The moving device is controlled so as to image its movement path while moving, and means are provided for determining the presence or absence of an obstacle from the difference between past imaging data and the imaging data during the current movement, and for controlling the moving device so as to avoid the obstacle.
  • (7B) The device of (1B), comprising means for controlling a first controlled device, means for obtaining the detection output of a first sensor that reacts to a phenomenon caused by the first controlled device, and means for checking the logical consistency between the control state of the first controlled device and the detection output of the first sensor.
  • In (7B), the first controlled device may be a lighting fixture and the first sensor may be an illuminance sensor.
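  • Item (7B) boils down to toggling a controlled device and checking that the paired sensor reacts the way it logically should. A sketch for the lighting fixture / illuminance sensor pair of the last item is shown below; the threshold and the helper calls are assumptions.

      LUX_DELTA_MIN = 30   # assumed minimum illuminance change expected when the light toggles

      def check_light_and_sensor(hgw, fixture) -> str:
          """Cross-check a lighting fixture against the illuminance sensor (logical consistency)."""
          before = hgw.read_illuminance()
          hgw.switch_light(fixture, on=True)
          lit = hgw.read_illuminance()
          hgw.switch_light(fixture, on=False)
          dark = hgw.read_illuminance()
          if lit - before >= LUX_DELTA_MIN and lit - dark >= LUX_DELTA_MIN:
              return "consistent"      # both the fixture and the sensor responded as expected
          # No reaction: either the fixture did not light up or the sensor did not notice it.
          return "fixture or illuminance sensor suspected faulty"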
  • FIG. 11 is a diagram illustrating an overall configuration example of a network system in which the mobile assist device according to the embodiment is used.
  • the server 1000 can be connected to a home gateway (hereinafter referred to as HGW) 600 via the Internet 300.
  • the HGW 600 includes a system controller 602, a device manager 603, a network interface (hereinafter referred to as network I / F) 605, a recording manager 607, a camera 611, a microphone 613, a speaker 615, and the like.
  • the HGW 600 includes a sensor control table 609.
  • the memory (control data management unit) 601 is as described above.
  • the system controller 602 can control each block in the HGW 600 and perform sequence control.
  • the EVT-MG can further control the recording manager 607.
  • The sensor control table 609, in which the sensors 101, 102, 103, and 104 are registered, stores the name of each sensor, the position information of each sensor, and data for controlling each sensor.
  • the name and position information of each sensor can be displayed on the smart phone GUI-1, which allows the user to check the type and mounting position of the sensor.
  • the network I / F 605 is connected to other sensors 101, 102, 103,...
  • a configuration of another sensor 101 is shown as a representative.
  • the sensor 101 also includes a control data management unit 112 and a network I / F 115. Further, functions 116 and 117 as sensing elements are included.
  • the types of sensors are not limited to such types, and various types can be used.
  • the memory (control data management unit) 112 includes an application manager (APP-MG), an event manager (EVT-Mg), and a configuration manager (CONFIG-Mg).
  • CONFIG-Mg manages various applications for controlling the entire operation of the sensor system.
  • the EVT-Mg manages an event application for executing the next operation of the sensor 101 based on the detection data from the functions 116 and 117.
  • the functions 116 and 117 have various elements depending on the sensing purpose. Examples of the various elements include a camera and a microphone as in the case of the HGW 600.
  • various elements include a thermal sensor, a temperature sensor, a humidity sensor, an illumination sensor, a pressure sensor, a switch, and the like.
  • the sensor 101 may include one or more sensing elements for the purpose of use.
  • The sensors 101, 102, 103, ... can be used, for example, as a sensor that detects the opening/closing of a door, a sensor that detects a loud sound, a sensor that detects the movement of a person, a sensor that detects the opening/closing of a window, or a sensor for taking pictures.
  • They are arranged, for example, at various locations in the home.
  • When any of these sensors detects an event, the control data management unit 601 recognizes that an event has occurred. The control data management unit 601 then controls the camera 611 via the recording manager 607. As a result, the camera 611 sends the monitoring data cached from before the event occurrence time (for example, from 10 minutes before) to the storage medium via the recording manager 607 and the control data management unit 601, and continues to send the captured monitoring data for a certain time (for example, 3 minutes, 5 minutes, 10 minutes, 20 minutes, or 30 minutes).
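  • The behaviour described here, keeping a rolling cache so that video from before the event can still be saved, is a pre-event ring buffer. The sketch below illustrates the idea with one-second 'frames'; the buffer lengths and the class name are assumptions, not values fixed by the patent.

      from collections import deque

      PRE_EVENT_SECONDS = 10 * 60   # cache roughly 10 minutes before the event
      POST_EVENT_SECONDS = 5 * 60   # keep recording e.g. 5 more minutes after it

      class PreEventRecorder:
          def __init__(self):
              self.cache = deque(maxlen=PRE_EVENT_SECONDS)   # rolling pre-event cache
              self.recording = []
              self.post_left = 0

          def push_frame(self, frame):
              if self.post_left > 0:
                  self.recording.append(frame)               # still inside the post-event window
                  self.post_left -= 1
              else:
                  self.cache.append(frame)                   # normal operation: only cache

          def on_event(self):
              # Flush the cached pre-event video and keep recording for a while longer.
              self.recording = list(self.cache)
              self.cache.clear()
              self.post_left = POST_EVENT_SECONDS

      rec = PreEventRecorder()
      for t in range(700):
          rec.push_frame(f"frame{t}")
      rec.on_event()
      print(len(rec.recording))   # 600: the cached pre-event frames; post-event frames follow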
  • In this system, event-related data (also referred to as event attribute data) generated when the event is detected is also sent to the storage medium 1010.
  • the event-related data can include, for example, any one or more of an event occurrence time, the type of sensor that detected the event, sensor position data, a recording start time, a recording end time, and the like.
  • the storage medium is, for example, a memory in the server 1000, but is not necessarily a storage medium in the server.
  • the storage location of the monitoring data may be a storage medium in the HGW 600 or a storage medium connected via the network I / F 605.
  • the storage medium 1010 includes a data area 1011 and a management area 1021. Monitoring data 1012 is stored in the data area 1011, and event related data 1022 is stored in the management area 1021.
  • the monitoring data 1012 may include not only video data but also measurement data from a sensor. For example, a change state of temperature at a specific location, a change state of humidity, or a change state of pressure.
  • In the management area 1021, management data for reproducing the monitoring data is described. This management data includes the aforementioned event-related data.
  • the management data includes event-related data and a recording address of monitoring data corresponding to the event-related data. When a plurality of events occur, a plurality of event related data and a plurality of monitoring data corresponding to the plurality of event related data exist.
  • the event related data includes the type of event (may be referred to as sensor output). Monitoring data (for example, monitoring video) is recorded based on the event, and the event related data includes its recording start time, recording end time, and the like.
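  • The split between the data area 1011 (monitoring data) and the management area 1021 (event-related data) can be pictured with the record layout sketched below. The field names echo the items listed above, but the exact encoding is not specified in the patent and is assumed here.

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class EventRelatedData:              # stored in the management area 1021
          event_type: str                  # kind of event / sensor output (e.g. "door1_open")
          sensor_id: str                   # which sensor detected the event
          sensor_position: str
          occurred_at: str                 # event occurrence time
          rec_start: str                   # recording start time
          rec_end: str                     # recording end time
          monitoring_data_address: str     # where the corresponding monitoring data is recorded

      @dataclass
      class MonitoringData:                # stored in the data area 1011
          address: str
          video: bytes = b""
          measurements: List[float] = field(default_factory=list)   # e.g. temperature/humidity/pressure trace

      evt = EventRelatedData("door1_open", "sensor-door1", "living room",
                             "t1", "t1-10min", "t1+3min", "rec1")
      print(evt)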
  • FIG. 12 shows the passage of time when monitoring data is recorded on the storage medium when an event occurs.
  • various sensors in the home living room are assumed.
  • The sensors are assumed to include a door 1 open/close detection sensor, a door 2 open/close detection sensor, a window 1 open/close detection sensor, a window 2 open/close detection sensor, a microphone, and a motion detection sensor (using, for example, captured images or an infrared sensor).
  • the HGW 600 is arranged in a corner on the ceiling of the living room, and the camera can take an image of the inside of the living room.
  • When, for example, the movement of a person is detected by the camera or a door is opened or closed, recording for about 3 minutes is performed.
  • recording is continuously performed during this detection period. Audio is picked up also from the microphone 613 while recording is being performed.
  • On the storage medium 1010 (or, if there is a storage medium in the HGW 600 or directly connected to the HGW 600, on that storage medium), the monitoring data resulting from the first event (two events have occurred) is recorded as recording data Rec1.
  • the event-related data at this time includes the sensor ID attached to the door 1, the ID of the camera 611, the start time and end time of the recording Rec1.
  • the management data (event-related data) includes an address on the storage medium in which the recording Rec1 is stored.
  • Next, suppose that a loud sound is picked up by the microphone 613 at time t9, a human movement is detected by the camera 611 at time t10, window 1 is opened at time t11, and a loud sound is picked up again by the microphone 613 at time t12.
  • the monitoring data resulting from the fourth event is recorded as recording data Rec4 on the storage medium.
  • the monitoring data resulting from the fifth event (the occurrence of three events) is recorded on the storage medium as the recording data Rec5.
  • The HGW 600 can present the monitoring data in various forms to the smart phone GUI-1.
  • FIG. 13 shows an internal configuration example of the system controller 602 shown in FIG.
  • the recording command device 621 sends monitoring data 1012 to the recording medium 1010 and commands recording. At the same time, event-related data is sent to the recording medium 1010 and recorded.
  • The event determiner 625 can also determine an event when a specific command signal is sent from the smart phone GUI-1. For example, when the first user having the smart phone GUI-1 has a telephone conversation with the second user who is at home, the first user can operate a specific key of the smart phone GUI-1 to send an event activation signal to the HGW 600. The first user can also send an event activation signal to the HGW 600 by operating the specific key even when not talking. Furthermore, the second user staying at home can consciously operate a sensor to send an event activation signal to the HGW 600. For example, for inspection, the second user consciously operates a sensor that senses lighting on/off (for example, by shielding or uncovering its light receiving unit) and can thereby send an event activation signal to the HGW 600.
  • When the user wants to check the monitoring data, the user can request the HGW 600 (system controller 602), via the smart phone GUI-1 or the television receiver GUI-2 connected to the network, to reproduce the monitoring data regarding the desired event.
  • the system controller 602 includes a reproduction controller 623 for reproducing arbitrary event-related data and monitoring data from the storage medium 1030.
  • The reproduction controller 623 includes a fast-forward function, a reverse-feed function, a frame-advance function, and an event processor for grouping (rounding up) events. Since a large amount of event-related data and monitoring data is stored in the storage medium 1010, the system controller 602 behaves so that the user can efficiently check the desired monitoring data.
  • the system controller 602 includes a filtering unit 631 and a display style processing unit 629 that can classify and select various events and generate a display list or display arrangement. The generated display arrangement and reproduced monitoring data are transmitted to a monitor such as the smart phone GUI-1 or the television receiver GUI-2 via the display data output unit 627.
  • the system controller 602 also includes a memory 624 for temporarily storing data and lists.
  • the system controller 602 makes a call with the smart phone GUI-1 or the television receiver GUI-2 and transmits the generated display arrangement and the reproduced monitoring data to the monitor.
  • The playback controller 623 can execute a fast-forward function, a reverse-feed function, and a frame-advance function on video captured for an event, in response to a command from the smart phone GUI-1 or the television receiver GUI-2.
  • the playback controller 623 includes an event processor that processes event-related data, and can execute an event arrangement order, an event selection process, and the like.
  • FIG. 14 shows a state where a menu is displayed on the screen of the smart phone GUI-1, for example.
  • Buttons in this menu include, for example, a monitoring data request button 501, an Internet (1) connection button 502, an Internet (2) connection button 503, a horn activation button 504, a game (1) start button 505, and a game (2) start button 506.
  • a sensor list button 507 is provided, and when this button 507 is operated, a list of various sensors for detecting an event can be displayed.
  • When the user wants to check the monitoring data, the monitoring data request button 501 is touch-operated. Then, as shown in FIG. 15A or FIG. 15B, the smart phone GUI-1 displays buttons 512, 513, and 514 labeled "all", "specify", and "usually", together with a message asking "What kind of event images do you want to check?".
  • When the "all" button 512 is selected, all events are targeted regardless of sensor, and parts (thumbnails) of the captured image data are sent to the smart phone GUI-1 as monitoring data. Since a large amount of event-related data and monitoring data is stored in the storage medium 1010, the display data at the start of display mainly covers events that occurred around 5 hours before the current time; event-related data related to several (3 to 5) events and representative thumbnails of the corresponding monitoring data are selected and displayed.
  • the representative thumbnail is monitoring data (image data) corresponding to the event occurrence time, for example.
  • the user can select the “Specify” button 513 by touch operation.
  • Then, a list 517 of event names (door 1 opening/closing, door 2 opening/closing, window 1 opening/closing, window 2 opening/closing, and so on) is displayed.
  • The user can select, by touch operation, one or a plurality of event types for which image checking is desired.
  • FIG. 15A shows an example in which items such as door 1 opening / closing, window 1 opening / closing, motion detection, etc. are selected and determined. In this example, a simple example of an event is shown, but in reality, more events and event names are set.
  • When the selection is determined (determination operation 518), a representative thumbnail of the monitoring data at the time each selected event occurred and the corresponding event-related data are displayed.
  • In this case as well, the display data at the start of display mainly covers events that occurred around 5 hours before the current time; event-related data related to several (3 to 5) events around that time and representative thumbnails of the corresponding monitoring data are selected and displayed.
  • the user can select the “ordinary” button 514 by touch operation.
  • This button 514 becomes effective after the “designate” button 513 has already been operated and the determination operation 518 has been performed.
  • In this case, event-related data related to several (3 to 5) events before and after an event that occurred, for example, 5 hours before the current time, and representative thumbnails of the corresponding monitoring data are selected and displayed.
  • FIG. 15A illustrates an example in which event-related data is arranged in time order, managed independently for each selected type. An example of this arrangement will be described later with reference to FIG. 18A.
  • the display example of the event related data is not limited to this, and the event related data of different types of events can be combined and displayed by the setting shown in FIG. 15B.
  • the combination button 518a may be displayed before the enter button 518b is operated.
  • the event-related data of the item selected in the event list (the two items of door 2 opening / closing and motion detection are currently selected) are combined in time order. Can be set to be displayed. That is, when the combination button 518a and the determination button 518b are operated successively, for example, the arrangement and display of event-related data as described later with reference to FIG. 18B are performed.
  • In this way, prior to requesting the control data management unit 601 to reproduce the monitoring data related to a desired event, the user can notify the unit 601 of what kind of event images the user wants to have reproduced.
  • FIG. 16A shows an operation screen that is subsequently displayed after the monitoring data request button 501 is operated in the smart phone GUI-1 menu shown in FIG.
  • On this screen, buttons 512, 513, and 514 labeled "all", "specify", and "usual" are displayed, together with the message "What kind of event images do you want to check?".
  • the “ordinary” button 514 is selected.
  • an event list as shown in FIG. 16B is displayed.
  • This event list is generated by the playback controller 623 shown in FIG. 13 reading event-related data and monitoring data from the storage medium 1030, the filtering unit 631 performing filtering processing, and the display style processing unit 629 generating the list.
  • Alternatively, the playback controller 623 may first read the event-related data from the storage medium 1010, perform filtering on the event-related data, and then reproduce from the storage medium 1010 the monitoring data corresponding to the extracted event-related data.
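  • The "specify" flow therefore reduces to filtering the event-related data by the selected event types, and only then touching the heavier monitoring data. A sketch of that filtering step is given below; the time window and the count of 3 to 5 events come from the description above, everything else (field names, function name) is assumed.

      def select_events(event_records, selected_types, around_time, max_count=5):
          """Filter event-related data by the event types chosen on the smartphone and keep
          only a handful of events nearest to the time of interest."""
          hits = [e for e in event_records if e["event_type"] in selected_types]
          hits.sort(key=lambda e: abs(e["occurred_at"] - around_time))
          shortlist = hits[:max_count]                     # 3 to 5 events around the time of interest
          shortlist.sort(key=lambda e: e["occurred_at"])   # present them in time order
          return shortlist

      events = [
          {"event_type": "door1_open", "occurred_at": 100},
          {"event_type": "motion", "occurred_at": 180},
          {"event_type": "window1_open", "occurred_at": 300},
      ]
      print(select_events(events, {"door1_open", "motion"}, around_time=150))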
Here the event list is requested and displayed by the smart phone GUI-1, but the same operation is also possible with the television receiver GUI-2; in that case the operation is performed by moving a cursor on the screen with the remote controller. In FIG. 16B the thumbnails of the monitoring data are drawn in a simplified manner, but in reality each thumbnail is an image covering the range of the viewing angle of the camera 611.
Assume now that the thumbnail 522 of the event 521 is selected from the list of FIG. 16B by a touch operation. The playback controller 623 (shown in FIG. 13) then starts continuous reproduction of the monitoring data, for example about 10 minutes of video beginning, for example, 5 minutes before the time at which the event 521 occurred, and sends it to the monitor. The video at this time is shown in FIG. 17: a person 525 opens the door 526, enters the room, walks to the bed 527, and lies down on the bed 527.
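The start point and length of such continuous reproduction can be derived directly from the event occurrence time, as in the minimal sketch below; the 5-minute lead and 10-minute length are only the example values mentioned above, not fixed constants of the device.

```python
from datetime import datetime, timedelta
from typing import Tuple


def playback_window(event_time: datetime,
                    lead: timedelta = timedelta(minutes=5),
                    length: timedelta = timedelta(minutes=10)) -> Tuple[datetime, datetime]:
    """Return (start, end) of the clip to reproduce for one event, starting a
    little before the event so that the run-up to it is visible."""
    start = event_time - lead
    return start, start + length
```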
In response to a command from the smart phone GUI-1 or the television receiver GUI-2, the playback controller 623 displays a plurality of monitoring data corresponding to a plurality of event-related data, and when any one of the displayed monitoring data is selected, it continuously reproduces the recording period of the designated monitoring data. Also in response to a command from the smart phone GUI-1 or the television receiver GUI-2, the playback controller 623 can execute a fast-forward function, a reverse-feed function, and a frame-advance function on the video that captured the event. Because the playback controller 623 can refer to the event-related data, it can successively fast-forward or normally play back a plurality of monitoring data related to a plurality of events, and it can likewise successively fast-forward or normally play back a plurality of monitoring data related to a designated specific event.
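Such successive normal or fast-forward playback can be pictured as iterating over the filtered event records and handing each recording window to a player in turn. The player.play(start, end, speed) interface below is an assumption made for the sketch, not the disclosed API.

```python
from datetime import timedelta
from typing import Iterable


def play_in_sequence(player, records: Iterable, speed: float = 1.0,
                     lead: timedelta = timedelta(minutes=5),
                     length: timedelta = timedelta(minutes=10)) -> None:
    """Play the monitoring data of each event one after another.

    speed > 1.0 stands for fast-forward playback, 1.0 for normal playback.
    """
    for record in sorted(records, key=lambda r: r.occurred_at):
        start = record.occurred_at - lead
        player.play(start, start + length, speed)  # hypothetical player interface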
The playback controller 623 shown in FIG. 13 also includes an event processor that processes a plurality of event-related data, and this event processor rounds up a plurality of event-related data corresponding to a specific event. An event may occur in a pulse-like manner, for example detection of a loud sound, or motion detection when a blind is repeatedly shaken by the wind. In such cases the detection times may be rounded up as one continuous series, and the monitoring data can then be checked on the basis of the rounded-up event-related information.
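One simple way to round up such pulse-like detections is to merge occurrences whose timestamps lie within a short gap of each other into a single continuous span. The sketch below only illustrates that idea; the 30-second gap threshold is an arbitrary example value, not a parameter of the specification.

```python
from datetime import datetime, timedelta
from typing import List, Tuple


def round_up_events(times: List[datetime],
                    max_gap: timedelta = timedelta(seconds=30)) -> List[Tuple[datetime, datetime]]:
    """Collapse bursts of detections into (start, end) spans so that, e.g.,
    a blind repeatedly shaken by the wind yields one span instead of many events."""
    if not times:
        return []
    times = sorted(times)
    spans = []
    start = prev = times[0]
    for t in times[1:]:
        if t - prev > max_gap:        # gap too large: close the current span
            spans.append((start, prev))
            start = t
        prev = t
    spans.append((start, prev))
    return spans
```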
Image data of a fixed time length (for example 5, 10, 15, or 20 minutes) captured by the camera 611 at the time of event detection is stored as monitoring data. The time length for which the monitoring data is stored may be changed arbitrarily for each event, may differ according to the type of event, and may also differ according to the time zone.
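Selecting the storage length per event type and per time of day then reduces to a small lookup, as sketched below; the table contents and the night-time doubling rule are made-up example values, not parameters disclosed in the specification.

```python
from datetime import datetime, timedelta

# Example values only; the specification merely says the length may differ
# by event type and by time zone (time of day).
DEFAULT_LENGTH = timedelta(minutes=10)
LENGTH_BY_TYPE = {
    "door_open_close": timedelta(minutes=5),
    "motion": timedelta(minutes=15),
    "sound": timedelta(minutes=20),
}


def recording_length(event_type: str, when: datetime) -> timedelta:
    """Return how long the camera 611 output should be stored for this event."""
    length = LENGTH_BY_TYPE.get(event_type, DEFAULT_LENGTH)
    if 0 <= when.hour < 6:            # e.g. record longer at night
        length *= 2
    return length
```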
The arrangement of events (that is, the arrangement of the thumbnails corresponding to the event-related data) can be set freely by the arrangement application, and the thumbnails corresponding to the event-related data are displayed according to that arrangement.
FIG. 18A shows a display example in which the event-related data and the thumbnails of the monitoring data related to them are classified for each event. As an example, the classification comprises a motion detection event, a door 1 opening/closing event, a window 1 opening/closing event, and a lighting 1 on/off event. Thumbnails 525a to 525d corresponding to event-related data 526a to 526d related to the opening and closing of the door 1 are shown, arranged in order of event occurrence time. If the user rubs the touch operation surface of the smart phone GUI-1 in the direction of the arrow 531a, event-related data later in time and the corresponding thumbnails are displayed; if the user rubs it in the direction of the arrow 531b, event-related data earlier in time and the corresponding thumbnails are displayed. Rubbing further displays the event-related data regarding the opening and closing of the window 1 and the corresponding thumbnails, and rubbing in the direction of the arrow 532b displays the event-related data relating to motion detection and the corresponding thumbnails.
The control data management unit 601 includes a filtering unit 631. The filtering unit 631 can filter and classify the event-related data according to type, or combine event-related data of events of different types for display. FIG. 18B shows an example in which event-related data and the thumbnails of the corresponding monitoring data for door 2 opening/closing events and sound detection events are displayed together. Here too, rubbing the touch operation surface displays event-related data later in time with the corresponding thumbnails, and rubbing it in the direction of the arrow 531b displays event-related data earlier in time with the corresponding thumbnails.
FIG. 19 is a diagram for explaining yet another relationship between the smart phone and the event-related data displayed on it, together with another operation method. In FIG. 16B the event-related data are displayed as a list, but the names of the event sources may instead be displayed in a tile style as shown in FIG. 19. When the user presses the desired tile among the multiple tiles (door 1, door 2, window 1 to window 4, lighting 1 to lighting 5, sound 561, sound/girls 562, sound/boys 563, and so on), the display changes to the state shown in FIG. When the sound tile 561 is selected, event-related data relating to all sounds are displayed; when the sound/girls tile 562 is selected, event-related data relating to the girls' sounds are displayed; and when the sound/boys tile 563 is selected, event-related data relating to the boys' sounds are displayed.
FIG. 20 shows the structure of the event-related data recorded in the management area 1021 and of the monitoring data recorded in the data area 1011. The event-related data are classified by event type, for example door opening/closing, window opening/closing, lighting on/off, air conditioner on/off, television receiver on/off, motion detection, and the like. For each event type there are sensor items (sensor 1, sensor 2, and so on), and event data is described for each sensor item. The event data includes, for example, the event occurrence time, the recording start time of the monitoring data, the recording end time of the monitoring data, the recording start address of the monitoring data, the recording end address of the monitoring data, and the thumbnail address. The recording start address, the recording end address, and the thumbnail address indicate addresses in the data area 1011; by referring to these addresses, the playback controller 623 can read the necessary data from the storage medium 1030 and reproduce it.
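Rendered as code, each entry of the management area 1021 can be pictured as a small record whose address fields point into the data area 1011, so that the playback controller only has to follow the addresses. The class and the storage.read_range helper below are illustrative assumptions, although the field names mirror the description.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class EventDataEntry:
    """One event data record in the management area (cf. FIG. 20)."""
    occurred_at: datetime       # event occurrence time
    recording_start: datetime   # monitoring data recording start time
    recording_end: datetime     # monitoring data recording end time
    start_address: int          # recording start address in the data area
    end_address: int            # recording end address in the data area
    thumbnail_address: int      # address of the representative thumbnail


def read_clip(storage, entry: EventDataEntry) -> bytes:
    """Follow the addresses to pull the monitoring data out of the data area
    (storage.read_range is an assumed helper, not part of the specification)."""
    return storage.read_range(entry.start_address, entry.end_address)
```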
With this arrangement it is easy to display the monitoring data with high image quality, for example on the television receiver GUI-2. Further, since the monitoring data is not transmitted to the outside via the Internet 300, the arrangement is particularly effective when managing personal monitoring data. Note that the data sent to the server 1000 via the Internet 300 and the data sent from the server 1000 to the HGW are subjected to concealment processing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

According to one embodiment, provided are a mobile assist device that can comprehensively evaluate data from various sensors, determine what should be controlled next, and improve the monitoring effect, and an associated method. One embodiment comprises: a mobile body; a sensor mounted on the mobile body; and a control unit that determines the brightness of the surrounding space from the output of the sensor and, according to the result of the determination, outputs a control signal for controlling the movement of another device.
PCT/JP2018/015308 2018-04-11 2018-04-11 Dispositif et procédé d'assistance mobile WO2019198188A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880077501.6A CN111418269B (zh) 2018-04-11 2018-04-11 Mobile assist device and mobile assist method
PCT/JP2018/015308 WO2019198188A1 (fr) 2018-04-11 2018-04-11 Dispositif et procédé d'assistance mobile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/015308 WO2019198188A1 (fr) 2018-04-11 2018-04-11 Dispositif et procédé d'assistance mobile

Publications (1)

Publication Number Publication Date
WO2019198188A1 true WO2019198188A1 (fr) 2019-10-17

Family

ID=68164236

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/015308 WO2019198188A1 (fr) 2018-04-11 2018-04-11 Dispositif et procédé d'assistance mobile

Country Status (2)

Country Link
CN (1) CN111418269B (fr)
WO (1) WO2019198188A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07334243A (ja) * 1994-06-06 1995-12-22 Matsushita Electric Ind Co Ltd Mobile robot
JP2005103678A (ja) * 2003-09-29 2005-04-21 Toshiba Corp Robot apparatus
JP2007249898A (ja) * 2006-03-20 2007-09-27 Funai Electric Co Ltd Mobile device and self-propelled vacuum cleaner
JP2017038894A (ja) * 2015-08-23 2017-02-23 日本電産コパル株式会社 Cleaning robot
JP2017131556A (ja) * 2016-01-29 2017-08-03 東芝ライフスタイル株式会社 Electric vacuum cleaner

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180127556A (ko) * 2014-08-18 2018-11-28 파나소닉 아이피 매니지먼트 가부시키가이샤 Control system and sensor unit


Also Published As

Publication number Publication date
CN111418269B (zh) 2022-06-03
CN111418269A (zh) 2020-07-14

Similar Documents

Publication Publication Date Title
JP6827758B2 (ja) Mobile assist device and mobile assist method
US10869006B2 (en) Doorbell camera with battery at chime
CN101682695B (zh) 可配置用于自主自学操作的照相机
CN105939236A (zh) 控制智能家居设备的方法及装置
CN101682694A (zh) 可配置用于自主操作的照相机
US10354678B2 (en) Method and device for collecting sounds corresponding to surveillance images
US11736760B2 (en) Video integration with home assistant
US11743578B2 (en) Systems and methods of power-management on smart devices
US11483451B2 (en) Methods and systems for colorizing infrared images
US9705696B2 (en) Monitoring system
CN104678770B (zh) 位移事件检测方法和系统
WO2015196895A1 (fr) Dispositif d'éclairage commandé par un système de reconnaissance d'image
WO2019198188A1 (fr) Dispositif et procédé d'assistance mobile
US10582130B1 (en) System and method for connecting a network camera
CN111418204B (zh) Data monitoring and management device, and event data monitoring method
CN106919912A (zh) 室内监控的方法及装置
JP2018064177A (ja) Data monitoring and management device, and event data monitoring method
KR100464372B1 Device for recording/reproducing video of persons entering the front door in a home automation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18914843

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18914843

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP