WO2021005750A1 - Behavior type estimation device, behavior type estimation method, and behavior type estimation program - Google Patents


Info

Publication number
WO2021005750A1
WO2021005750A1 (PCT/JP2019/027340)
Authority
WO
WIPO (PCT)
Prior art keywords
user
place
action
behavior
action type
Prior art date
Application number
PCT/JP2019/027340
Other languages
French (fr)
Japanese (ja)
Inventor
省吾 厚地
皆木 宗
賢一 小泉
準史郎 神田
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2020510616A priority Critical patent/JPWO2021005750A1/en
Priority to PCT/JP2019/027340 priority patent/WO2021005750A1/en
Publication of WO2021005750A1 publication Critical patent/WO2021005750A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management

Definitions

  • The present invention relates to a behavior type estimation device, a behavior type estimation method, and a behavior type estimation program that estimate a user's behavior indoors.
  • Patent Document 1 discloses a technique for increasing the variety of user behavior types that can be discriminated, using multiple kinds of data detected by a wearable sensor worn by the user. Specifically, it discloses further subdividing the behavior classified from the user's acceleration according to the user's heart rate.
  • In such a technique, a sensor that detects the user's movement, such as an accelerometer, is indispensable, so the user must wear a wearable sensor or carry a smartphone or similar terminal. Further, when providing a service based on the information transmitted from the wearable sensor, the service provider needs to aggregate this information, and the power consumed by continuous data upload becomes an issue. In the case of BYOD (Bring Your Own Device), procedures such as obtaining each user's permission for data aggregation also become an issue. Moreover, when discriminating the behavior types of a large number of users, as many wearable sensors or smartphone terminals as there are users must be prepared, which imposes a heavy cost burden.
  • An object of the present invention is to discriminate behavior types at low cost and with little burden on the user.
  • The behavior type estimation device includes: a position estimation unit that estimates the user's position indoors; a behavior estimation unit that acquires a place correspondence table, in which place areas representing indoor places are associated with place action types (the user's action type in each place area), retrieves from the table the place action type corresponding to the place area matching the user's position, and estimates the user's behavior based on the acquired place action type; and a behavior history generation unit that generates a history of the user's behavior as behavior history information from the estimation results.
  • The behavior type estimation device thus has the effect that behavior types can be discriminated at low cost and with little burden on the user.
  • FIG. 1: Block diagram of the action type estimation system according to Embodiment 1.
  • Flowchart showing the operation of the information update unit according to Embodiment 1.
  • Flowchart showing the operation of the action estimation unit according to Embodiment 1.
  • Block diagram of the action type estimation system according to a modification of Embodiment 1.
  • the action type estimation system 500 includes a user communication device 21, an indoor communication device 22, and an action type estimation device 100.
  • The action type estimation device 100 estimates the behavior of the user 20 in the indoor space 30.
  • The indoor space 30 is inside the building of the company where the user 20 works.
  • The user 20 is an employee working in the indoor space 30 of the company building.
  • The company analyzes the employee's behavior as estimated by the behavior type estimation device 100 to determine the employee's amount of exercise, and thereby supports improvement of the employee's health.
  • The user communication device 21 is a device that transmits radio waves by short-range wireless communication, such as a beacon. Specifically, the user communication device 21 is built into an employee ID card, a name plate attached to the employee ID card, or a key chain or the like carried by the user 20. The user communication device 21 costs less than a wearable terminal or mobile terminal device, and carrying it places little burden on the user 20. Since the radio waves transmitted by the user communication device 21 need only carry beacon information, the user 20 does not have to transmit information intentionally by operating an application or the like, and no transmission permission needs to be obtained from the user 20.
  • A plurality of indoor communication devices 22 are installed in the indoor space 30, such as a company building.
  • the indoor communication device 22 receives radio waves transmitted from the user communication device 21 for short-range wireless communication carried by the user 20. Then, the indoor communication device 22 transmits the radio wave transmitted from the user communication device 21 as a signal 31 to the action type estimation device 100.
  • The indoor communication devices 22 are installed in the indoor space 30 at intervals of several meters to several tens of meters.
  • The indoor communication devices 22 are installed at the entrances and exits of places in the indoor space 30.
  • A place in the indoor space 30 is, for example, an area for each department or section, a reference room, a dining room, a washroom, a corridor, a staircase, an elevator, or a conference room.
  • the user 20 may have a personal terminal 23.
  • the personal terminal 23 is a terminal used by the user 20. Specifically, it is a device such as a PC (Personal Computer), a smartphone, a tablet terminal, or a mobile phone terminal.
  • the user 20 uses the personal terminal 23 to perform processing such as updating various information or viewing action history information.
  • the action type estimation device 100 is a computer.
  • the action type estimation device 100 includes a processor 910 and other hardware such as a memory 921, an auxiliary storage device 922, an input interface 930, an output interface 940, and a communication device 950.
  • the processor 910 is connected to other hardware via a signal line and controls these other hardware.
  • the action type estimation device 100 includes a position estimation unit 110, an action estimation unit 120, an action history generation unit 130, an information update unit 140, and a storage unit 150 as functional elements.
  • the storage unit 150 stores personal information 151, location correspondence table 152, detailed correspondence table 153, location schedule 154, and action history information 155.
  • the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 are realized by software.
  • the storage unit 150 is provided in the memory 921.
  • the processor 910 is a device that executes an action type estimation program.
  • the action type estimation program is a program that realizes the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140.
  • the processor 910 is an IC (Integrated Circuit) that performs arithmetic processing. Specific examples of the processor 910 are a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit).
  • the memory 921 is a storage device that temporarily stores data.
  • a specific example of the memory 921 is a SRAM (Static Random Access Memory) or a DRAM (Dynamic Random Access Memory).
  • the auxiliary storage device 922 is a storage device that stores data.
  • a specific example of the auxiliary storage device 922 is an HDD.
  • the auxiliary storage device 922 may be a portable storage medium such as an SD (registered trademark) memory card, CF, NAND flash, flexible disc, optical disk, compact disc, Blu-ray (registered trademark) disc, or DVD.
  • HDD is an abbreviation for Hard Disk Drive.
  • SD® is an abbreviation for Secure Digital.
  • CF is an abbreviation for CompactFlash®.
  • DVD is an abbreviation for Digital Versatile Disc.
  • the input interface 930 is a port connected to an input device such as a mouse, keyboard, or touch panel. Specifically, the input interface 930 is a USB (Universal Serial Bus) terminal. The input interface 930 may be a port connected to a LAN (Local Area Network).
  • the output interface 940 is a port to which a cable of an output device such as a display is connected. Specifically, the output interface 940 is a USB terminal or an HDMI (registered trademark) (High Definition Multimedia Interface) terminal. Specifically, the display is an LCD (Liquid Crystal Display).
  • the action type estimation device 100 may display the action history information 155 on the display via the output interface 940.
  • the communication device 950 has a receiver and a transmitter.
  • the communication device 950 is wirelessly connected to a communication network such as a LAN, the Internet, or a telephone line.
  • the communication device 950 is a communication chip or a NIC (Network Interface Card).
  • the action type estimation device 100 acquires the signal 31 from the indoor communication device 22 via the communication device 950. Further, the action type estimation device 100 may transmit the action history information 155 to the personal terminal 23 of the user 20 via the communication device 950.
  • the action type estimation program is read into the processor 910 and executed by the processor 910.
  • In the memory 921, not only the action type estimation program but also the OS (Operating System) is stored.
  • the processor 910 executes the action type estimation program while executing the OS.
  • the action type estimation program and the OS may be stored in the auxiliary storage device 922.
  • the action type estimation program and the OS stored in the auxiliary storage device 922 are loaded into the memory 921 and executed by the processor 910.
  • a part or all of the action type estimation program may be incorporated in the OS.
  • the action type estimation device 100 may include a plurality of processors that replace the processor 910. These plurality of processors share the execution of the action type estimation program.
  • Each processor, like the processor 910, is a device that executes the action type estimation program.
  • Data, information, signal values and variable values used, processed or output by the action type estimation program are stored in the memory 921, the auxiliary storage device 922, or the register or cache memory in the processor 910.
  • The "unit" in each of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 may be read as "processing", "procedure", or "process". Further, each of the position estimation process, the action estimation process, the action history generation process, and the information update process may be read as "program", "program product", or "computer-readable storage medium on which the program is recorded".
  • The action type estimation program causes a computer to execute each process or procedure obtained by reading the "unit" of each of the above units as "process" or "procedure".
  • The action type estimation method corresponds to the procedures obtained by reading the "unit" of each of the above units as "procedure".
  • the action type estimation program may be provided stored in a computer-readable recording medium.
  • the behavior type estimation program may be provided as a program product.
  • the operation procedure of the action type estimation device 100 corresponds to the action type estimation method.
  • the program that realizes the operation of the action type estimation device 100 corresponds to the action type estimation program.
  • the operation of the information updating unit 140 according to the present embodiment will be described with reference to FIG.
  • the information updating unit 140 updates at least one of the personal information 151, the location correspondence table 152, the detailed correspondence table 153, and the location schedule 154.
  • In step S101, the information update unit 140 determines whether there is an update request for the information stored in the storage unit 150. Specifically, the information update unit 140 determines whether an update request for the information stored in the storage unit 150 has been received from the personal terminal 23 of the user 20.
  • The update request includes the information to be updated and the update contents. For example, when a user 20 who used to sit and browse in the reference room decides to stand and browse for health reasons, the location correspondence table 152 or the detailed correspondence table 153 may be changed.
  • In step S102, the information update unit 140 updates the information stored in the storage unit 150 based on the update request. If there is no update request, the process returns to step S101.
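The two-step update flow (S101: check for a request, S102: apply it) can be sketched as follows. This is a minimal illustration; the class and field names (`Storage`, `UpdateRequest`, `target`, `contents`) are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the information update flow (steps S101-S102);
# the class and field names are illustrative, not taken from the patent.
from dataclasses import dataclass, field

@dataclass
class UpdateRequest:
    target: str     # which table to update, e.g. "place_table" or "detail_table"
    contents: dict  # keys/values to merge into that table

@dataclass
class Storage:
    tables: dict = field(default_factory=dict)

def handle_update(storage, request):
    """Step S101: check whether a request exists; step S102: apply it."""
    if request is None:  # no update request: go back to polling (S101)
        return False
    table = storage.tables.setdefault(request.target, {})
    table.update(request.contents)
    return True
```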
  • FIG. 3 is an example of personal information 151 according to the present embodiment.
  • Information about individual employees is set in the personal information 151.
  • personal information such as an employee number, an employee name, an affiliation department, and an extension number is set in the personal information 151.
  • FIG. 4 is an example of the location correspondence table 152 according to the present embodiment.
  • In the location correspondence table 152, a place area 521 representing a place in the indoor space 30 and a place action type 522, which is the action type of the user 20 in that place area 521, are set.
  • A coordinate system is set for the indoor space 30, and the place area 521 is represented by coordinates in the location correspondence table 152.
  • The place area 521 may instead be represented by the name of the place in the indoor space 30. In that case, a table that maps the coordinate system to place names is provided separately for the indoor space 30.
  • In the place action type 522, the type of action mainly performed by the user 20 in the place area 521 is set.
  • The place correspondence table 152 is provided for each user. As described above, a user 20 who previously set the place action type 522 for the reference room to "sitting" may reset the place correspondence table 152 so as to browse standing for health reasons. In this way, the action type at a given place differs from user to user.
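If the per-user place correspondence table were kept as a simple nested mapping, the lookup might look like the sketch below. The user names and table entries are illustrative assumptions.

```python
# A minimal per-user place correspondence table, kept as nested mappings.
# The user names and entries are illustrative assumptions.
place_tables = {
    "user_a": {"reference room": "sitting", "corridor": "walking", "elevator": "standing"},
    "user_b": {"reference room": "standing", "corridor": "walking", "elevator": "standing"},
}

def place_action_type(user, place):
    """Look up the action type this user has set for this place, or None."""
    return place_tables.get(user, {}).get(place)
```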
  • FIG. 5 is an example of the detailed correspondence table 153 according to the present embodiment.
  • In the detailed correspondence table 153, detailed areas 531 that subdivide the place areas 521 of the place correspondence table 152 and detailed action types 532, the action types corresponding to the detailed areas 531, are set.
  • The method of setting areas and action types is the same as in the location correspondence table 152.
  • In this example, the reference room is subdivided into a bookshelf area and a reading desk area.
  • Since the user 20 mainly stands in the bookshelf area, represented by the coordinates [5,0], [6,0], [6,6], [3,6], "standing" is set.
  • Since the user 20 mainly sits in the reading desk area, represented by the coordinates [3,0], [5,0], [5,6], [3,6], "sitting" is set.
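A coordinate-based lookup in a detailed correspondence table can be sketched as below. The patent's example areas are polygons; axis-aligned rectangles are used here as a simplification, with bounds that only loosely follow the bookshelf/reading desk example.

```python
# Sketch of a detailed correspondence table lookup. Axis-aligned rectangles
# are a simplification of the polygonal areas in the example; the coordinate
# bounds are illustrative assumptions.
detailed_table = [
    # (x_min, y_min, x_max, y_max, detailed action type)
    (5, 0, 6, 6, "standing"),  # bookshelf area
    (3, 0, 5, 6, "sitting"),   # reading desk area
]

def detailed_action_type(x, y):
    """Return the detailed action type for a coordinate; first match wins."""
    for x0, y0, x1, y1, action in detailed_table:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return action
    return None
```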
  • FIG. 6 is an example of the place schedule 154 according to the present embodiment.
  • In the place schedule 154, a schedule for each place area is set. Specifically, content such as "a standing party at the cafeteria from 12:00 to 15:00 on March 29, 2019" is set.
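One hypothetical way to represent such a place schedule is as time-ranged entries that each carry the action type implied by the event, as in this sketch (the tuple layout is an assumption):

```python
from datetime import datetime

# Hypothetical representation of the place schedule 154: each entry is
# (place, start, end, action type implied by the scheduled event).
schedule = [
    ("cafeteria",
     datetime(2019, 3, 29, 12, 0), datetime(2019, 3, 29, 15, 0),
     "standing"),  # the standing party from the example
]

def scheduled_action(place, now):
    """Return the action type scheduled for this place at this time, or None."""
    for p, start, end, action in schedule:
        if p == place and start <= now <= end:
            return action
    return None
```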
  • the position estimation unit 110 estimates the position of the user 20 in the indoor 30.
  • the position of the user 20 may be a local coordinate system set in the indoor 30 or a name that specifies the position in the indoor 30.
  • the position of the user 20 may be any information as long as the position in the indoor 30 can be specified.
  • The position estimation unit 110 acquires the signals 31 communicated between the plurality of indoor communication devices 22 for short-range wireless communication installed in the indoor space 30 and the user communication device 21 for short-range wireless communication carried by the user 20, and estimates the position of the user 20 from the acquired signals 31.
  • That is, the position of the user 20 is estimated by the plurality of indoor communication devices 22 installed in the indoor space 30 receiving the radio waves transmitted from the user communication device 21 carried by the user 20.
  • the indoor communication device 22 receives radio waves from the user communication device 21.
  • the indoor communication device 22 transmits the received radio wave as a signal 31 to the position estimation unit 110.
  • the position estimation unit 110 receives the signal 31 via the communication device 950, and estimates the position of the user 20 based on the signal 31.
  • In step S202, the position estimation unit 110 determines whether the position of the user 20 has been successfully acquired. If successful, the process proceeds to step S203; if not, the process returns to step S201.
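The patent does not fix a positioning algorithm. One minimal approach, sketched here purely as an assumption, is to place the user at the indoor communication device that receives the beacon most strongly (by RSSI):

```python
# Assumed minimal positioning: put the user at the position of the indoor
# communication device that hears the beacon strongest (RSSI in dBm).
# Receiver ids and coordinates are illustrative.
def estimate_position(readings, device_positions):
    """readings: receiver id -> RSSI in dBm (higher = stronger).
    Returns the strongest receiver's coordinates, or None on failure (S202)."""
    if not readings:
        return None  # acquisition failed: the flow returns to step S201
    best = max(readings, key=readings.get)
    return device_positions.get(best)
```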
  • the behavior estimation unit 120 estimates the behavior of the user 20 based on the position of the user 20 acquired by the position estimation unit 110 and the information stored in the storage unit 150.
  • the information stored in the storage unit 150 is at least one of the location correspondence table 152, the detailed correspondence table 153, and the location schedule 154.
  • In step S301, the action estimation unit 120 acquires the location correspondence table 152 and retrieves from it the place action type 522 corresponding to the place area 521 that matches the position of the user 20.
  • the behavior estimation unit 120 estimates the behavior of the user 20 based on the acquired location behavior type 522.
  • If the location correspondence table 152 is set for each user, the behavior estimation unit 120 selects the location correspondence table 152 corresponding to the user 20.
  • The location correspondence table 152 may instead be set for each job title or role. In that case, the behavior estimation unit 120 acquires the job title or role of the user 20 from the personal information 151 and selects the location correspondence table 152 corresponding to that job title or role.
  • In step S302, the action estimation unit 120 acquires the detailed correspondence table 153, in which the detailed areas 531 that subdivide the place areas 521 of the place correspondence table 152 and the detailed action types 532 corresponding to those detailed areas 531 are set.
  • the action estimation unit 120 acquires the detailed action type corresponding to the detailed area 531 that matches the position of the user 20 from the detailed correspondence table 153.
  • the action estimation unit 120 estimates the action of the user 20 based on the place action type 522 that matches the position of the user 20 and the detailed action type 532 that matches the position of the user 20.
  • In step S303, the action estimation unit 120 acquires the place schedule 154, in which a schedule for each place area is set.
  • Using the current time and the place schedule 154, the action estimation unit 120 estimates the action of the user 20 based on the place action type 522, the detailed action type 532, and the schedule currently in effect in the place schedule 154.
  • the behavior estimation unit 120 may estimate the behavior of the user 20 based only on the location behavior type 522.
  • the action estimation unit 120 estimates the action of the user 20 based only on the place action type 522. For example, when the user 20 is in the corridor, it is estimated to be "walking", and when the user 20 is in the elevator, it is estimated to be "standing”.
  • The action estimation unit 120 estimates the action of the user 20 based on the place action type 522 that matches the position of the user 20 and the detailed action type 532 that matches the position of the user 20. For example, when the position estimation unit 110 detects the user 20 in the bookshelf area of the reference room, the place action type 522 is "sitting" while the detailed action type 532 is "standing". In this case, the behavior estimation unit 120 gives priority to the detailed correspondence table 153 and estimates the behavior of the user 20 to be "standing".
  • The behavior estimation unit 120 also estimates the behavior of the user 20 in consideration of the place schedule 154. For example, when the position estimation unit 110 detects the user 20 in the dining table area of the dining room at 13:00 on March 29, 2019, the place action type 522 is "sitting" and the detailed action type 532 is also "sitting". However, since the place schedule 154 indicates a standing party at the cafeteria from 12:00 to 15:00 on March 29, 2019, the behavior estimation unit 120 gives priority to the place schedule 154 and estimates the behavior of the user 20 to be "standing". In the place schedule 154, an action type may be set in association with the contents of each schedule entry. Alternatively, a table associating schedule contents with action types, such as a schedule-specific action type table, may be stored in the storage unit 150. For example, "standing" is associated with a standing party as the schedule-specific action type.
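The priority order these examples describe (place schedule over detailed table over place table) can be sketched as a simple fallback chain; this is an illustration, not the patent's literal implementation:

```python
# Sketch of the priority order in the embodiment's examples:
# place schedule 154 > detailed correspondence table 153 > place table 152.
def estimate_behavior(place_type, detailed_type, scheduled_type):
    if scheduled_type is not None:  # an active schedule entry overrides all
        return scheduled_type
    if detailed_type is not None:   # a detailed area overrides the place area
        return detailed_type
    return place_type               # fall back to the place action type
```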
  • The action estimation unit 120 may also estimate the action of the user 20 based on the transition of the user 20's action type over time or the transition of the user 20's position over time. As a specific example, if the position of the user 20 has moved 10 m in 5 seconds, the behavior estimation unit 120 can estimate that the user has taken the action of "fast walking". Alternatively, since the action type immediately before was "sitting", it can estimate that the user is unlikely to be "running".
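The speed-based part of this refinement can be sketched as follows: 10 m in 5 s is 2 m/s. The patent gives only the "fast walking" example; the thresholds below are illustrative assumptions.

```python
# Sketch of speed-based action refinement. The thresholds are assumptions;
# the patent only gives the 10 m / 5 s -> "fast walking" example.
def speed_based_action(distance_m, seconds):
    speed = distance_m / seconds  # meters per second
    if speed >= 2.5:
        return "running"
    if speed >= 1.8:
        return "fast walking"
    if speed > 0.2:
        return "walking"
    return "stationary"
```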
  • the action history generation unit 130 generates the action history of the user 20 as the action history information 155 by using the estimation result of the action of the user 20. Specifically, it is as follows.
  • In step S204, the action history generation unit 130 determines whether the estimation result of the user 20's action is the same as the previous action. If it is the same, that is, if there is no change in the behavior of the user 20, the process returns to step S201. If it is not the same, that is, if the behavior of the user 20 has changed, the process proceeds to step S205.
  • In step S205, the action history generation unit 130 records the new action and its start time in the action history information 155. Specifically, the action history generation unit 130 records the current time as the start time of the new action. The action history generation unit 130 may also record the current time as the end time of the action that has just ended.
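This record-only-on-change rule can be sketched as below (the dictionary layout of a history entry is an assumption):

```python
# Sketch of the recording rule in steps S204-S205: append a history entry only
# when the estimated action changed, closing out the previous entry's end time.
def record(history, action, time):
    if history and history[-1]["action"] == action:
        return  # same as the previous action (S204): nothing to record
    if history:
        history[-1]["end"] = time  # optional end time of the finished action
    history.append({"action": action, "start": time})
```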
  • FIG. 9 is an example of the action history information 155 according to the present embodiment.
  • a time, a place, and an action as an estimation result are set for each employee who is the user 20.
  • the duration may be aggregated and recorded for each action of each employee.
  • any recording method may be used as long as the action can be recorded for each user 20.
  • a position estimation method other than the beacon may be used.
  • the position estimation unit may estimate the user's position by using the history information of the gate passage by the ACS (Access Control System).
  • the position estimation unit may estimate the position of the user by using the function of the elevator.
  • the position estimation unit uses an elevator control signal.
  • a beacon receiver may be installed in the elevator so that the user can be identified as being in the elevator car.
  • the position estimation unit may estimate the position of the user by using the person identification function by the surveillance camera.
  • the position estimation unit may estimate the user's position by combining a beacon, an ACS, an elevator, and a surveillance camera.
  • ACS is a security system that unlocks a gate or door when an employee ID card or the like is held over a reader on entering company premises or a specific room.
  • A log is kept of what time each individual was at which place.
  • For example, from a log such as "dining room exit, work room entry", a sequence of positions such as "dining room, crossing the corridor, work room" can be estimated.
  • the estimated position can be used to estimate an action type such as "sitting work, walking, standing work”.
  • the movement speed can be calculated from the time of the log between ACSs.
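Deriving speed from two ACS log timestamps requires knowing the walking distance between the gates; the sketch below assumes that distance is available (the 40 m figure in the test is made up).

```python
from datetime import datetime

# Sketch of deriving movement speed from two ACS log timestamps, assuming the
# walking distance between the two gates is known (an assumption, not stated
# in the patent).
def speed_between_gates(t_exit, t_entry, distance_m):
    seconds = (t_entry - t_exit).total_seconds()
    return distance_m / seconds  # meters per second
```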
  • In the above description, the position is estimated using the plurality of indoor communication devices for short-range wireless communication installed indoors as receivers and the user communication device for short-range wireless communication carried by the user as a beacon transmitter.
  • That is, the position estimation unit estimates the user's position from the signals these receivers pick up.
  • Conversely, the plurality of indoor communication devices for short-range wireless communication installed indoors may be used as transmitters, and the user communication device for short-range wireless communication carried by the user may be used as a receiver.
  • the position estimation unit may estimate the position of the user by acquiring the signal transmitted from the indoor communication device and received by the user communication device from the user communication device.
  • a specific example of the user communication device is a mobile terminal such as a smartphone terminal or a tablet terminal.
  • The behavior estimation unit acquires the user's job title or role from the personal information and selects the location correspondence table corresponding to that job title or role. If the personal information does not contain this, the job title or role of the user 20 may be acquired by referring to a separate table of attributes such as job titles or roles.
  • the information update unit receives an update request from the user and updates information such as a location correspondence table or a detailed correspondence table.
  • the action type estimation device may automatically create or update the location correspondence table or the detailed correspondence table by collecting and learning the activity record for each position of each user indoors.
  • the behavior estimation unit may be provided with the following functions to improve the estimation accuracy of the user's behavior.
  • (1) The positions at the start and end of a movement may be known while the position during the movement is unknown. In such a case, the behavior estimation unit estimates the position during the movement using information the building can provide, such as elevator operation history and ACS gate passage history, together with the travel time. The behavior estimation unit then selects a location correspondence table, a detailed correspondence table, or a location schedule using the estimated position, and estimates the behavior. (2) If the user has stopped near his or her own seat, the user is highly likely to be sitting. On the other hand, if the user has stopped near someone else's seat, the user is likely to be standing.
  • the behavior estimation unit extracts a person having a similar behavior history on that day, and estimates the behavior assuming that the user is taking the same behavior as that person.
  • the behavior estimation unit estimates the user's behavior by adding information such as time, day of the week, work time and leaving time, and schedules of the previous day and the next day.
  • A place correspondence table and a detailed correspondence table different from the usual ones may be set at the end of a term or immediately after joining the company.
  • The behavior estimation unit takes into account that behavior differs from normal when the user is late or leaves early.
  • The behavior estimation unit presumes, for example, that a presentation rehearsal is performed standing in the conference room on the day before the meeting.
  • the behavior estimation unit estimates whether the user is going up or down the stairs by identifying the floors before and after the stairs.
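The stair-direction inference amounts to comparing the floors identified just before and after the stairs, as in this small sketch:

```python
# Sketch of stair-direction inference: compare the floors identified
# immediately before and after the user is on the stairs.
def stair_direction(floor_before, floor_after):
    if floor_after > floor_before:
        return "going up"
    if floor_after < floor_before:
        return "going down"
    return "same floor"
```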
  • the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 are realized by software.
  • the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 may be realized by hardware.
  • FIG. 10 is a diagram showing a configuration of an action type estimation system 500 according to a modified example of the present embodiment.
  • the action type estimation device 100 includes an electronic circuit 909, a memory 921, an auxiliary storage device 922, an input interface 930, and an output interface 940.
  • the electronic circuit 909 is a dedicated electronic circuit that realizes the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140.
  • the electronic circuit 909 is specifically a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a GA, an ASIC, or an FPGA.
  • GA is an abbreviation for Gate Array.
  • ASIC is an abbreviation for Application Specific Integrated Circuit.
  • FPGA is an abbreviation for Field-Programmable Gate Array.
  • the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 may be realized by one electronic circuit or may be distributed to a plurality of electronic circuits.
  • some functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 may be realized by an electronic circuit, and the remaining functions may be realized by software.
  • Each of the processor and the electronic circuit is also called a processing circuit. That is, in the action type estimation device 100, the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 are realized by the processing circuit.
  • the user's action type is estimated solely from the user's indoor position by using the place correspondence table and the detailed correspondence table, in which indoor positions are linked to action types. Therefore, according to the behavior type estimation system of the present embodiment, behavior can be estimated without using a motion sensor such as an accelerometer, and the burden on the user of wearing a wearable terminal is reduced. In addition, by reusing existing indoor positioning infrastructure, such as that used for ordinary entry/exit management, the service provider can collect information more easily, further reducing the cost burden.
  • the action type estimation system holds multiple pieces of information, such as place correspondence tables, detailed correspondence tables, and place schedules, and selects and refers to the appropriate one according to the user's personal information, location, or time zone. Therefore, the behavior type estimation system according to the present embodiment can improve the accuracy of estimating the user's behavior.
  • the behavior of a building user can be estimated, and the amount of activity determined, with only a low-cost device such as a beacon, rather than a comparatively expensive device such as a wearable terminal, which is useful. It is difficult to accurately distinguish between climbing and descending stairs using an acceleration sensor alone; smartphones therefore often use a barometric pressure sensor for this purpose. In the action type estimation system according to the present embodiment, climbing and descending can be distinguished with only a device such as a beacon, provided the floors on which the user stayed before and after the stairs can be identified.
  • Embodiment 2. In this embodiment, the points that differ from the first embodiment will be mainly described.
  • the same components as those in the first embodiment may be designated by the same reference numerals, and the description thereof may be omitted.
  • the configuration of the action type estimation device 100a according to the present embodiment will be described with reference to FIG.
  • the action type estimation device 100a according to the present embodiment includes a user schedule 156 in addition to the configuration of the action type estimation device 100 according to the first embodiment.
  • FIG. 12 is a diagram showing an example of the user schedule 156 according to the present embodiment.
  • a date and time 561, a place 562, and a work content 563 are set as a work schedule 564 of the user 20.
  • the work schedule 564 is a daily schedule of the user 20, and is composed of a date and time 561, a place 562, and a work content 563.
  • an employee number and an employee name that identify the user 20 are set in the user schedule 156.
  • the user schedule 156 may be set with an action type corresponding to the work content 563.
  • the personal information 151 and the user schedule 156 are described as separate pieces of information, but they may be combined into a single piece of information.
  • the information update unit 140 also updates the user schedule 156 based on the update request from the user. That is, the information updating unit 140 updates at least one of the location correspondence table 152, the detailed correspondence table 153, the location schedule 154, and the user schedule 156.
  • in step S301a, the behavior estimation unit 120 acquires the user schedule 156. Specifically, the behavior estimation unit 120 selects the user schedule 156 corresponding to the user 20.
  • in step S301b, the action estimation unit 120 uses the current time, the position of the user 20, and the user schedule 156 to determine whether or not the user 20 is in a place that matches the work content 563. If it is determined that the user 20 is in a place matching the work content 563, the process proceeds to step S301c.
  • in step S301, the behavior estimation unit 120 estimates the behavior of the user 20 based on the place action type 522 corresponding to the place area 521 that matches the position of the user 20 in the place correspondence table 152.
  • the processing of steps S301 to S303 is the same as that of the first embodiment.
  • in step S301c, the action estimation unit 120 estimates the action of the user 20 based on the work content 563 that matches the user 20's work schedule. For example, suppose the current time is acquired as 9:30 on March 29, 2019 and the position of the user 20 is conference room A. The behavior estimation unit 120 then determines from the user schedule 156 that the user 20 is giving a presentation, and estimates the behavior of the user 20 as "standing".
  • Embodiment 3. In this embodiment, the points that differ from the first embodiment will be mainly described.
  • the same components as those in the first embodiment may be designated by the same reference numerals, and the description thereof may be omitted.
  • the configuration of the action type estimation device is the same as the configuration of the action type estimation device 100 according to the first embodiment.
  • the user's action type includes the type of the user's exercise.
  • the action type of the user is a type of exercise such as standing, sitting, walking, running, or fast walking.
  • the action history generation unit 130 calculates the amount of exercise of the user based on the action type of the user.
  • FIG. 14 is a flow chart showing the operation of the action type estimation device 100 according to the present embodiment.
  • FIG. 14 corresponds to FIG. 7 of the first embodiment. The differences from FIG. 7 are that step S204a is added and that step S205 becomes step S205a.
  • in step S204, the action history generation unit 130 determines whether or not the estimated behavior of the user 20 is the same as the previous behavior. If they are not the same, that is, if the behavior of the user 20 has changed, the process proceeds to step S204a.
  • in step S204a, the action history generation unit 130 calculates the user's amount of exercise based on the user's action type. Specifically, the action history generation unit 130 calculates, as the amount of exercise, the calories consumed during the completed action of the user 20.
  • as a method of calculating calories burned, there is a method that uses a coefficient set in advance for each behavior type, for example a METS (metabolic equivalents) value.
  • the action history generation unit 130 calculates the calorie consumption by multiplying the METS value corresponding to the type of completed action by the weight of the user and the duration of the action type.
  • the duration of the action type is calculated as the time from the start time of the completed action type to the current time.
  • in step S205a, the action history generation unit 130 records the new action and its start time in the action history information 155, and also records the calories burned during the completed action.
  • FIG. 15 is a diagram showing an example of action history information 155a according to the present embodiment.
  • in the action history information 155a, the calorie consumption corresponding to each action is stored as the amount of exercise.
  • the action history information 155a may also record a cumulative value of the amount of exercise.
  • each part of the action type estimation device has been described as an independent functional block.
  • the configuration of the action type estimation device does not have to be the configuration as in the above-described embodiment.
  • the functional block of the action type estimation device may have any configuration as long as the functions described in the above-described embodiment can be realized.
  • the action type estimation device may be a system composed of a plurality of devices instead of one device.
  • a plurality of these parts may be implemented in combination.
  • only one part of these embodiments may be implemented.
  • these embodiments may be implemented in any combination, in whole or in part. That is, the embodiments may be freely combined, any component of each embodiment may be modified, and any component may be omitted in each embodiment.
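As a concrete illustration of the calorie calculation described in the bullets above (a METS coefficient per behavior type, multiplied by the user's weight and the duration of the action), the following sketch may help. The METS values and helper names here are illustrative assumptions, not values from the document.

```python
# Hypothetical sketch of the calorie calculation described above:
# calories burned = METS value for the action type x body weight [kg] x duration [h].
# The METS values below are illustrative; a real system would configure them.

METS_BY_ACTION = {
    "sitting": 1.3,
    "standing": 1.8,
    "walking": 3.0,
    "fast walking": 4.3,
    "running": 7.0,
}

def calories_burned(action_type, weight_kg, duration_hours):
    """Return calories (kcal) for a completed action, per the METS method."""
    mets = METS_BY_ACTION[action_type]
    return mets * weight_kg * duration_hours

# Example: 30 minutes of walking for a 60 kg user.
print(round(calories_burned("walking", 60.0, 0.5), 1))  # 90.0
```

The duration would be obtained, as the text describes, from the start time of the completed action type up to the current time.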

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Physics & Mathematics (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Game Theory and Decision Science (AREA)
  • Development Economics (AREA)
  • Data Mining & Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

In the present invention, a position estimation unit (110) estimates the position of a user (20) in an indoor space (30). A behavior estimation unit (120) acquires a location mapping table (152) in which are configured: location areas, which represent locations in the indoor space (30); and location behavior types, which are behavior types of the user in the location areas of the indoor space (30). The behavior estimation unit (120) acquires, from the location mapping table (152), the location behavior type which corresponds to the location area matching the position of the user (20), and estimates the behavior of the user (20) on the basis of the acquired location behavior type. A behavior history generation unit (130) uses the results of estimating the behavior of the user (20) to generate a history of user behavior as behavior history information (155).

Description

Behavior type estimation device, behavior type estimation method, and behavior type estimation program
 The present invention relates to a behavior type estimation device, a behavior type estimation method, and a behavior type estimation program. In particular, it relates to a behavior type estimation device, a behavior type estimation method, and a behavior type estimation program that estimate a user's behavior indoors.
 In recent years, there has been an increasing movement to support employees in improving their health. Along with this movement, techniques for automatically discriminating a person's behavior or amount of exercise have been disclosed.
 Patent Document 1 discloses a technique for increasing the variations of user behavior types that can be discriminated, using multiple data detected by a wearable sensor worn by the user. Specifically, it discloses a technique for further subdividing behaviors classified by the user's acceleration according to the user's heart rate.
Japanese Unexamined Patent Publication No. 2014-212915
 In the technique of Patent Document 1, a sensor that detects the user's movement, such as an accelerometer, is indispensable. The user must therefore wear a wearable sensor or carry a smartphone terminal or the like. Further, when providing a service based on information transmitted from wearable sensors, the service provider must aggregate this information, and the power consumed by continuous data uploads becomes an issue. In the case of BYOD (bring your own device), procedures such as obtaining each user's permission for data aggregation also become an issue. Furthermore, when determining the behavior types of many users, as many wearable sensors or smartphone terminals as there are users must be prepared, which imposes a heavy cost burden.
 An object of the present invention is to discriminate behavior types at low cost and with little burden on the user.
 The behavior type estimation device according to the present invention includes:
 a position estimation unit that estimates the position of a user indoors;
 a behavior estimation unit that acquires a place correspondence table in which a place area representing an indoor place and a place action type, which is the action type of the user in that place area, are set, acquires from the place correspondence table the place action type corresponding to the place area that matches the position of the user, and estimates the behavior of the user based on the acquired place action type; and
 a behavior history generation unit that generates a history of the behavior of the user as behavior history information, using the result of estimating the behavior of the user.
 The behavior type estimation device according to the present invention has the effect that the behavior type can be discriminated at low cost and with little burden on the user.
FIG. 1 is a configuration diagram of the action type estimation system according to Embodiment 1.
FIG. 2 is a flow chart showing the operation of the information update unit according to Embodiment 1.
FIG. 3 is an example of personal information according to Embodiment 1.
FIG. 4 is an example of the place correspondence table according to Embodiment 1.
FIG. 5 is an example of the detailed correspondence table according to Embodiment 1.
FIG. 6 is an example of the place schedule according to Embodiment 1.
FIG. 7 is a flow chart showing the operation of the action type estimation device according to Embodiment 1.
FIG. 8 is a flow chart showing the operation of the action estimation unit according to Embodiment 1.
FIG. 9 is an example of action history information according to Embodiment 1.
FIG. 10 is a configuration diagram of the action type estimation system according to a modification of Embodiment 1.
FIG. 11 is a configuration diagram of the action type estimation device according to Embodiment 2.
FIG. 12 is an example of the user schedule according to Embodiment 2.
FIG. 13 is a flow chart showing the operation of the action estimation unit according to Embodiment 2.
FIG. 14 is a flow chart showing the operation of the action type estimation device according to Embodiment 3.
FIG. 15 is an example of action history information according to Embodiment 3.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings. In each figure, the same or corresponding parts are designated by the same reference numerals. In the description of the embodiments, the description of the same or corresponding parts will be omitted or simplified as appropriate.
 Embodiment 1.
*** Explanation of configuration ***
 The configuration of the action type estimation system 500 according to the present embodiment will be described with reference to FIG. 1.
 The action type estimation system 500 includes a user communication device 21, an indoor communication device 22, and an action type estimation device 100.
 The action type estimation device 100 estimates the behavior of the user 20 in the indoor space 30. For example, the indoor space 30 is the inside of the building of the company where the user 20 works. The user 20 is an employee working in the indoor space 30, that is, inside the company building. The company determines the amount of exercise of each employee by analyzing the behavior estimated by the behavior type estimation device 100, and thereby supports improvement of employee health.
 The user communication device 21 is a device that transmits radio waves by short-range wireless communication, such as a beacon. Specifically, the user communication device 21 is built into an item carried by the user 20, such as an employee ID card, a name plate attached to the employee ID card, or a key chain. The user communication device 21 is lower in cost than a wearable terminal or a mobile terminal device, and the burden on the user 20 of carrying it is small.
 Since the radio waves transmitted by the user communication device 21 may simply be beacon information, the user 20 does not need to transmit them intentionally, for example by operating an application, and there is no need to obtain the user 20's permission for transmission.
 A plurality of indoor communication devices 22 are installed in an indoor space 30 such as a company building. The indoor communication device 22 receives radio waves transmitted from the short-range wireless user communication device 21 carried by the user 20, and transmits them as a signal 31 to the action type estimation device 100. For example, the indoor communication devices 22 are installed in the indoor space 30 at intervals of several meters to several tens of meters, or at the entrances and exits of places in the indoor space 30. A place in the indoor space 30 is, for example, an area for each department or section, a reference room, a dining room, a washroom, a corridor, a staircase, an elevator, or a conference room.
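The document leaves the positioning algorithm of the position estimation unit 110 unspecified; one minimal possibility, assuming each indoor communication device 22 reports the received signal strength (RSSI) of the beacon it hears, is to place the user at the best-receiving device. The receiver names, layout, and RSSI values below are purely illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of position estimation from the signals 31.
# Assumption (not stated in the document): each indoor communication device
# reports the RSSI of the user communication device's beacon, and the user is
# placed at the coordinates of the receiver that hears the beacon loudest.

RECEIVER_POSITIONS = {   # illustrative receiver layout, in the indoor coordinate system
    "recv-corridor": (1.5, 1.0),
    "recv-dining": (4.5, 3.0),
    "recv-library": (1.5, 4.0),
}

def estimate_position(rssi_by_receiver):
    """Return the coordinates of the receiver with the strongest RSSI."""
    strongest = max(rssi_by_receiver, key=rssi_by_receiver.get)
    return RECEIVER_POSITIONS[strongest]

# RSSI is in dBm, so the value closest to zero is the strongest signal.
print(estimate_position({"recv-corridor": -70.0, "recv-dining": -55.0, "recv-library": -80.0}))
# (4.5, 3.0)
```

Denser receiver placement, as described above (every few meters, or at the entrances and exits of each place), directly improves the resolution of this kind of nearest-receiver estimate.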
 Further, the user 20 may have a personal terminal 23. The personal terminal 23 is a terminal used by the user 20, specifically a device such as a PC (Personal Computer), a smartphone, a tablet terminal, or a mobile phone terminal. The user 20 uses the personal terminal 23 to perform processing such as updating various information or viewing action history information.
 The action type estimation device 100 is a computer. The action type estimation device 100 includes a processor 910 and other hardware such as a memory 921, an auxiliary storage device 922, an input interface 930, an output interface 940, and a communication device 950. The processor 910 is connected to the other hardware via signal lines and controls the other hardware.
 The action type estimation device 100 includes, as functional elements, a position estimation unit 110, an action estimation unit 120, an action history generation unit 130, an information update unit 140, and a storage unit 150. The storage unit 150 stores personal information 151, a place correspondence table 152, a detailed correspondence table 153, a place schedule 154, and action history information 155.
 The functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 are realized by software. The storage unit 150 is provided in the memory 921.
 The processor 910 is a device that executes the action type estimation program. The action type estimation program is a program that realizes the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140.
 The processor 910 is an IC (Integrated Circuit) that performs arithmetic processing. Specific examples of the processor 910 are a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit).
 The memory 921 is a storage device that temporarily stores data. Specific examples of the memory 921 are an SRAM (Static Random Access Memory) and a DRAM (Dynamic Random Access Memory).
 The auxiliary storage device 922 is a storage device that stores data. A specific example of the auxiliary storage device 922 is an HDD. The auxiliary storage device 922 may also be a portable storage medium such as an SD (registered trademark) memory card, CF, NAND flash, flexible disk, optical disc, compact disc, Blu-ray (registered trademark) disc, or DVD. HDD is an abbreviation for Hard Disk Drive. SD (registered trademark) is an abbreviation for Secure Digital. CF is an abbreviation for CompactFlash (registered trademark). DVD is an abbreviation for Digital Versatile Disc.
 The input interface 930 is a port connected to an input device such as a mouse, a keyboard, or a touch panel. Specifically, the input interface 930 is a USB (Universal Serial Bus) terminal. The input interface 930 may also be a port connected to a LAN (Local Area Network).
 The output interface 940 is a port to which a cable of an output device such as a display is connected. Specifically, the output interface 940 is a USB terminal or an HDMI (registered trademark) (High-Definition Multimedia Interface) terminal. The display is specifically an LCD (Liquid Crystal Display). The action type estimation device 100 may display the action history information 155 on the display via the output interface 940.
 The communication device 950 has a receiver and a transmitter. The communication device 950 is wirelessly connected to a communication network such as a LAN, the Internet, or a telephone line. Specifically, the communication device 950 is a communication chip or a NIC (Network Interface Card). The action type estimation device 100 acquires the signal 31 from the indoor communication device 22 via the communication device 950. The action type estimation device 100 may also transmit the action history information 155 to the personal terminal 23 of the user 20 via the communication device 950.
 The action type estimation program is read into the processor 910 and executed by the processor 910. The memory 921 stores not only the action type estimation program but also an OS (Operating System). The processor 910 executes the action type estimation program while executing the OS. The action type estimation program and the OS may be stored in the auxiliary storage device 922, in which case they are loaded into the memory 921 and executed by the processor 910. Part or all of the action type estimation program may be incorporated in the OS.
 The action type estimation device 100 may include a plurality of processors in place of the processor 910. These processors share the execution of the action type estimation program. Like the processor 910, each of these processors is a device that executes the action type estimation program.
 Data, information, signal values, and variable values used, processed, or output by the action type estimation program are stored in the memory 921, the auxiliary storage device 922, or a register or cache memory in the processor 910.
 The "unit" in each of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 may be read as "process", "procedure", or "step". Likewise, the "process" in each of the position estimation process, the action estimation process, the action history generation process, and the information update process may be read as "program", "program product", or "computer-readable storage medium on which the program is recorded".
 The action type estimation program causes a computer to execute each process, procedure, or step obtained by reading the "unit" of each of the above units as "process", "procedure", or "step". The action type estimation method corresponds to the procedures obtained by reading the "unit" of each of the above units as "procedure".
 The action type estimation program may be provided stored in a computer-readable recording medium, or may be provided as a program product.
*** Explanation of operation ***
 Next, the operations of the action type estimation system 500 and the action type estimation device 100 according to the present embodiment will be described. The operation procedure of the action type estimation device 100 corresponds to the action type estimation method, and the program that realizes the operation of the action type estimation device 100 corresponds to the action type estimation program.
<Information update process>
 The operation of the information update unit 140 according to the present embodiment will be described with reference to FIG. 2.
 The information update unit 140 updates at least one of the personal information 151, the place correspondence table 152, the detailed correspondence table 153, and the place schedule 154.
 In step S101, the information update unit 140 determines whether or not there is an update request for the information stored in the storage unit 150. Specifically, the information update unit 140 determines whether or not an update request has been received from the personal terminal 23 of the user 20. The update request includes the information to be updated and the update contents. A concrete case is that of a user 20 who until now sat while browsing in the reference room but will from now on stand while browsing for the sake of health, and who therefore changes the place correspondence table 152 or the detailed correspondence table 153.
 When there is an update request, in step S102 the information update unit 140 updates the information stored in the storage unit 150 based on the update request. When there is no update request, the process returns to step S101.
 FIG. 3 is an example of the personal information 151 according to the present embodiment.
 Information about individual employees is set in the personal information 151. For example, personal information such as an employee number, an employee name, an affiliated department, and an extension number is set in the personal information 151.
FIG. 4 is an example of the place correspondence table 152 according to the present embodiment.
In the place correspondence table 152, a place area 521 representing a place in the indoor space 30 and a place action type 522, which is the action type of the user 20 in that place area 521, are set.
As shown in FIG. 4, a coordinate system is set for the indoor space 30, and the place area 521 is represented by coordinates in the place correspondence table 152. Alternatively, the place area 521 may be represented by the name of a place in the indoor space 30; in that case, a separate table associating the coordinate system with place names is provided. In the place action type 522, the type of action mainly performed by the user 20 in the place area 521 is set. Specifically, in the corridor represented by the coordinates [0,0], [3,0], [3,2], [0,2], the user 20 is mainly walking, so "walking" is set. In the dining room represented by the coordinates [3,0], [6,0], [6,6], [3,6], the user 20 is mainly sitting, so "sitting" is set. In the reference room represented by the coordinates [0,2], [3,2], [3,6], [0,6], the user 20 is mainly standing, so "standing" is set.
Since the action type at a given place is considered to differ from user to user, the place correspondence table 152 is preferably provided for each user. As described above, a user 20 who had previously set the place action type 522 for the reference room to "sitting" may reset the place correspondence table 152 so as to stand while browsing for health reasons. In this way, the action type at a place differs depending on the user 20.
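As a non-limiting sketch, the lookup against the place correspondence table 152 can be illustrated as follows. The rectangular regions mirror the corridor, dining room, and reference room of FIG. 4; the function and variable names are illustrative assumptions, not part of the specification.

```python
# Sketch of a place correspondence table (FIG. 4): each entry maps an
# axis-aligned rectangle (x_min, y_min, x_max, y_max) in the indoor
# coordinate system to the place action type mainly performed there.
PLACE_TABLE = [
    ((0, 0, 3, 2), "walking"),   # corridor
    ((3, 0, 6, 6), "sitting"),   # dining room
    ((0, 2, 3, 6), "standing"),  # reference room
]

def place_action_type(x, y, table=PLACE_TABLE):
    """Return the place action type for the region containing (x, y),
    or None when the position matches no place area."""
    for (x0, y0, x1, y1), action in table:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return action  # first matching region wins on shared borders
    return None
```

For example, a user detected at (1, 1) falls inside the corridor rectangle, so "walking" is returned.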
FIG. 5 is an example of the detailed correspondence table 153 according to the present embodiment.
In the detailed correspondence table 153, a detailed area 531 that subdivides a place area 521 included in the place correspondence table 152 and a detailed action type 532, which is the action type corresponding to the detailed area 531, are set. Areas and action types are set in the same manner as in the place correspondence table 152. Specifically, the reference room is subdivided into a bookshelf area and a reading desk area. In the detailed correspondence table 153 for the reference room, the user 20 mainly stands in the bookshelf area represented by the coordinates [5,0], [6,0], [6,6], [3,6], so "standing" is set. In the reading desk area represented by the coordinates [3,0], [5,0], [5,6], [3,6], the user 20 is mainly sitting, so "sitting" is set.
FIG. 6 is an example of the place schedule 154 according to the present embodiment.
In the place schedule 154, schedules for place areas are set. Specifically, contents such as "a standing party in the dining room from 12:00 to 15:00 on March 29, 2019" are set.
The operation of the action type estimation device 100 according to the present embodiment will be described with reference to FIG. 7.
<Position estimation processing>
In step S201, the position estimation unit 110 estimates the position of the user 20 in the indoor space 30. The position of the user 20 may be expressed in a local coordinate system set for the indoor space 30, or by a name that identifies a position in the indoor space 30; any information may be used as long as it identifies a position in the indoor space 30. The position estimation unit 110 acquires a signal 31 communicated between a plurality of indoor communication devices 22 for short-range wireless communication installed in the indoor space 30 and a user communication device 21 for short-range wireless communication carried by the user 20, and estimates the position of the user 20 from the acquired signal 31.
Specifically, the position of the user 20 is estimated by the plurality of indoor communication devices 22 installed in the indoor space 30 receiving radio waves transmitted from the user communication device 21 carried by the user 20. Each indoor communication device 22 receives radio waves from the user communication device 21 and transmits the received radio waves to the position estimation unit 110 as a signal 31. The position estimation unit 110 receives the signal 31 via the communication device 950 and estimates the position of the user 20 based on the signal 31.
In step S202, the position estimation unit 110 determines whether or not the position of the user 20 has been successfully acquired. If successful, the process proceeds to step S203. If unsuccessful, the process returns to step S201.
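A minimal sketch of steps S201 and S202, under the simplifying assumption that the user's position is approximated by the position of the indoor communication device 22 reporting the strongest received signal 31; a real deployment would combine the signals of several receivers. All identifiers are illustrative assumptions.

```python
def estimate_position(rssi_by_receiver, receiver_positions):
    """Return an estimated (x, y) position for the user, or None when no
    signal 31 was acquired (step S202 then returns the flow to S201).

    rssi_by_receiver:   dict mapping receiver id -> received signal strength
    receiver_positions: dict mapping receiver id -> (x, y) position
    """
    if not rssi_by_receiver:
        return None  # acquisition failed
    # Take the receiver with the strongest (least negative) RSSI.
    strongest = max(rssi_by_receiver, key=rssi_by_receiver.get)
    return receiver_positions[strongest]
```

For example, with receivers at (0, 0) and (4, 2) reporting -70 dBm and -55 dBm, the user is placed at (4, 2).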
<Behavior estimation processing>
In step S203, the action estimation unit 120 estimates the action of the user 20 based on the position of the user 20 acquired by the position estimation unit 110 and the information stored in the storage unit 150. The information stored in the storage unit 150 is at least one of the place correspondence table 152, the detailed correspondence table 153, and the place schedule 154.
The operation of the action estimation unit 120 according to the present embodiment will be described with reference to FIG. 8.
In step S301, the action estimation unit 120 acquires the place correspondence table 152 and acquires from it the place action type 522 corresponding to the place area 521 that matches the position of the user 20. The action estimation unit 120 estimates the action of the user 20 based on the acquired place action type 522. At this time, if a place correspondence table 152 is set for each user, the action estimation unit 120 selects the one corresponding to the user 20. The place correspondence table 152 may also be set for each job title or role. In that case, the action estimation unit 120 acquires the job title or role of the user 20 from the personal information 151 and selects the place correspondence table 152 corresponding to that job title or role.
In step S302, the action estimation unit 120 acquires the detailed correspondence table 153, in which a detailed area 531 subdividing a place area 521 included in the place correspondence table 152 and the detailed action type 532 corresponding to the detailed area 531 are set. The action estimation unit 120 acquires from the detailed correspondence table 153 the detailed action type corresponding to the detailed area 531 that matches the position of the user 20. The action estimation unit 120 then estimates the action of the user 20 based on the place action type 522 and the detailed action type 532 that match the position of the user 20.
In step S303, the action estimation unit 120 acquires the place schedule 154, in which schedules for place areas are set. Using the current time and the place schedule 154, the action estimation unit 120 estimates the action of the user 20 based on the place action type 522, the detailed action type 532, and the current schedule set in the place schedule 154.
Specifically, the processes from step S301 to step S302 are as follows.
The action estimation unit 120 may estimate the action of the user 20 based only on the place action type 522. No detailed correspondence table 153 exists for place areas such as corridors or elevators. For place areas that have neither a detailed correspondence table 153 nor a place schedule 154, the action estimation unit 120 estimates the action of the user 20 based only on the place action type 522. For example, when the user 20 is in a corridor, the action is estimated to be "walking"; when in an elevator, it is estimated to be "standing".
Further, the action estimation unit 120 estimates the action of the user 20 based on the place action type 522 and the detailed action type 532 that match the position of the user 20. For example, when the position estimation unit 110 detects the user 20 in the bookshelf area of the reference room, the place action type 522 is "standing" and the detailed action type 532 is "sitting". In this case, the action estimation unit 120 gives priority to the detailed correspondence table 153 and estimates the action of the user 20 as "sitting".
Further, when a place schedule 154 exists for the place corresponding to the position of the user 20, the action estimation unit 120 takes the place schedule 154 into account when estimating the action of the user 20. For example, when the position estimation unit 110 detects the user 20 in the dining table area of the dining room at 13:00 on March 29, 2019, the place action type 522 is "sitting" and the detailed action type 532 is also "sitting". However, according to the place schedule 154, a standing party is held in the dining room from 12:00 to 15:00 on March 29, 2019, so the action estimation unit 120 gives priority to the place schedule 154 and estimates the action of the user 20 as "standing".
Note that in the place schedule 154, an action type may be set in association with the contents of each schedule. Alternatively, a table such as a schedule-specific action type table, which associates schedule contents with action types, may be stored in the storage unit 150. In such a table, for example, "standing" is associated with a standing party.
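The priority order of steps S301 to S303 — a current schedule entry overrides the detailed action type 532, which in turn overrides the place action type 522 — can be sketched as follows. The function name and the use of None to mean "no table entry exists" are illustrative assumptions.

```python
def estimate_action(place_type, detail_type=None, schedule_type=None):
    """Resolve the user's action with the priority implied by S301-S303:
    place schedule 154 > detailed correspondence table 153 > place
    correspondence table 152. None means the corresponding table has no
    entry for the user's current position and time."""
    if schedule_type is not None:
        return schedule_type   # a current schedule (e.g. standing party) wins
    if detail_type is not None:
        return detail_type     # detailed table overrides the place table
    return place_type          # fall back to the place action type
```

The bookshelf example above resolves as `estimate_action("standing", detail_type="sitting")`, giving "sitting"; the standing-party example resolves as `estimate_action("sitting", detail_type="sitting", schedule_type="standing")`, giving "standing".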
Further, the action estimation unit 120 may estimate the action of the user 20 based on transitions of the action type of the user 20 along the time axis, or transitions of the position of the user 20 along the time axis.
As a specific example, the action estimation unit 120 can estimate that the user 20 took the action "fast walking" because the position of the user 20 moved 10 m in 5 seconds. Alternatively, because the user 20 took the action type "sitting" immediately before, the action estimation unit 120 can estimate that the possibility of taking the action "running" is low.
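The speed-based example above can be sketched as follows. The 1.5 m/s and 2.5 m/s thresholds are illustrative assumptions; the specification does not fix numeric boundaries.

```python
def classify_movement(distance_m, elapsed_s):
    """Classify movement from a positional transition along the time axis,
    as in the example above (10 m in 5 s -> fast walking).
    Thresholds are assumed values for illustration only."""
    if elapsed_s <= 0:
        raise ValueError("elapsed time must be positive")
    speed = distance_m / elapsed_s  # average speed in m/s
    if speed >= 2.5:
        return "running"
    if speed >= 1.5:
        return "fast walking"
    if speed > 0:
        return "walking"
    return "stationary"
```

With these thresholds, 10 m in 5 s (2.0 m/s) classifies as "fast walking".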
<Behavior history generation process>
The action history generation unit 130 generates a history of the actions of the user 20 as action history information 155, using the estimation results of the actions of the user 20. Specifically, this is done as follows.
In step S204, the action history generation unit 130 determines whether or not the estimated action of the user 20 is the same as the previous action. If it is the same, that is, there is no change in the action of the user 20, the process returns to step S201. If it is not the same, that is, the action of the user 20 has changed, the process proceeds to step S205.
In step S205, the action history generation unit 130 records the new action and its start time in the action history information 155. Specifically, the action history generation unit 130 records the current time as the start time of the new action. The action history generation unit 130 may also record the current time as the end time of the action that has just ended.
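Steps S204 and S205 can be sketched as follows: a record is appended only when the estimated action differs from the previous one, and the previous record is closed out with the current time. The class and field names are illustrative assumptions.

```python
class ActionHistory:
    """Sketch of the action history generation of steps S204-S205."""

    def __init__(self):
        self.records = []        # each record: {"action", "start", "end"}
        self._last_action = None

    def update(self, action, now):
        """Record a new action only when it differs from the previous one.
        Returns True when a new record was appended (S205), False when the
        action is unchanged (S204 returns to S201)."""
        if action == self._last_action:
            return False
        if self.records:
            self.records[-1]["end"] = now   # end time of the finished action
        self.records.append({"action": action, "start": now, "end": None})
        self._last_action = action
        return True
```

Feeding the same action repeatedly produces no new records; only transitions (e.g. "walking" to "sitting") are stored.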
FIG. 9 is an example of the action history information 155 according to the present embodiment.
In the action history information 155, a time, a place, and an action as an estimation result are set for each employee who is a user 20. As a recording method for the action history information 155, the duration of each action may be aggregated and recorded for each employee; any recording method may be used as long as actions can be recorded for each user 20.
*** Other configurations ***
<Modification 1>
In the present embodiment, the description assumed that the position estimation unit estimates the position of the user by means of beacons. However, any position estimation method other than beacons may be used as long as the position of the user indoors can be estimated.
For example, the position estimation unit may estimate the user's position using history information of gate passages from an ACS (Access Control System). The position estimation unit may also estimate the position of the user using elevator functions; specifically, it uses elevator control signals. Alternatively, a beacon receiver may be installed in the elevator so that it can be determined that the user is in the elevator car. The position estimation unit may also estimate the position of the user using the person identification function of a surveillance camera.
Furthermore, the position estimation unit may estimate the user's position by combining beacons, the ACS, elevators, and surveillance cameras.
Here, position estimation using the ACS, an entry/exit management system, will be described. The ACS is a security system that unlocks a gate or door when, for example, an employee ID card is held over a reader upon entering company premises or a specific room. At that time, a log is kept of when each individual was at each location. By using these logs, a position such as "dining room, crossing corridor, work room" can be estimated from log entries such as "dining room exit, work room entry". The estimated position can then be used to estimate an action type such as "sitting work, walking, standing work". The movement speed can also be calculated from the timestamps of logs between ACS gates.
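The transit inference from ACS logs described above can be sketched as follows, assuming the distance between the two gates is known. The function name, the fixed "walking" label, and the dictionary layout are illustrative assumptions.

```python
def infer_transit(exit_time_s, entry_time_s, distance_m):
    """Between a gate-exit log and the next gate-entry log the user is
    assumed to be moving (e.g. through a crossing corridor). The log
    timestamps also yield an average movement speed, as noted above.

    exit_time_s / entry_time_s: gate timestamps in seconds
    distance_m: assumed known distance between the two gates
    """
    duration = entry_time_s - exit_time_s
    if duration <= 0:
        raise ValueError("entry must follow exit")
    return {
        "action": "walking",
        "duration_s": duration,
        "speed_m_per_s": distance_m / duration,
    }
```

For example, exiting the dining room at t=0 s and entering the work room 100 m away at t=50 s gives an average speed of 2.0 m/s for the intervening corridor segment.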
<Modification 2>
In the present embodiment, the position estimation unit estimates the position of the user with the plurality of indoor communication devices for short-range wireless communication installed indoors acting as receivers and the user communication device for short-range wireless communication carried by the user acting as a beacon transmitter. However, the indoor communication devices installed indoors may instead act as transmitters and the user communication device carried by the user as a receiver. In that case, the position estimation unit may acquire from the user communication device the signals transmitted by the indoor communication devices and received by the user communication device, and estimate the position of the user. A specific example of such a user communication device is a mobile terminal such as a smartphone or tablet.
<Modification 3>
In the present embodiment, when there is no place correspondence table for each user, the action estimation unit acquires the user's job title or role from the personal information and selects the place correspondence table corresponding to that job title or role. When there is no personal information, the job title or role of the user 20 may be acquired by referring to a table of attributes such as job titles or roles.
<Modification 4>
In the present embodiment, the information update unit receives an update request from the user and updates information such as the place correspondence table or the detailed correspondence table. However, the action type estimation device may instead automatically create or update the place correspondence table or the detailed correspondence table by collecting and learning activity records for each indoor position of each user.
<Modification 5>
The action estimation unit may be provided with the following functions to improve the estimation accuracy of the user's actions.
(1) The positions at the start and end of a movement may be known while the position during the movement is unknown. In such a case, the action estimation unit estimates the position during the movement using information available on the building side, such as elevator operation history and ACS gate passage history, together with the travel time. The action estimation unit then selects a place correspondence table, detailed correspondence table, or place schedule using the estimated position, and estimates the action.
(2) A user who has stopped near his or her own seat is likely to be sitting, whereas a user who has stopped near another person's seat is likely to be standing. Therefore, when the user has stopped near his or her own seat, the action is estimated to be "sitting"; when the user has stopped near another person's seat, the action is estimated to be "standing".
(3) When the position information is unknown for a certain period of time or longer, the action estimation unit extracts a person whose action history for that day is similar, and estimates the action on the assumption that the user is behaving in the same way as that person.
(4) The action estimation unit estimates the user's actions taking into account information such as the time of year, the day of the week, arrival and departure times, and the schedules of the previous and following days. As an example of the time of year, a place correspondence table and detailed correspondence table different from the usual ones are set at the end of a term or immediately after joining the company. As an example of the day of the week, when there is a schedule fixed to a particular day, a place correspondence table and detailed correspondence table corresponding to that schedule are set. As an example of arrival and departure times, the action estimation unit takes into account that behavior differs from usual when the user arrives late or leaves early. As an example of the schedules of the previous and following days, when a presentation is scheduled for the next day, the action estimation unit estimates that the user is standing in a conference room rehearsing the presentation during the previous day's meeting.
(5) When the user is moving on stairs, the action estimation unit estimates whether the user is going up or down by identifying the floors where the user stayed before and after the stairs.
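Heuristics (2) and (5) above can be sketched as follows. The function names, and the use of seat-owner identity as a proxy for "near one's own seat", are illustrative assumptions.

```python
def seat_action(user_id, seat_owner_id):
    """Heuristic (2): a user stopped near his or her own seat is likely
    sitting; near another person's seat, likely standing."""
    return "sitting" if user_id == seat_owner_id else "standing"

def stair_direction(floor_before, floor_after):
    """Heuristic (5): infer stair direction from the floors where the user
    stayed before and after the staircase."""
    if floor_after > floor_before:
        return "climbing stairs"
    if floor_after < floor_before:
        return "descending stairs"
    return "same floor"
```

Note that stair_direction needs no accelerometer or barometric sensor, only the floor identification already provided by the positioning described above.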
<Modification 6>
In the present embodiment, the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 are realized by software. As a modification, the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 may be realized by hardware.
FIG. 10 is a diagram showing a configuration of an action type estimation system 500 according to a modified example of the present embodiment.
The action type estimation device 100 includes an electronic circuit 909, a memory 921, an auxiliary storage device 922, an input interface 930, and an output interface 940.
The electronic circuit 909 is a dedicated electronic circuit that realizes the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140.
The electronic circuit 909 is specifically a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a GA, an ASIC, or an FPGA. GA is an abbreviation for Gate Array. ASIC is an abbreviation for Application Specific Integrated Circuit. FPGA is an abbreviation for Field-Programmable Gate Array.
The functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 may be realized by one electronic circuit or may be distributed to a plurality of electronic circuits.
As another modification, some functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 may be realized by an electronic circuit, and the remaining functions may be realized by software.
Each of the processor and the electronic circuit is also called processing circuitry. That is, in the action type estimation device 100, the functions of the position estimation unit 110, the action estimation unit 120, the action history generation unit 130, and the information update unit 140 are realized by processing circuitry.
*** Explanation of the effect of this embodiment ***
In the action type estimation system according to the present embodiment, the action type of the user is estimated solely from the user's indoor position, using the place correspondence table and the detailed correspondence table, which link indoor positions to action types. Therefore, according to the action type estimation system of the present embodiment, actions can be estimated without using a motion sensor such as an accelerometer, and the burden on the user of wearing a wearable terminal can be reduced. Furthermore, by reusing indoor positioning that is already in place, such as ordinary entry/exit management, information collection on the service provider side becomes easier and the cost burden can be further reduced.
The action type estimation system according to the present embodiment holds multiple sources of information, such as the place correspondence table, the detailed correspondence table, and the place schedule, and selects and refers to the appropriate information according to the user's personal information, location, or time period. Therefore, according to the action type estimation system of the present embodiment, the estimation accuracy of the user's actions can be improved.
In the action type estimation system according to the present embodiment, the actions of building users can be estimated and their activity levels assessed by having them carry a device such as a beacon, which is lower in cost than comparatively expensive equipment such as a wearable terminal. It is difficult to accurately distinguish between climbing and descending stairs with an acceleration sensor alone; smartphones often use a barometric pressure sensor in combination to make this determination. In the action type estimation system according to the present embodiment, even with the accuracy of a device such as a beacon, this determination is possible as long as the floors where the user stayed before and after the stairs can be identified.
Embodiment 2.
In this embodiment, the points that differ from the first embodiment will mainly be described. The same components as in the first embodiment are denoted by the same reference numerals, and their description may be omitted.
*** Explanation of configuration ***
The configuration of the action type estimation device 100a according to the present embodiment will be described with reference to FIG. The action type estimation device 100a according to the present embodiment includes a user schedule 156 in addition to the configuration of the action type estimation device 100 according to the first embodiment.
FIG. 12 is a diagram showing an example of the user schedule 156 according to the present embodiment.
In the user schedule 156, a date and time 561, a place 562, and work content 563 are set as a work schedule 564 of the user 20. The work schedule 564 is the daily schedule of the user 20 and consists of the date and time 561, the place 562, and the work content 563. An employee number and an employee name identifying the user 20 are also set in the user schedule 156. Furthermore, an action type corresponding to the work content 563 may be set in the user schedule 156.
In the present embodiment, the personal information 151 and the user schedule 156 are described as separate pieces of information, but they may be configured as a single piece of information.
 The information update unit 140 also updates the user schedule 156 based on an update request from the user. That is, the information update unit 140 updates at least one of the place correspondence table 152, the detailed correspondence table 153, the place schedule 154, and the user schedule 156.
*** Description of operation ***
 The operation of the behavior estimation unit 120 according to this embodiment will be described with reference to FIG. 13.
 In step S301a, the behavior estimation unit 120 acquires the user schedule 156. Specifically, the behavior estimation unit 120 selects the user schedule 156 corresponding to the user 20.
 In step S301b, the behavior estimation unit 120 uses the current time, the position of the user 20, and the user schedule 156 to determine whether the user 20 is at a place that matches the work content 563. If it determines that the user 20 is at a place matching the work content 563, the process proceeds to step S301c.
 If it determines that the user 20 is not at a place matching the work content 563, the process proceeds to step S301. In step S301, the behavior estimation unit 120 estimates the behavior of the user 20 based on the place behavior type 522 corresponding to the place area 521 in the place correspondence table 152 that matches the position of the user 20. The processing of steps S301 to S303 is the same as in Embodiment 1.
 In step S301c, the behavior estimation unit 120 estimates the behavior of the user 20 based on the work content 563 matching the work schedule of the user 20. As a specific example, suppose the current time is 9:30 on March 29, 2019, and the position of the user 20 is conference room A. From the user schedule 156, the behavior estimation unit 120 determines that the user 20 is giving a presentation and estimates the behavior of the user 20 as "standing".
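The determination in steps S301b and S301c can be sketched as follows (the schedule rows and function are hypothetical, modeled on the user schedule 156 of FIG. 12, not taken from the disclosure):

```python
from datetime import datetime

# Hypothetical entries modeled on user schedule 156:
# (start, end, place 562, work content 563, behavior type)
work_schedule = [
    (datetime(2019, 3, 29, 9, 0), datetime(2019, 3, 29, 10, 0),
     "conference room A", "presentation", "standing"),
]

def estimate_from_schedule(now, user_place):
    """Return the scheduled behavior type if the user is at the place
    matching the scheduled work content at the current time (step S301c);
    otherwise return None, i.e. fall back to the place correspondence
    table of Embodiment 1 (step S301)."""
    for start, end, place, _work, behavior in work_schedule:
        if start <= now < end and user_place == place:
            return behavior
    return None

print(estimate_from_schedule(datetime(2019, 3, 29, 9, 30), "conference room A"))  # standing
```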
*** Description of the effects of this embodiment ***
 In the behavior type estimation system according to this embodiment, if the user's position matches the user schedule, the user's behavior is estimated directly from the user schedule, which simplifies the processing. In addition, by making the user schedule more detailed, the estimation can be made more precise, for example distinguishing a user who is sitting and listening from one who is standing at the front and speaking. When the user is not at the scheduled place, the behavior can still be estimated with high accuracy by the method described in Embodiment 1.
 Embodiment 3.
 In this embodiment, the points that differ from Embodiment 1 will mainly be described. The same components as those in Embodiment 1 are given the same reference numerals, and their description may be omitted.
*** Description of configuration ***
 The configuration of the behavior type estimation device according to this embodiment is the same as that of the behavior type estimation device 100 of Embodiment 1.
 The behavior type of the user includes the type of the user's exercise. As described in Embodiment 1, the behavior type of the user is a type of exercise such as standing, sitting, walking, running, or fast walking.
 In this embodiment, the behavior history generation unit 130 calculates the amount of exercise of the user based on the behavior type of the user.
*** Description of operation ***
 FIG. 14 is a flowchart showing the operation of the behavior type estimation device 100 according to this embodiment. FIG. 14 corresponds to FIG. 7 of Embodiment 1 and differs from it in that step S204a is added and step S205 is replaced by step S205a.
 In step S204, the behavior history generation unit 130 determines whether the estimation result of the behavior of the user 20 is the same as the previous behavior. If it is not the same, that is, if the behavior of the user 20 has changed, the process proceeds to step S204a.
 In step S204a, the behavior history generation unit 130 calculates the amount of exercise of the user based on the behavior type of the user. Specifically, the behavior history generation unit 130 calculates, as the amount of exercise, the calories burned in the behavior of the user 20 that has just ended. One way to calculate calories burned is to use a coefficient set in advance for each behavior type, for example a METs (metabolic equivalents) value. The behavior history generation unit 130 calculates the calories burned by multiplying the METs value corresponding to the type of the ended behavior by the weight of the user and the duration of that behavior type. The duration of the behavior type is calculated as the time from the start time of the ended behavior type to the current time.
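The calorie computation in step S204a can be sketched as follows (the METs coefficients shown are illustrative placeholders, not values from the disclosure; some METs formulations also include a constant factor such as 1.05, omitted here):

```python
# Illustrative METs coefficients per behavior type (placeholders).
METS = {"sitting": 1.3, "standing": 1.8, "walking": 3.0, "running": 8.0}

def calories_burned(behavior: str, weight_kg: float, duration_h: float) -> float:
    """METs value x body weight (kg) x duration (h) ~= kcal burned."""
    return METS[behavior] * weight_kg * duration_h

# A 60 kg user who walked for 30 minutes:
print(calories_burned("walking", 60.0, 0.5))  # 90.0
```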
 In step S205a, the behavior history generation unit 130 records the new behavior and its start time in the behavior history information 155, and also records the calories burned in the ended behavior.
 FIG. 15 is a diagram showing an example of the behavior history information 155a according to this embodiment.
 As shown in FIG. 15, the behavior history information 155a stores the calories burned for each behavior as the amount of exercise. The behavior history information 155a may also record a cumulative value obtained by accumulating the amounts of exercise.
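Keeping the cumulative value alongside the per-behavior amounts can be sketched as follows (the row format is hypothetical, loosely modeled on FIG. 15):

```python
# Hypothetical history rows: (behavior type, calories burned as exercise amount)
history = [("walking", 12.0), ("sitting", 40.0), ("walking", 8.0)]

rows = []
total = 0.0
for behavior, kcal in history:
    total += kcal  # running total of the exercise amounts
    rows.append((behavior, kcal, total))

print(rows[-1])  # ('walking', 8.0, 60.0)
```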
 In Embodiments 1 and 2 above, each part of the behavior type estimation device has been described as an independent functional block. However, the configuration of the behavior type estimation device does not have to be as in the embodiments described above; the functional blocks may have any configuration as long as they can realize the functions described in the embodiments above. The behavior type estimation device may also be a system composed of a plurality of devices instead of a single device.
 A plurality of parts of Embodiments 1 and 2 may be implemented in combination, or only one part of these embodiments may be implemented. These embodiments may be implemented in any combination, in whole or in part.
 That is, in Embodiments 1 and 2, the embodiments may be freely combined, any component of each embodiment may be modified, and any component may be omitted in each embodiment.
 The embodiments described above are essentially preferred examples and are not intended to limit the scope of the present invention, the scope of what the present invention can be applied to, or the scope of its uses. The embodiments described above can be modified in various ways as needed.
 20 user, 21 user communication device, 22 indoor communication device, 23 personal terminal, 30 indoors, 31 signal, 100, 100a behavior type estimation device, 110 position estimation unit, 120 behavior estimation unit, 130 behavior history generation unit, 140 information update unit, 150 storage unit, 151 personal information, 152 place correspondence table, 153 detailed correspondence table, 154 place schedule, 155, 155a behavior history information, 156 user schedule, 561 date and time, 562 place, 563 work content, 564 work schedule, 500 behavior type estimation system, 521 place area, 522 place behavior type, 531 detailed area, 532 detailed behavior type, 909 electronic circuit, 910 processor, 921 memory, 922 auxiliary storage device, 930 input interface, 940 output interface, 950 communication device.

Claims (11)

  1.  A behavior type estimation device comprising:
      a position estimation unit that estimates a position of a user indoors;
      a behavior estimation unit that acquires a place correspondence table in which place areas representing indoor places and place behavior types, each of which is a behavior type of the user in the corresponding place area, are set, acquires from the place correspondence table the place behavior type corresponding to the place area matching the position of the user, and estimates the behavior of the user based on the acquired place behavior type; and
      a behavior history generation unit that generates a history of the behavior of the user as behavior history information using an estimation result of the behavior of the user.
  2.  The behavior type estimation device according to claim 1, wherein the position estimation unit acquires signals communicated between a plurality of indoor communication devices for short-range wireless communication installed indoors and a user communication device for short-range wireless communication carried by the user, and estimates the position of the user from the acquired signals.
  3.  The behavior type estimation device according to claim 1 or 2, wherein the behavior estimation unit acquires a detailed correspondence table in which detailed areas obtained by dividing a place area included in the place correspondence table and detailed behavior types, each of which is a behavior type corresponding to the corresponding detailed area, are set, acquires from the detailed correspondence table the detailed behavior type corresponding to the detailed area matching the position of the user, and estimates the behavior of the user based on the place behavior type matching the position of the user and the detailed behavior type matching the position of the user.
  4.  The behavior type estimation device according to claim 3, wherein the behavior estimation unit acquires a place schedule in which a schedule for the place area is set and, using a current time and the place schedule, estimates the behavior of the user based on the place behavior type, the detailed behavior type, and the current schedule set in the place schedule.
  5.  The behavior type estimation device according to claim 3 or 4, wherein the behavior estimation unit acquires a user schedule in which a date and time, a place, and work content are set as a work schedule of the user, determines, using the current time, the position of the user, and the user schedule, whether the user is at a place matching the work schedule, and, upon determining that the user is at a place matching the work schedule, estimates the behavior of the user based on the work content matching the work schedule of the user.
  6.  The behavior type estimation device according to claim 5, wherein, upon determining that the user is not at a place matching the work schedule, the behavior estimation unit estimates the behavior of the user based on the place behavior type corresponding to the place area in the place correspondence table that matches the position of the user.
  7.  The behavior type estimation device according to claim 6, further comprising an information update unit that updates at least one of the place correspondence table, the detailed correspondence table, the place schedule, and the user schedule.
  8.  The behavior type estimation device according to any one of claims 1 to 7, wherein the behavior estimation unit estimates the behavior of the user based on a transition of the behavior type of the user along the time axis, or a transition of the position of the user along the time axis.
  9.  The behavior type estimation device according to any one of claims 1 to 8, wherein the behavior type of the user includes a type of exercise of the user, and the behavior history generation unit calculates an amount of exercise of the user based on the behavior type of the user.
  10.  A behavior type estimation method comprising:
      estimating, by a position estimation unit, a position of a user indoors;
      acquiring, by a behavior estimation unit, a place correspondence table in which place areas representing indoor places and place behavior types, each of which is a behavior type of the user in the corresponding place area, are set, acquiring from the place correspondence table the place behavior type corresponding to the place area matching the position of the user, and estimating the behavior of the user based on the acquired place behavior type; and
      generating, by a behavior history generation unit, a history of the behavior of the user as behavior history information using an estimation result of the behavior of the user.
  11.  A behavior type estimation program that causes a computer to execute:
      a position estimation process of estimating a position of a user indoors;
      a behavior estimation process of acquiring a place correspondence table in which place areas representing indoor places and place behavior types, each of which is a behavior type of the user in the corresponding place area, are set, acquiring from the place correspondence table the place behavior type corresponding to the place area matching the position of the user, and estimating the behavior of the user based on the acquired place behavior type; and
      a behavior history generation process of generating a history of the behavior of the user as behavior history information using an estimation result of the behavior of the user.
PCT/JP2019/027340 2019-07-10 2019-07-10 Behavior type estimation device, behavior type estimation method, and behavior type estimation program WO2021005750A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020510616A JPWO2021005750A1 (en) 2019-07-10 2019-07-10 Behavior type estimation device, behavior type estimation method, and behavior type estimation program
PCT/JP2019/027340 WO2021005750A1 (en) 2019-07-10 2019-07-10 Behavior type estimation device, behavior type estimation method, and behavior type estimation program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/027340 WO2021005750A1 (en) 2019-07-10 2019-07-10 Behavior type estimation device, behavior type estimation method, and behavior type estimation program

Publications (1)

Publication Number Publication Date
WO2021005750A1 true WO2021005750A1 (en) 2021-01-14

Family

ID=74114133

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/027340 WO2021005750A1 (en) 2019-07-10 2019-07-10 Behavior type estimation device, behavior type estimation method, and behavior type estimation program

Country Status (2)

Country Link
JP (1) JPWO2021005750A1 (en)
WO (1) WO2021005750A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008090444A (en) * 2006-09-29 2008-04-17 Fujitsu Access Ltd Behavior display system
JP2014018570A (en) * 2012-07-23 2014-02-03 Takenaka Komuten Co Ltd Health management system and program
JP2014056410A (en) * 2012-09-12 2014-03-27 Ricoh Co Ltd Information processing device, action management system, and action management method
WO2017073421A1 (en) * 2015-10-30 2017-05-04 日本ビジネスシステムズ株式会社 Information presentation system, program, and information presentation method
JP2017091052A (en) * 2015-11-05 2017-05-25 株式会社Nttドコモ Extraction device
JP2018032294A (en) * 2016-08-26 2018-03-01 株式会社野村総合研究所 Communication support system, communication support method, computer program, and location confirmation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11127020B2 (en) * 2009-11-20 2021-09-21 Palo Alto Research Center Incorporated Generating an activity inference model from contextual data


Also Published As

Publication number Publication date
JPWO2021005750A1 (en) 2021-09-13


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020510616

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19937135

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19937135

Country of ref document: EP

Kind code of ref document: A1