CN111758015B - Dynamic detection device and dynamic detection method - Google Patents

Dynamic detection device and dynamic detection method

Info

Publication number
CN111758015B
CN111758015B CN201880090037.4A
Authority
CN
China
Prior art keywords
floor
unit
air pressure
coordinate
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880090037.4A
Other languages
Chinese (zh)
Other versions
CN111758015A (en)
Inventor
川浦健央
加岛隆博
佐藤健二
安田浩之
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Building Solutions Corp
Original Assignee
Mitsubishi Electric Building Solutions Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Building Solutions Corp filed Critical Mitsubishi Electric Building Solutions Corp
Publication of CN111758015A
Application granted
Publication of CN111758015B
Active legal-status Current
Anticipated expiration legal-status

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/005 Traffic control systems for road vehicles including pedestrian guidance indicator

Abstract

The device comprises: an inertial sensor (102) that measures the motion of the device itself; a pedestrian autonomous navigation unit (103) that calculates the relative coordinates, with respect to reference coordinates, of the user who holds the device, based on the motion measured by the inertial sensor (102); an air pressure sensor (104) that measures the air pressure around the device; a movement means estimating unit (105) that estimates the type of moving means used when the user moves between floors in a building, based on the measured motion and the measured air pressure; a floor estimating unit (106) that estimates the floor on which the user is located, based on the measured air pressure; an electronic compass (107) that detects the azimuth of the traveling direction of the device; a reference coordinate estimating unit (108) that estimates, as the reference coordinates, the absolute coordinates of the lifting/lowering port of the moving means used by the user, based on the estimated type of moving means, the estimated floor, and the detected azimuth; and a coordinate conversion unit (109) that converts the calculated relative coordinates into absolute coordinates using the estimated reference coordinates.

Description

Dynamic detection device and dynamic detection method
Technical Field
The present invention relates to a dynamic detection device and a dynamic detection method for detecting the dynamics of a user who holds the device.
Background
Conventionally, a dynamic detection device for detecting the dynamics of a user who holds the device is known (for example, refer to patent document 1). In the device disclosed in patent document 1, the user's dynamics are detected by measuring the height, recognizing the user's walking motion, and estimating the type of moving means used by the user. Examples of the type of moving means include stairs and elevators installed in a building such as an office building.
Prior art literature
Patent literature
Patent document 1: japanese patent laid-open No. 2007-093433
Disclosure of Invention
Problems to be solved by the invention
However, when a plurality of moving means exist in a building, this device cannot estimate which moving means the user has used. In that case, a separate unit that provides rough position information to the device must be installed on the building side in order to narrow down the moving means used by the user.
The present invention has been made to solve the above-described problem, and an object thereof is to provide a dynamic detection device that, even when a plurality of moving means exist in a building, can detect the dynamics of the user who holds the device by the device alone.
Means for solving the problems
The dynamic detection device of the present invention is characterized by comprising: an inertial sensor that measures the motion of the device itself; a pedestrian autonomous navigation unit that calculates the relative coordinates, with respect to reference coordinates, of the user who holds the device, based on the motion measured by the inertial sensor; an air pressure sensor that measures the air pressure around the device; a floor estimating unit that estimates the floor on which the user is located, based on the air pressure measured by the air pressure sensor; a time counting unit that counts the time during which the air pressure remains constant while the user is walking, based on the motion measured by the inertial sensor and the air pressure measured by the air pressure sensor; a reference coordinate estimating unit that estimates, as the reference coordinates, the absolute coordinates of the lifting/lowering port, on that floor, of the moving means used by the user, based on the floor estimated by the floor estimating unit and the time counted by the time counting unit; and a coordinate conversion unit that converts the relative coordinates calculated by the pedestrian autonomous navigation unit into absolute coordinates using the reference coordinates estimated by the reference coordinate estimating unit.
Effects of the invention
According to the present invention, since the above-described configuration is adopted, the dynamics of the user who holds the device can be detected by the device alone even when a plurality of moving means exist in a building.
Drawings
Fig. 1 is a block diagram showing a configuration example of a dynamic detection device according to embodiment 1 of the present invention.
Fig. 2 is a diagram showing an example of a hardware configuration of the dynamic detection device according to embodiment 1 of the present invention.
Fig. 3 is a flowchart showing an example of the operation of the dynamic detection device according to embodiment 1 of the present invention.
Fig. 4 is a flowchart showing an example of the operation of the movement means estimating unit in embodiment 1 of the present invention.
Fig. 5A is a view showing an example of the measurement results of the inertial sensor and the air pressure sensor when the user moves by stairs, and fig. 5B is a view showing an example of the measurement results of the inertial sensor and the air pressure sensor when the user moves by an elevator.
Fig. 6 is a diagram showing an outline of a process of floor estimation by the floor estimating unit in embodiment 1 of the present invention.
Fig. 7 is a floor diagram for explaining reference coordinate estimation by the reference coordinate estimating unit in embodiment 1 of the present invention.
Fig. 8 is a diagram showing information recorded in the coordinate information recording unit in embodiment 1 of the present invention, and corresponds to the floor map shown in fig. 7.
Fig. 9 is a flowchart showing an example of the operation of the reference coordinate estimating unit in embodiment 1 of the present invention.
Fig. 10 is a block diagram showing a configuration example of a dynamic detection device according to embodiment 2 of the present invention.
Fig. 11 is a floor diagram for explaining reference coordinate estimation by the reference coordinate estimating unit in embodiment 2 of the present invention.
Fig. 12 is a diagram showing information recorded in the coordinate information recording unit in embodiment 2 of the present invention, and corresponds to the floor map shown in fig. 11.
Fig. 13A is a view showing a case where the user moves across a stair landing, and fig. 13B is a view showing an example of the measurement result of the air pressure sensor when the user moves across the landing.
Fig. 14 is a flowchart showing an example of the operation of the reference coordinate estimating unit according to embodiment 2 of the present invention.
Fig. 15 is a block diagram showing a configuration example of a dynamic detection device according to embodiment 3 of the present invention.
Fig. 16 is a floor diagram for explaining an outline of the process of reference coordinate estimation by the reference coordinate estimating unit and the reference coordinate determining unit in embodiment 3 of the present invention.
Fig. 17 is a flowchart showing an example of the operation of the reference coordinate determination unit in embodiment 3 of the present invention.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Embodiment 1
Fig. 1 is a block diagram showing a configuration example of a dynamic detection device 1 according to embodiment 1 of the present invention.
The dynamic detection device 1 detects the dynamics of a user who holds the device (dynamic detection device 1). Examples of the user include an operator who makes rounds of the floors of a building such as an office building. As shown in fig. 1, the dynamic detection device 1 includes a coordinate information recording unit 101, an inertial sensor 102, a pedestrian autonomous navigation unit 103, an air pressure sensor 104, a movement means estimating unit 105, a floor estimating unit 106, an electronic compass 107, a reference coordinate estimating unit 108, and a coordinate converting unit 109.
The coordinate information recording unit 101 records, for each floor, information indicating the absolute coordinates of the lifting/lowering port of each moving means existing in the building and the azimuth of that lifting/lowering port. The azimuth of a lifting/lowering port is the direction of its entrance/exit as viewed from the center of the lifting/lowering port. The information recorded in the coordinate information recording unit 101 is read out by the reference coordinate estimating unit 108.
In addition, although fig. 1 shows a case where the coordinate information recording unit 101 is provided inside the dynamic detection apparatus 1, the present invention is not limited to this, and the coordinate information recording unit 101 may be provided outside the dynamic detection apparatus 1.
The inertial sensor 102 measures the motion of the device itself (dynamic detection device 1). The motion includes acceleration, rotation, and the like. Information indicating the motion measured by the inertial sensor 102 is output to the pedestrian autonomous navigation unit 103 and the movement means estimation unit 105.
The pedestrian autonomous navigation unit 103 calculates the relative coordinates of the user with respect to the reference coordinates based on the motion measured by the inertial sensor 102. Information indicating the relative coordinates calculated by the pedestrian autonomous navigation unit 103 is output to the coordinate conversion unit 109.
The air pressure sensor 104 measures the air pressure around the device itself (dynamic detection device 1). Information indicating the air pressure measured by the air pressure sensor 104 is output to the movement means estimating unit 105 and the floor estimating unit 106.
The movement means estimating unit 105 estimates the type of the movement means used by the user for the movement between floors based on the movement measured by the inertial sensor 102 and the air pressure measured by the air pressure sensor 104. Examples of the type of the moving means include stairs and elevators. Information indicating the type of the moving means estimated by the moving means estimating unit 105 is output to the reference coordinate estimating unit 108.
The floor estimating unit 106 estimates the floor on which the user is located based on the air pressure measured by the air pressure sensor 104. Information indicating the floor estimated by the floor estimating unit 106 is output to the reference coordinate estimating unit 108.
The electronic compass 107 detects the azimuth of the traveling direction of the device itself (dynamic detection device 1). Information indicating the azimuth detected by the electronic compass 107 is output to the reference coordinate estimating unit 108.
Based on the type of moving means estimated by the moving means estimating unit 105, the floor estimated by the floor estimating unit 106, and the azimuth detected by the electronic compass 107, the reference coordinate estimating unit 108 estimates, from the information recorded in the coordinate information recording unit 101, the absolute coordinates of the lifting/lowering port, on that floor, of the moving means used by the user as the reference coordinates. Information indicating the reference coordinates estimated by the reference coordinate estimating unit 108 is output to the coordinate converting unit 109.
The coordinate conversion unit 109 converts the relative coordinates calculated by the pedestrian autonomous navigation unit 103 into absolute coordinates using the reference coordinates estimated by the reference coordinate estimation unit 108. At this time, the coordinate conversion unit 109 adds the reference coordinates to the relative coordinates, thereby obtaining the absolute coordinates of the user. Information indicating the absolute coordinates obtained by the coordinate conversion unit 109 is output to the outside.
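As an illustrative sketch of this conversion (not part of the patent disclosure; the function name, variable names, and numeric values below are assumptions), the addition performed by the coordinate conversion unit 109 can be written, for example, as follows in Python.

def to_absolute(reference_xy, relative_xy):
    # Absolute coordinates = reference coordinates of the lifting/lowering port
    # plus the relative displacement accumulated by pedestrian dead reckoning.
    rx, ry = reference_xy
    dx, dy = relative_xy
    return (rx + dx, ry + dy)

# Example: reference coordinates (12.0, 3.5) m, relative coordinates (4.2, -1.0) m.
print(to_absolute((12.0, 3.5), (4.2, -1.0)))  # -> (16.2, 2.5)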
Fig. 2 shows an example of a hardware configuration of a computer of the dynamic detection apparatus 1 according to embodiment 1 of the present invention.
The functions of the pedestrian autonomous navigation unit 103, the movement means estimation unit 105, the floor estimation unit 106, the reference coordinate estimation unit 108, and the coordinate conversion unit 109 are realized by a processing circuit. As shown in fig. 2, the processing circuit is a CPU (Central Processing Unit; also called a processing device, arithmetic device, microprocessor, microcomputer, processor, or DSP (Digital Signal Processor)) 201 that executes programs stored in the system memory 202.
The functions of the pedestrian autonomous navigation unit 103, the movement means estimation unit 105, the floor estimation unit 106, the reference coordinate estimation unit 108, and the coordinate conversion unit 109 are realized by software, firmware, or a combination of software and firmware. The software and firmware are described as programs and stored in the system memory 202. The processing circuit reads out and executes the programs stored in the system memory 202, thereby realizing the function of each unit. That is, the dynamic detection device 1 has the system memory 202 for storing programs which, when executed by the processing circuit, result in the execution of each step shown in fig. 3, for example. These programs can also be said to cause a computer to execute the procedures and methods of the pedestrian autonomous navigation unit 103, the movement means estimation unit 105, the floor estimation unit 106, the reference coordinate estimation unit 108, and the coordinate conversion unit 109. Here, the system memory 202 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable ROM), or EEPROM (Electrically Erasable Programmable ROM), or a magnetic disk, floppy disk, optical disc, compact disc, mini disc, DVD (Digital Versatile Disc), or the like.
In addition, the GUI for operating the above programs is displayed on the display device 208 via the GPU 204, the frame memory 205, and the RAMDAC (Random Access Memory Digital-to-Analog Converter) 206. The user operates the GUI using the operation device 207.
The coordinate information recording unit 101 uses the memory 203 for recording information.
Next, an operation example of the dynamic detection device 1 according to embodiment 1 will be described with reference to fig. 3. The coordinate information recording unit 101 records, for each floor, information indicating the absolute coordinates of the lifting/lowering port of each moving means existing in the building and the azimuth of that lifting/lowering port.
In an operation example of the dynamic detection device 1, as shown in fig. 3, first, the inertial sensor 102 measures the motion of the device itself (step ST 301). Information indicating the motion measured by the inertial sensor 102 is output to the pedestrian autonomous navigation unit 103 and the movement means estimation unit 105.
Next, the pedestrian autonomous navigation unit 103 calculates the relative coordinates of the user with respect to the reference coordinates based on the detection result of the inertial sensor 102 (step ST 302). Information indicating the relative coordinates calculated by the pedestrian autonomous navigation unit 103 is output to the coordinate conversion unit 109.
The air pressure sensor 104 measures the air pressure around the device itself (step ST 303). Information indicating the air pressure measured by the air pressure sensor 104 is output to the movement means estimating unit 105 and the floor estimating unit 106.
Next, the movement means estimating unit 105 estimates the type of the movement means used by the user for the movement between floors based on the movement measured by the inertial sensor 102 and the air pressure measured by the air pressure sensor 104 (step ST 304). The operation of the movement means estimating unit 105 will be described in detail later. Information indicating the type of the moving means estimated by the moving means estimating unit 105 is output to the reference coordinate estimating unit 108.
The floor estimating unit 106 estimates the floor on which the user is located based on the air pressure measured by the air pressure sensor 104 (step ST 305). The operation of the floor estimating unit 106 will be described in detail later. Information indicating the floor estimated by the floor estimating unit 106 is output to the reference coordinate estimating unit 108.
Further, the electronic compass 107 detects the azimuth of the traveling direction of the device itself (step ST 306). Information indicating the azimuth detected by the electronic compass 107 is output to the reference coordinate estimating unit 108.
Next, the reference coordinate estimating unit 108 estimates, based on the type of the moving means estimated by the moving means estimating unit 105, the floor estimated by the floor estimating unit 106, and the azimuth detected by the electronic compass 107, the absolute coordinates of the lifting/lowering port of the moving means used by the user on the floor as the reference coordinates based on the information recorded in the coordinate information recording unit 101 (step ST 307). The operation of the reference coordinate estimating unit 108 will be described in detail later. Information indicating the reference coordinates estimated by the reference coordinate estimating unit 108 is output to the coordinate converting unit 109.
Next, the coordinate conversion unit 109 converts the relative coordinates calculated by the pedestrian autonomous navigation unit 103 into absolute coordinates using the reference coordinates estimated by the reference coordinate estimation unit 108 (step ST 308). Information indicating the absolute coordinates obtained by the coordinate conversion unit 109 is output to the outside.
In addition, when the type of the moving means is not estimated by the moving means estimating unit 105 in step ST304, that is, when the user has not moved between floors, the processing in steps ST305 to ST307 is skipped, and the coordinate converting unit 109 converts the relative coordinates calculated by the pedestrian autonomous navigation unit 103 into absolute coordinates using the reference coordinates last estimated by the reference coordinate estimating unit 108.
Next, an operation example of the movement means estimating unit 105 will be described with reference to fig. 4. The following description assumes that the types of moving means are stairs and elevators, and that the motion measured by the inertial sensor 102 is the acceleration in the vertical direction (z-axis direction).
First, the movement means estimating unit 105 determines whether or not the frequency of the acceleration waveform in the vertical direction is equal to or greater than a preset threshold value α [ Hz ] based on the movement measured by the inertial sensor 102 (step ST 401). The threshold value α is a value that can distinguish whether the user is walking or stationary.
When the movement means estimating unit 105 determines in step ST401 that the frequency of the acceleration waveform in the vertical direction is equal to or higher than the threshold value α, it determines whether or not the magnitude of the change per unit time of the air pressure measured by the air pressure sensor 104 is equal to or higher than a preset threshold value γ [ Pa/sec ] (step ST 402). The threshold value γ is a value that can determine whether the user has moved to another floor.
When the movement means estimating unit 105 determines in step ST402 that the magnitude of the change per unit time of the air pressure is equal to or greater than the threshold value γ, it determines that the type of the moving means is stairs (step ST 403). When the type of the moving means used by the user is stairs, the acceleration waveform in the vertical direction measured by the inertial sensor 102 and the air pressure measured by the air pressure sensor 104 are, for example, as shown in fig. 5A.
On the other hand, when the movement means estimating unit 105 determines that the magnitude of the change in the air pressure per unit time is smaller than the threshold value γ in step ST402, it determines that the user is walking on the floor (step ST 404).
On the other hand, when the movement means estimating unit 105 determines in step ST401 that the frequency of the acceleration waveform in the vertical direction is smaller than the threshold α, it determines whether or not the magnitude of the change per unit time of the air pressure measured by the air pressure sensor 104 is equal to or greater than a preset threshold β [ Pa/sec ] (step ST 405). The threshold value β is a value that can determine whether the user has moved to another floor.
In step ST405, when the movement means estimating unit 105 determines that the magnitude of the change per unit time of the air pressure is equal to or greater than the threshold value β, it determines that the type of the movement means is an elevator (step ST 406). In the case where the type of the moving means used by the user is an elevator, for example, the acceleration waveform in the vertical direction measured by the inertial sensor 102 and the air pressure measured by the air pressure sensor 104 are as shown in fig. 5B.
On the other hand, when the movement means estimating unit 105 determines in step ST405 that the magnitude of the change in the air pressure per unit time is smaller than the threshold β, it determines that the user is stationary on the floor (step ST 407).
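A minimal sketch of this decision flow is given below (the threshold symbols α, β, and γ follow the description above, but the function name, default values, and inputs are assumptions and not values disclosed in the patent).

def classify_moving_means(step_freq_hz, pressure_rate_pa_per_s,
                          alpha=1.0, beta=5.0, gamma=5.0):
    # step_freq_hz: frequency of the vertical acceleration waveform
    # pressure_rate_pa_per_s: magnitude of the air-pressure change per unit time
    if step_freq_hz >= alpha:                 # the user is walking
        if pressure_rate_pa_per_s >= gamma:
            return "stairs"                   # walking while the pressure changes
        return "walking on the floor"
    if pressure_rate_pa_per_s >= beta:
        return "elevator"                     # not walking but the pressure changes
    return "stationary on the floor"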
Next, an example of the operation of the floor estimating unit 106 will be described with reference to fig. 6.
The floor estimating unit 106 assumes that the air pressure has a linear relationship with height, and estimates the floor on which the user is located from the air pressure measured by the air pressure sensor 104.
Here, as shown in fig. 6, the dynamic detection device 1 measures in advance the air pressure A on the 1st floor and the air pressure B on the M-th floor, and the floor estimating unit 106 obtains the air pressure difference ΔP per floor from these measurement results by the following expression (1).
ΔP=(A-B)/(M-1) (1)
Then, the floor estimating unit 106 estimates the floor N on which the user is located by the following expression (2) based on the air pressure C measured by the air pressure sensor 104 during operation.
N={(A-C)/ΔP}+1 (2)
When the per-floor height h of the target building is known, the floor estimating unit 106 may instead estimate the floor N on which the user is located by the following expression (3), using the air-pressure change D per 1 m of height. The air-pressure change D per 1 m of height is usually about 11 to 12 Pa/m.
N={(A-C)/(D×h)}+1 (3)
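A minimal sketch of expressions (1) to (3) is given below (the function name, argument names, and the sample pressures in the usage example are assumptions; only the formulas themselves come from the description above).

def estimate_floor(pressure_c, pressure_a_1f, pressure_b_mf=None, m_floors=None,
                   floor_height_m=None, d_pa_per_m=11.5):
    # Expressions (1) and (2): per-floor pressure difference from the pre-measured
    # pressures A (1st floor) and B (M-th floor), then the floor N of pressure C.
    if pressure_b_mf is not None and m_floors is not None:
        delta_p = (pressure_a_1f - pressure_b_mf) / (m_floors - 1)    # (1)
        return round((pressure_a_1f - pressure_c) / delta_p + 1)      # (2)
    # Expression (3): use the known per-floor height h and D of about 11-12 Pa/m.
    return round((pressure_a_1f - pressure_c) / (d_pa_per_m * floor_height_m) + 1)

# Example: A = 101300 Pa on the 1st floor, B = 100900 Pa on the 9th floor,
# C = 101200 Pa during operation -> floor 3.
print(estimate_floor(101200.0, 101300.0, pressure_b_mf=100900.0, m_floors=9))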
Next, an operation example of the reference coordinate estimating unit 108 will be described with reference to fig. 7 to 9.
The floor map shown in fig. 7 shows an arbitrary floor in a building on which four moving means are provided. In fig. 7, reference numerals a to d denote the positions of the lifting/lowering ports of the moving means, and reference numeral x denotes the position of the user. The table shown in fig. 8 shows the information recorded in the coordinate information recording unit 101, namely the absolute coordinates and azimuths of the lifting/lowering ports of the moving means on the floor shown in fig. 7.
As shown in fig. 9, first, the reference coordinate estimating unit 108 obtains, from the coordinate information recording unit 101, the absolute coordinates and azimuths of the lifting/lowering ports of all the moving means existing on the floor, based on the floor estimated by the floor estimating unit 106 (step ST 901). In the following, it is assumed that the reference coordinate estimating unit 108 has acquired the information shown in fig. 8.
Next, the reference coordinate estimating unit 108 determines whether or not the type of the moving means estimated by the moving means estimating unit 105 is a staircase (step ST 902).
If the reference coordinate estimating unit 108 determines in step ST902 that the type of the moving means is stairs, it determines whether or not the azimuth detected by the electronic compass 107 is westward (step ST 903).
When the reference coordinate estimating unit 108 determines in step ST903 that the azimuth is westward, it estimates the absolute coordinates of position b as the reference coordinates based on the information shown in fig. 8 (step ST 904).
On the other hand, when the reference coordinate estimating unit 108 determines in step ST903 that the azimuth is not westward, that is, when the azimuth is eastward, it estimates the absolute coordinates of position d as the reference coordinates based on the information shown in fig. 8 (step ST 905).
On the other hand, when the reference coordinate estimating unit 108 determines in step ST902 that the type of the moving means is not stairs, that is, when the type of the moving means is an elevator, it determines whether or not the azimuth detected by the electronic compass 107 is westward (step ST 906).
When the reference coordinate estimating unit 108 determines in step ST906 that the azimuth is westward, it estimates the absolute coordinates of position a as the reference coordinates based on the information shown in fig. 8 (step ST 907).
On the other hand, when the reference coordinate estimating unit 108 determines in step ST906 that the azimuth is not westward, that is, when the azimuth is eastward, it estimates the absolute coordinates of position c as the reference coordinates based on the information shown in fig. 8 (step ST 908).
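A minimal sketch of the selection performed in steps ST901 to ST908 is given below (the coordinate values, record layout, and function name are assumptions made for illustration and only loosely mirror fig. 7 and fig. 8).

# Records for one floor in the spirit of fig. 8 (coordinates invented).
lifting_ports = [
    {"pos": "a", "xy": (5.0, 20.0),  "type": "elevator", "azimuth": "west"},
    {"pos": "b", "xy": (8.0, 20.0),  "type": "stairs",   "azimuth": "west"},
    {"pos": "c", "xy": (40.0, 20.0), "type": "elevator", "azimuth": "east"},
    {"pos": "d", "xy": (43.0, 20.0), "type": "stairs",   "azimuth": "east"},
]

def estimate_reference(ports, means_type, heading):
    # Pick the lifting/lowering port whose type matches the estimated moving means
    # and whose azimuth matches the heading detected by the electronic compass.
    for port in ports:
        if port["type"] == means_type and port["azimuth"] == heading:
            return port["xy"]
    return None

print(estimate_reference(lifting_ports, "stairs", "west"))  # -> (8.0, 20.0), position b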
As described above, according to embodiment 1, the dynamic detection device 1 includes: the inertial sensor 102 that measures the motion of the device itself; the pedestrian autonomous navigation unit 103 that calculates the relative coordinates, with respect to the reference coordinates, of the user who holds the device, based on the motion measured by the inertial sensor 102; the air pressure sensor 104 that measures the air pressure around the device; the movement means estimating unit 105 that estimates the type of moving means used when the user moves between floors in a building, based on the motion measured by the inertial sensor 102 and the air pressure measured by the air pressure sensor 104; the floor estimating unit 106 that estimates the floor on which the user is located, based on the air pressure measured by the air pressure sensor 104; the electronic compass 107 that detects the azimuth of the traveling direction of the device itself; the reference coordinate estimating unit 108 that estimates, as the reference coordinates, the absolute coordinates of the lifting/lowering port, on that floor, of the moving means used by the user, based on the type of the moving means estimated by the moving means estimating unit 105, the floor estimated by the floor estimating unit 106, and the azimuth detected by the electronic compass 107; and the coordinate conversion unit 109 that converts the relative coordinates calculated by the pedestrian autonomous navigation unit 103 into absolute coordinates using the reference coordinates estimated by the reference coordinate estimation unit 108. Thus, even when a plurality of moving means exist in the building, the dynamic detection device 1 can detect the dynamics of the user by the device alone.
Embodiment 2
Fig. 10 is a block diagram showing a configuration example of the dynamic detection device 1 according to embodiment 2 of the present invention. The dynamic detection device 1 according to embodiment 2 shown in fig. 10 is obtained by adding a time counting unit 110 to the dynamic detection device 1 according to embodiment 1 shown in fig. 1. The other structures are the same and are denoted by the same reference numerals; only the different parts will be described.
The coordinate information recording unit 101 records, for each floor, information indicating a reference time (reference required time) required for the user to move from the lifting/lowering port of each moving means to the first place, in addition to the information described in embodiment 1. The first place may be set arbitrarily for each floor, and is, for example, a place that the user always visits first upon arriving at the floor.
Information indicating the motion of the machine measured by the inertial sensor 102 and information indicating the air pressure measured by the air pressure sensor 104 are also output to the time counting unit 110, respectively.
The time counting unit 110 counts the time during which the air pressure remains constant (including substantially constant), that is, the time during which the air pressure does not change, while the user is walking, based on the motion measured by the inertial sensor 102 and the air pressure measured by the air pressure sensor 104. Information indicating the time counted by the time counting unit 110 is output to the reference coordinate estimating unit 108. The function of the time counting unit 110 is realized by a processing circuit.
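A minimal sketch of such a counter is given below (the sampling interval, pressure tolerance, and walking-frequency threshold are placeholder assumptions, not values from the patent).

def count_constant_pressure_time(samples, walk_freq_hz=1.0, eps_pa=3.0, dt_s=1.0):
    # samples: list of (step_freq_hz, pressure_pa) pairs taken every dt_s seconds.
    # Accumulates the time during which the user is walking while the air pressure
    # stays essentially unchanged between consecutive samples.
    elapsed = 0.0
    prev_pressure = None
    for freq, pressure in samples:
        if (prev_pressure is not None and freq >= walk_freq_hz
                and abs(pressure - prev_pressure) < eps_pa):
            elapsed += dt_s
        prev_pressure = pressure
    return elapsed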
In addition to the functions described in embodiment 1, the reference coordinate estimating unit 108 has a function of estimating, as the reference coordinates, the absolute coordinates of the lifting/lowering port, on that floor, of the moving means used by the user, from the information recorded in the coordinate information recording unit 101, based on the floor estimated by the floor estimating unit 106 and the time counted by the time counting unit 110.
Next, an example of the operation of the reference coordinate estimating unit 108 will be described with reference to fig. 11 to 14.
The floor map shown in fig. 11 shows an arbitrary floor in a building on which three moving means are provided. In fig. 11, reference numerals a to c denote the positions of the lifting/lowering ports of the moving means, and reference numeral x denotes the position of the user. In fig. 11, the first place is the entrance of a management room. The table shown in fig. 12 shows the information recorded in the coordinate information recording unit 101 for the moving means on the floor shown in fig. 11, namely the absolute coordinates of each lifting/lowering port, its azimuth, and the reference required time from that lifting/lowering port to the first place.
In the floor map shown in fig. 11, the lifting/lowering ports of all the moving means face the same direction (north in fig. 11). Therefore, the method of embodiment 1 may be unable to estimate the moving means used by the user. In such a case, the dynamic detection device 1 estimates the moving means using the time counted by the time counting unit 110.
In this case, as shown in fig. 14, first, the reference coordinate estimating unit 108 obtains, from the coordinate information recording unit 101, the absolute coordinates of the lifting/lowering ports of all the moving means existing on the floor and the reference required time from each lifting/lowering port to the first place, based on the floor estimated by the floor estimating unit 106 (step ST 1401). In the following, it is assumed that the reference coordinate estimating unit 108 has acquired the information shown in fig. 12.
Next, the reference coordinate estimating unit 108 determines whether or not the time counted by the time counting unit 110 is equal to or less than a preset threshold θ (step ST 1402). The threshold θ is shorter than any of the reference required times to the first place on the floor, and corresponds, for example, to the time required for the user to move across a stair landing as shown in fig. 13A.
When the reference coordinate estimating unit 108 determines in step ST1402 that the time is equal to or less than the threshold θ, the sequence returns to step ST1401. That is, when the air pressure is constant for only a time equal to or less than the threshold θ while the user is walking, the user is not on a floor but, as shown for example in fig. 13A, can be presumed to be on a stair landing. In this case, the reference coordinate estimating unit 108 re-estimates the reference coordinates. Fig. 13B shows an example of the measurement result of the air pressure sensor 104 when the user moves across the landing.
On the other hand, when the reference coordinate estimating unit 108 determines in step ST1402 that the time exceeds the threshold θ, it determines whether or not the time is 100 seconds or less (step ST 1403). This 100 seconds is the shortest of the reference required times to the first place included in the information shown in fig. 12.
When the reference coordinate estimating unit 108 determines that the time is 100 seconds or less in step ST1403, the absolute coordinate of the position a is estimated as the reference coordinate based on the information shown in fig. 12 (step ST 1404).
On the other hand, when the reference coordinate estimating unit 108 determines in step ST1403 that the time exceeds 100 seconds, it determines whether or not the time is 300 seconds or less (step ST 1405). This 300 seconds is the second shortest of the reference required times to the first place included in the information shown in fig. 12.
When the reference coordinate estimating unit 108 determines that the time is 300 seconds or less in step ST1405, the absolute coordinate of the position b is estimated as the reference coordinate based on the information shown in fig. 12 (step ST 1406).
On the other hand, when the reference coordinate estimating unit 108 determines in step ST1405 that the time exceeds 300 seconds, it estimates the absolute coordinates of position c as the reference coordinates based on the information shown in fig. 12 (step ST 1407).
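A minimal sketch of the selection performed in steps ST1402 to ST1407 is given below (the coordinates, the threshold θ, and the reference required times are illustrative assumptions patterned after fig. 12).

# Records in the spirit of fig. 12: lifting/lowering port, coordinates, and the
# reference required time (seconds) from that port to the first place.
lifting_ports = [
    {"pos": "a", "xy": (5.0, 20.0),  "ref_time_s": 100},
    {"pos": "b", "xy": (25.0, 20.0), "ref_time_s": 300},
    {"pos": "c", "xy": (45.0, 20.0), "ref_time_s": 500},
]

def estimate_reference_by_time(ports, walk_time_s, theta_s=20):
    # theta_s: threshold below which the constant-pressure interval is taken to be
    # a stair landing rather than a floor, so the estimation must be redone.
    if walk_time_s <= theta_s:
        return None
    ordered = sorted(ports, key=lambda p: p["ref_time_s"])
    for port in ordered:
        if walk_time_s <= port["ref_time_s"]:
            return port["xy"]
    return ordered[-1]["xy"]              # longer than every reference required time

print(estimate_reference_by_time(lifting_ports, 250))  # -> (25.0, 20.0), position b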
As described above, according to embodiment 2, the dynamic detection device 1 includes the time counting unit 110, which counts the time during which the air pressure remains constant while the user is walking, based on the motion measured by the inertial sensor 102 and the air pressure measured by the air pressure sensor 104, and the reference coordinate estimating unit 108 estimates, as the reference coordinates, the absolute coordinates of the lifting/lowering port, on that floor, of the moving means used by the user, based on the floor estimated by the floor estimating unit 106 and the time counted by the time counting unit 110. Thus, in addition to the effects of embodiment 1, the dynamic detection device 1 can estimate the reference coordinates even when the lifting/lowering ports of a plurality of moving means face the same direction.
Embodiment 3
Fig. 15 is a block diagram showing a configuration example of the dynamic detection device 1 according to embodiment 3 of the present invention. The dynamic detection device 1 according to embodiment 3 shown in fig. 15 is obtained by adding a reference coordinate determination unit 111 to the dynamic detection device 1 according to embodiment 1 shown in fig. 1. The other structures are the same and are denoted by the same reference numerals; only the different parts will be described.
The coordinate information recording unit 101 records, for each floor, information indicating the absolute coordinates of obstacles existing in the building, in addition to the information described in embodiment 1. Examples of the obstacles include walls.
The information indicating the floor estimated by the floor estimating unit 106 is also output to the reference coordinate determining unit 111.
In addition to the functions described in embodiment 1, the reference coordinate estimating unit 108 has a function of estimating, as the reference coordinates, the absolute coordinates of the lifting/lowering port, on that floor, of the moving means used by the user, from the information recorded in the coordinate information recording unit 101, based on the floor estimated by the floor estimating unit 106 and the determination result of the reference coordinate determining unit 111.
At this time, the reference coordinate estimating unit 108 first estimates, as the reference coordinates, the absolute coordinates of the lifting/lowering ports of all the moving means present on the floor estimated by the floor estimating unit 106, from the information recorded in the coordinate information recording unit 101. Then, the reference coordinate estimating unit 108 excludes the reference coordinates detected by the reference coordinate determining unit 111 from the estimation targets.
Based on the floor estimated by the floor estimating unit 106 and the information recorded in the coordinate information recording unit 101, the reference coordinate determining unit 111 detects, among the absolute coordinates obtained by the coordinate converting unit 109, the reference coordinates whose resulting absolute coordinates intersect the absolute coordinates of an obstacle present on that floor. Information indicating the reference coordinates detected by the reference coordinate determination unit 111 is output to the reference coordinate estimation unit 108. The function of the reference coordinate determination unit 111 is realized by a processing circuit.
Next, an operation example of the reference coordinate estimating unit 108 and the reference coordinate determining unit 111 will be described with reference to fig. 16 and 17.
The floor map shown in fig. 16 shows an arbitrary floor in a building on which three moving means are provided. In fig. 16, reference numerals a to c denote the positions of the lifting/lowering ports of the moving means, and reference numeral x denotes the position of the user.
In the floor map shown in fig. 16, the lifting/lowering ports of all the moving means face the same direction (north in fig. 16). Therefore, the method of embodiment 1 may be unable to estimate the moving means used by the user. In such a case, the dynamic detection device 1 uses the reference coordinate determining unit 111 to estimate the moving means.
In this case, the reference coordinate estimating unit 108 first estimates, as the reference coordinates, the absolute coordinates of the lifting/lowering ports of all the moving means present on the floor estimated by the floor estimating unit 106, from the information recorded in the coordinate information recording unit 101. Next, the coordinate converting unit 109 converts the relative coordinates calculated by the pedestrian autonomous navigation unit 103 into absolute coordinates using each of the reference coordinates estimated by the reference coordinate estimating unit 108. Fig. 16 shows the trajectories x1 to x3 of the user's absolute coordinates obtained when the absolute coordinates of the positions a to c of the lifting/lowering ports of the respective moving means are used as the reference coordinates.
Next, based on the floor estimated by the floor estimating unit 106 and the information recorded in the coordinate information recording unit 101, the reference coordinate determining unit 111 detects the reference coordinates whose trajectory of absolute coordinates, obtained by the coordinate converting unit 109, intersects the absolute coordinates of an obstacle present on that floor. The operation of the reference coordinate determining unit 111 will now be described in detail with reference to fig. 17.
As shown in fig. 17, first, the reference coordinate determination unit 111 acquires the absolute coordinates of all the obstacles present on the floor from the coordinate information recording unit 101, based on the floor estimated by the floor estimation unit 106 (step ST 1701). In the following, it is assumed that the reference coordinate determining unit 111 has acquired the absolute coordinates of all the obstacles present on the floor shown in fig. 16.
The reference coordinate determination unit 111 obtains all absolute coordinates obtained by the coordinate conversion unit 109 (step ST 1702).
Next, among the acquired absolute coordinates, the reference coordinate determining unit 111 detects the reference coordinates whose trajectory intersects the absolute coordinates of an obstacle present on the floor (step ST 1703). In the example of fig. 16, the trajectories x2 and x3 intersect the absolute coordinates of obstacles, so the reference coordinate determining unit 111 detects the absolute coordinates of position b (reference coordinates) and the absolute coordinates of position c (reference coordinates). Information indicating the reference coordinates detected by the reference coordinate determination unit 111 is output to the reference coordinate estimation unit 108.
Next, the reference coordinate estimating unit 108 excludes the reference coordinates detected by the reference coordinate determining unit 111 from the estimation targets. In the example of fig. 16, the reference coordinate estimating unit 108 excludes the absolute coordinates of position b and the absolute coordinates of position c from the estimation targets, and the coordinate converting unit 109 converts the relative coordinates calculated by the pedestrian autonomous navigation unit 103 into absolute coordinates using only the absolute coordinates of position a.
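A minimal sketch of this exclusion is given below (the 1 m grid, the obstacle cells, and the sample track are assumptions; only the idea of discarding reference coordinates whose resulting trajectory crosses an obstacle comes from the description).

def exclude_by_obstacles(candidate_refs, relative_track, obstacle_cells):
    # candidate_refs: absolute coordinates of every lifting/lowering port on the floor
    # relative_track: dead-reckoned displacements (dx, dy) since leaving the port
    # obstacle_cells: set of grid cells (assumed 1 m grid) occupied by walls etc.
    kept = []
    for rx, ry in candidate_refs:
        trajectory = [(round(rx + dx), round(ry + dy)) for dx, dy in relative_track]
        if not any(cell in obstacle_cells for cell in trajectory):
            kept.append((rx, ry))         # this trajectory never crosses an obstacle
    return kept

# Example: only the trajectory started from (5, 20) stays clear of the walls.
refs = [(5, 20), (25, 20), (45, 20)]
walls = {(27, 21), (48, 22)}
print(exclude_by_obstacles(refs, [(0, 0), (2, 1), (3, 2)], walls))  # -> [(5, 20)]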
As described above, according to embodiment 3, the reference coordinate estimating unit 108 estimates, as the reference coordinates, the absolute coordinates of the lifting/lowering ports of all the moving means present on the floor estimated by the floor estimating unit 106; the dynamic detection device 1 includes the reference coordinate determining unit 111, which detects, based on the floor estimated by the floor estimating unit 106, the reference coordinates whose absolute coordinates obtained by the coordinate converting unit 109 intersect the absolute coordinates of an obstacle present on that floor; and the reference coordinate estimating unit 108 excludes the reference coordinates detected by the reference coordinate determining unit 111 from the estimation targets. Thus, in addition to the effects of embodiment 1, the dynamic detection device 1 can estimate the reference coordinates even when the lifting/lowering ports of a plurality of moving means face the same direction.
In the dynamic detection device 1 according to embodiment 3, the reference coordinates can be estimated without knowing the reference required time to the first place described in embodiment 2.
The present application can freely combine the embodiments, change any of the components of the embodiments, or omit any of the components of the embodiments within the scope of the present invention.
Industrial applicability
The dynamic detection device of the present invention can detect the dynamic state of a user who holds the device by using the device alone even when a plurality of moving means exist in a building, and is suitable for a dynamic detection device or the like for detecting the dynamic state of the user who holds the device.
Description of the reference numerals
1: a dynamic detection device; 101: a coordinate information recording unit; 102: an inertial sensor; 103: a pedestrian autonomous navigation unit; 104: an air pressure sensor; 105: a movement means estimation unit; 106: a floor estimating unit; 107: an electronic compass; 108: a reference coordinate estimation unit; 109: a coordinate conversion section; 110: a time counting unit; 111: a reference coordinate determination unit; 201: a CPU;202: a system memory; 203: a memory; 204: a GPU;205: a frame memory; 206: RAMDAC;207: an operating device; 208: a display device.

Claims (5)

1. A dynamic detection device, the dynamic detection device comprising:
an inertial sensor that measures the motion of the device itself;
a pedestrian autonomous navigation unit that calculates relative coordinates of a user who holds the device with respect to reference coordinates based on the motion measured by the inertial sensor;
an air pressure sensor that measures the air pressure around the device itself;
a floor estimating unit that estimates a floor on which the user is located, based on the air pressure measured by the air pressure sensor;
a time counting unit that counts a time during which the air pressure remains constant while the user is walking, based on the motion measured by the inertial sensor and the air pressure measured by the air pressure sensor;
a reference coordinate estimating unit that estimates, as the reference coordinate, an absolute coordinate of a lifting/lowering port, on the floor, of a moving means used by the user, based on the floor estimated by the floor estimating unit and the time counted by the time counting unit; and
a coordinate conversion unit that converts the relative coordinates calculated by the pedestrian autonomous navigation unit into absolute coordinates using the reference coordinates estimated by the reference coordinate estimation unit.
2. The dynamic detection device according to claim 1, wherein,
the reference coordinate estimating unit re-estimates the reference coordinates when the time counted by the time counting unit is equal to or less than a threshold value, the threshold value being shorter than the reference required time from the lifting/lowering port of any moving means existing on the floor estimated by the floor estimating unit to the first place.
3. The dynamic detection device according to claim 1, wherein,
the reference coordinate estimating unit estimates, as the reference coordinate, absolute coordinates of the lifting/lowering ports of all the moving means existing on the floor based on the floor estimated by the floor estimating unit,
the dynamic detection device has a reference coordinate determination unit that detects, based on the floor estimated by the floor estimation unit, a reference coordinate for which the trajectory of absolute coordinates obtained by the coordinate conversion unit intersects an absolute coordinate of an obstacle present on the floor, and
the reference coordinate estimating unit excludes the reference coordinates detected by the reference coordinate determining unit from the estimation object.
4. The dynamic detection device according to claim 1, wherein,
the dynamic detection device has:
a movement means estimating unit that estimates a type of movement means used when the user moves between floors in a building, based on the movement measured by the inertial sensor and the air pressure measured by the air pressure sensor; and
an electronic compass that detects the azimuth of the traveling direction of the device itself,
the reference coordinate estimating unit estimates, as the reference coordinate, an absolute coordinate of a lifting/lowering port, on the floor, of the moving means used by the user, based on the type of the moving means estimated by the moving means estimating unit, the floor estimated by the floor estimating unit, and the azimuth detected by the electronic compass.
5. A dynamic detection method, the dynamic detection method having the steps of:
the inertial sensor measures the motion of the device itself;
a pedestrian autonomous navigation unit calculates relative coordinates of a user who holds the device with respect to reference coordinates based on the motion measured by the inertial sensor;
the air pressure sensor measures the air pressure around the device itself;
a floor estimating unit estimates a floor on which the user is located, based on the air pressure measured by the air pressure sensor;
a time counting unit counts a time during which the air pressure remains constant while the user is walking, based on the motion measured by the inertial sensor and the air pressure measured by the air pressure sensor;
a reference coordinate estimating unit estimates, as the reference coordinate, an absolute coordinate of a lifting/lowering port, on the floor, of the moving means used by the user, based on the floor estimated by the floor estimating unit and the time counted by the time counting unit; and
the coordinate conversion unit converts the relative coordinates calculated by the pedestrian autonomous navigation unit into absolute coordinates using the reference coordinates estimated by the reference coordinate estimation unit.
CN201880090037.4A 2018-03-02 2018-03-02 Dynamic detection device and dynamic detection method Active CN111758015B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/008091 WO2019167269A1 (en) 2018-03-02 2018-03-02 Motion state detection device

Publications (2)

Publication Number Publication Date
CN111758015A CN111758015A (en) 2020-10-09
CN111758015B true CN111758015B (en) 2024-02-09

Family

ID=64017111

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880090037.4A Active CN111758015B (en) 2018-03-02 2018-03-02 Dynamic detection device and dynamic detection method

Country Status (3)

Country Link
JP (1) JP6415796B1 (en)
CN (1) CN111758015B (en)
WO (1) WO2019167269A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109974717B (en) * 2019-03-13 2021-05-25 浙江吉利汽车研究院有限公司 Method, device and terminal for repositioning target point on map
JP7309097B2 (en) 2021-04-22 2023-07-14 三菱電機株式会社 POSITION DETECTION DEVICE, POSITION DETECTION METHOD, AND POSITION DETECTION PROGRAM

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004212146A (en) * 2002-12-27 2004-07-29 Sumitomo Precision Prod Co Ltd Dead reckoning navigation system and dead reckoning navigation for walker
JP2012237719A (en) * 2011-05-13 2012-12-06 Kddi Corp Portable device for estimating ascending/descending movement state by using atmospheric pressure sensor, program, and method
CN104569909A (en) * 2014-12-31 2015-04-29 深圳市鼎泰富科技有限公司 Indoor positioning system and method
CN104977003A (en) * 2015-06-29 2015-10-14 中国人民解放军国防科学技术大学 Indoor people search method, cloud server, and system based on shared track
CN106031263A (en) * 2014-02-28 2016-10-12 德州仪器公司 Method and system for location estimation
JP2017049168A (en) * 2015-09-03 2017-03-09 住友電気工業株式会社 Location estimation device, location estimation system, location estimation method and location estimation program
CN106918334A (en) * 2015-12-25 2017-07-04 高德信息技术有限公司 Indoor navigation method and device
JP2017181179A (en) * 2016-03-29 2017-10-05 Kddi株式会社 Device, program and method for position estimation capable of correction of position based on transition between floors
JP2018028480A (en) * 2016-08-18 2018-02-22 株式会社ゼンリンデータコム Information processing apparatus, information processing method, and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5938760B2 (en) * 2012-03-13 2016-06-22 株式会社日立製作所 Travel amount estimation system, travel amount estimation method, mobile terminal
US10206068B2 (en) * 2015-07-09 2019-02-12 OneMarket Network LLC Systems and methods to determine a location of a mobile device


Also Published As

Publication number Publication date
CN111758015A (en) 2020-10-09
JPWO2019167269A1 (en) 2020-04-09
JP6415796B1 (en) 2018-10-31
WO2019167269A1 (en) 2019-09-06

Similar Documents

Publication Publication Date Title
US9872149B2 (en) Method of estimating position of a device using geographical descriptive data
US8019475B2 (en) Routing apparatus for autonomous mobile unit
CN106044430B (en) The method of position detection for lift car
KR102196937B1 (en) Method and Apparatus For Estimating Position of User in Building
CN111758015B (en) Dynamic detection device and dynamic detection method
JP2017505433A (en) Method and apparatus for positioning with an always-on barometer
JP2007093433A (en) Detector for motion of pedestrian
CN109211233B (en) Elevator motion detection and abnormal position parking judgment method based on acceleration sensor
US9771240B2 (en) Inertial measurement unit assisted elevator position calibration
Vanini et al. Adaptive context-agnostic floor transition detection on smart mobile devices
KR101396877B1 (en) Method and system for wifi-based indoor positioning compensation
KR101522466B1 (en) Apparatus for detecting the pedestrian foot zero velocity and Method thereof, and Inertial navigation system of pedestrian using same
US20190352130A1 (en) Method and an elevator system for performing a synchronization run of an elevator car
JP2013130495A (en) Information processor and information processing method
JP2019215646A (en) Parking position guide system and parking position guide program
JP4774401B2 (en) Autonomous mobile route setting device
WO2018211655A1 (en) Position detection device, elevator control device, and elevator system
CN114018278A (en) System and method for realizing cross-floor guidance according to time cooperation of multiple navigation robots
Kronenwett et al. Elevator and escalator classification for precise indoor localization
CN110072797B (en) Crane control method, computer program, equipment and crane updating method
JP6733847B2 (en) Elevator user recognition system
Song et al. Finding 9-1-1 callers in tall buildings
JP6494552B2 (en) Position estimating apparatus, program and method capable of correcting position based on transition between floors
JP6289791B1 (en) Information processing apparatus and information processing system
KR101492061B1 (en) Method and Apparatus for Estimating Movement Information by Combination of Movement Sensing Results in Segmentation Regions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Tokyo, Japan

Applicant after: MITSUBISHI ELECTRIC Corp.

Applicant after: Mitsubishi Electric Building Solutions Co.,Ltd.

Address before: Tokyo, Japan

Applicant before: MITSUBISHI ELECTRIC Corp.

Applicant before: MITSUBISHI ELECTRIC BUILDING TECHNO-SERVICE Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20240115

Address after: Tokyo, Japan

Applicant after: Mitsubishi Electric Building Solutions Co.,Ltd.

Address before: Tokyo, Japan

Applicant before: MITSUBISHI ELECTRIC Corp.

Applicant before: Mitsubishi Electric Building Solutions Co.,Ltd.

GR01 Patent grant