WO2023032507A1 - Position estimation system and position estimation method - Google Patents
Position estimation system and position estimation method
- Publication number
- WO2023032507A1 (PCT/JP2022/028348)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- space
- coordinates
- indoor space
- communication
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Definitions
- the present invention relates to a position estimation system and a position estimation method.
- Patent Document 1 discloses a technique for detecting the position of a video projection device from MAC addresses and radio wave intensities of a plurality of access points.
- the present invention provides a position estimating system and a position estimating method capable of estimating where and what kind of object is located in a given space with high accuracy.
- a position estimation system according to one aspect of the present invention includes: an acquisition unit that acquires either image information of an image showing a space in which an object is located, or temperature distribution information of the space; an estimation unit that estimates the coordinates of the object in the space based on the acquired information; a position information acquisition unit that acquires target position information, which is position information of the target, from a positioning system that measures the position of an object based on a state of communication between a first communication device held by the object located in the space and a plurality of second communication devices installed in the space; and a control unit that stores, in a storage unit, identification information of the target included in the acquired target position information in association with the estimated coordinates.
- a position estimation method according to one aspect of the present invention includes: an acquisition step of acquiring either image information of an image showing a space in which an object is located, or temperature distribution information of the space; an estimation step of estimating the coordinates of the object in the space based on the acquired information; a position information acquisition step of acquiring target position information, which is position information of the target, from a positioning system that measures the position of an object based on a state of communication between a first communication device held by the object located in the space and a plurality of second communication devices installed in the space; and a storage step of storing identification information of the target included in the acquired target position information in association with the estimated coordinates.
- a program according to one aspect of the present invention is a program for causing a computer to execute the position estimation method.
- the position estimation system and position estimation method of the present invention can highly accurately estimate where and what kind of object is located in a predetermined space.
- FIG. 1 is a block diagram showing a functional configuration of a position estimation system according to Embodiment 1.
- FIG. 2 is a diagram showing an indoor space to which the position estimation system according to Embodiment 1 is applied.
- FIG. 3 is a flowchart of an operation example of the position estimation system according to Embodiment 1.
- FIG. 4 is a diagram schematically showing a plurality of pieces of position information.
- FIG. 5 is a block diagram showing the functional configuration of the position estimation system according to Embodiment 2.
- FIG. 6 is a diagram showing an indoor space to which the position estimation system according to Embodiment 2 is applied.
- FIG. 7 is a diagram schematically showing a thermal image.
- FIG. 8 is a flowchart of an operation example of the position estimation system according to Embodiment 2.
- FIG. 9 is a block diagram showing the functional configuration of the positioning system according to the modification.
- FIG. 10 is a diagram showing an indoor space to which the positioning system according to the modification is applied.
- each figure is a schematic diagram and is not necessarily illustrated strictly. Moreover, in each figure, the same reference signs are given to substantially the same components, and redundant description may be omitted or simplified.
- FIG. 1 is a block diagram showing a functional configuration of a position estimation system according to Embodiment 1.
- FIG. 2 is a diagram showing an indoor space to which the position estimation system according to Embodiment 1 is applied.
- the position estimation system 10 is a system that acquires image information of an image of the indoor space 50 output by the camera 20 and estimates the coordinates of an object among the objects positioned in the indoor space 50 based on the acquired image information.
- the indoor space 50 is, for example, an office space, but may be a space in a commercial facility or an indoor space in other facilities such as a space in a house.
- the object is, for example, a living organism such as person A or person B, but may be a tangible object (an object other than a person, that is, a non-living object) such as chair C.
- the position estimation system 10 includes a camera 20, a server device 30, and a positioning system 40. Note that the position estimation system 10 may include multiple cameras 20.
- the camera 20 is installed, for example, on the ceiling of the indoor space 50, and images the indoor space 50 from above. Also, the camera 20 transmits image information of the captured image to the server device 30 .
- An image captured by the camera 20 is, for example, a still image.
- the camera 20 may capture a moving image, and the image captured by the camera 20 in this case is, for example, a still image corresponding to one frame forming the moving image.
- Camera 20 is implemented by, for example, an image sensor.
- the camera 20 is detachably connected to, for example, a power supply terminal of the lighting device 22 installed on the ceiling of the indoor space 50, and receives power from the lighting device 22 to operate.
- the power supply terminal is, for example, a USB (Universal Serial Bus) terminal.
- the camera 20 may be fixed directly to the ceiling of the indoor space 50 without the lighting device 22 interposed therebetween.
- the camera 20 may be fixed to a wall or the like so as to capture an image of the indoor space 50 from the side.
- the server device 30 acquires the image information generated by the camera 20, and estimates the coordinates of the object located in the indoor space 50 based on the acquired image information.
- the server device 30 is an edge computer provided in a facility (building) that constitutes the indoor space 50, but may be a cloud computer provided outside the facility.
- the server device 30 includes a communication section 31 , an information processing section 32 and a storage section 33 .
- the communication unit 31 is a communication module (communication circuit) for the server device 30 to communicate with the camera 20 and the positioning system 40 .
- the communication unit 31 receives image information from the camera 20, for example. Also, the communication unit 31 receives position information of an object located in the indoor space 50 from the positioning system 40 .
- the communication performed by the communication unit 31 may be wireless communication or wired communication.
- the communication standard used for communication is also not particularly limited.
- the information processing section 32 acquires the image information received by the communication section 31 and performs information processing for estimating the coordinates of the object located in the indoor space 50 based on the acquired image information.
- the information processing section 32 is specifically realized by a processor or a microcomputer.
- the information processing section 32 includes an acquisition section 34 , an estimation section 35 , a position information acquisition section 36 and a control section 37 .
- the functions of the acquisition unit 34, the estimation unit 35, the position information acquisition unit 36, and the control unit 37 are realized by the processor or microcomputer constituting the information processing section 32 executing a computer program stored in the storage unit 33. Details of the functions of the acquisition unit 34, the estimation unit 35, the position information acquisition unit 36, and the control unit 37 will be described later.
- the storage unit 33 is a storage device that stores image information and position information received by the communication unit 31, computer programs executed by the information processing unit 32, and the like.
- the storage unit 33 also stores a machine learning model, which will be described later, and registration information indicating what kind of object the first identification information, which will be described later, specifically indicates.
- the storage unit 33 is implemented by a semiconductor memory, HDD (Hard Disk Drive), or the like.
- the positioning system 40 measures the position of an object in the indoor space 50 based on the state of communication between the first communication device 41 held by the object located in the indoor space 50 and the plurality of second communication devices 42 installed in the indoor space 50.
- the positioning system 40 includes a plurality of first communication devices 41, a plurality of second communication devices 42, and a positioning server device 43. Note that the positioning system 40 only needs to include at least one first communication device 41.
- the first communication device 41 is a beacon transmitter that transmits beacon signals.
- the first communication device 41 is, for example, a dedicated beacon transmitter, but may be a portable information terminal (such as a smart phone) capable of operating as a beacon transmitter.
- the first communication device 41 is held by an object (person or tangible object) located in the indoor space 50 .
- the beacon signal contains the first identification information of the object holding the first communication device 41.
- the first identification information may be the identification information of the first communication device 41 itself.
- the second communication device 42 is a beacon receiver (scanner) that receives beacon signals transmitted by the first communication device 41 .
- the second communication device 42 measures the received signal strength (RSSI: Received Signal Strength Indicator) of the received beacon signal, and transmits to the positioning server device 43 signal strength information in which the first identification information contained in the beacon signal is associated with the second identification information of the second communication device 42 itself.
- the second communication device 42 is, for example, detachably connected to a power supply terminal of the lighting device 22 installed on the ceiling of the indoor space 50, and operates by receiving power from the lighting device 22.
- the power supply terminal is, for example, a USB terminal.
- the second communication device 42 may be fixed directly to the ceiling of the indoor space 50 without the lighting device 22 interposed therebetween. Also, the second communication device 42 may be fixed to a wall or the like.
- the plurality of second communication devices 42 are two-dimensionally distributed when viewed from above.
- the positioning server device 43 acquires the signal strength information of the beacon signal transmitted by the first communication device 41 from each of the plurality of second communication devices 42, and measures the position of the object holding the first communication device 41 based on the acquired signal strength information.
- the positioning server device 43 is an edge computer provided in a facility (building) that constitutes the indoor space 50, but may be a cloud computer provided outside the facility.
- the positioning server device 43 includes a communication section 44 , an information processing section 45 and a storage section 46 .
- the communication unit 44 is a communication module (communication circuit) for the positioning server device 43 to communicate with the plurality of second communication devices 42 and the server device 30 .
- the communication unit 44 receives signal strength information from each of the plurality of second communication devices 42, for example.
- the communication unit 44 also transmits the position information of the object located in the indoor space 50 to the server device 30 .
- the communication performed by the communication unit 44 may be wireless communication or wired communication.
- the communication standard used for communication is also not particularly limited.
- the information processing unit 45 measures the position of an object located in the indoor space 50 based on the multiple pieces of signal intensity information received by the communication unit 44, and outputs position information indicating the measured position.
- the output position information is transmitted to the server device 30 by the communication unit 44 .
- for example, the information processing unit 45 measures the position of person A based on a plurality of pieces of signal strength information, each including the first identification information of person A (the first identification information contained in the beacon signal transmitted by the first communication device 41 held by person A) and the second identification information of a second communication device 42, together with arrangement information indicating the arrangement (installation positions) of the plurality of second communication devices 42 in the indoor space 50, and outputs position information in which the measured position is associated with the first identification information of person A (that is, the position information of person A).
- the arrangement information is specifically information that associates the second identification information of the second communication device 42 with the coordinates (two-dimensional coordinates) of the installation position of the second communication device 42 . Any existing algorithm may be used as a method of measuring a position based on a plurality of pieces of signal strength information and arrangement information.
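As one simple, hypothetical instance of such an algorithm, the position can be estimated as a centroid of the receiver positions weighted by received power. All identifiers and values below are illustrative and not taken from the publication:

```python
# Sketch of RSSI-based positioning by weighted centroid: an illustrative
# stand-in for the "existing algorithm" mentioned above, not the method
# actually used by the positioning server device 43.

def weighted_centroid(readings, placements):
    """Estimate a 2-D position from beacon RSSI readings.

    readings:   dict mapping second-communication-device ID -> RSSI in dBm
    placements: dict mapping second-communication-device ID -> (x, y) coords
    """
    total_w = 0.0
    x = y = 0.0
    for dev_id, rssi in readings.items():
        # Stronger (less negative) RSSI -> larger weight;
        # 10**(rssi/10) converts dBm to a linear power ratio.
        w = 10 ** (rssi / 10.0)
        px, py = placements[dev_id]
        x += w * px
        y += w * py
        total_w += w
    return (x / total_w, y / total_w)

# Three receivers (arrangement information) and one beacon's readings.
placements = {"recv-1": (0.0, 0.0), "recv-2": (4.0, 0.0), "recv-3": (0.0, 4.0)}
readings = {"recv-1": -50.0, "recv-2": -70.0, "recv-3": -70.0}
pos = weighted_centroid(readings, placements)  # pulled strongly toward recv-1
```

The strongly negative dBm scale makes the nearest receiver dominate, which is why the estimate lands close to recv-1 in this example.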
- the information processing section 45 is specifically realized by a processor or a microcomputer.
- the functions of the information processing section 45 are realized by executing a computer program stored in the storage section 46 by the processor or microcomputer constituting the information processing section 45 .
- the storage unit 46 is a storage device that stores the signal strength information received by the communication unit 44, the arrangement information indicating the arrangement of the plurality of second communication devices 42, the computer programs executed by the information processing unit 45, and the like.
- the storage unit 46 also stores registration information indicating what kind of object each piece of first identification information specifically represents.
- the storage unit 46 is specifically implemented by a semiconductor memory, HDD, or the like.
- as described above, the positioning system 40 can measure the position of the object holding the first communication device 41, based on the received signal strength of the beacon signal transmitted by the first communication device 41 at each of the plurality of second communication devices 42 and on the arrangement information of the plurality of second communication devices 42.
- the positioning system 40 can specifically measure where and what kind of object is located in the indoor space 50 .
- the accuracy of the position measured by the positioning system 40 may not be very high.
- the position estimation system 10 uses both the image information of the image captured by the camera 20 and the position information of the object provided by the positioning system 40 to estimate the position of the object with high accuracy.
- An operation example of such a position estimation system 10 will be described below.
- FIG. 3 is a flow chart of an example operation of the position estimation system 10 .
- the communication unit 31 of the server device 30 receives image information from the camera 20 (S11).
- the received image information is stored in the storage section 33 by the information processing section 32 .
- the image information is, for example, image information of an image when the indoor space 50 is viewed from above.
- the acquisition unit 34 acquires the image information received by the communication unit 31 and stored in the storage unit 33 (S12), and the estimation unit 35 estimates the coordinates of the object in the indoor space 50 based on the acquired image information (S13).
- the estimation unit 35 estimates the position of the object in the image by, for example, performing object detection processing using deep learning (a machine learning model) on the image information (image), and transforms the estimated position in the image into coordinates in the indoor space 50.
- the estimation unit 35 performs object detection processing based on methods such as R-CNN (Region-based Convolutional Neural Network), YOLO (You Only Look Once), or SSD (Single Shot MultiBox Detector).
- a machine learning model for performing these object detection processes is constructed using, for example, images obtained by imaging the indoor space 50 (or other indoor space) from above as learning data.
- the storage unit 33 stores table information indicating the correspondence between pixel positions in the image and coordinates in the indoor space 50.
- using this table information, the estimation unit 35 can transform the position of the object in the image into the coordinates of the object in the indoor space 50.
- the coordinates of the position of the object estimated by the estimation unit 35 are two-dimensional coordinates of the indoor space 50 as viewed from above.
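In the simplest case of a ceiling camera looking straight down, the table information reduces to a linear scale between image pixels and room coordinates. The image size, room size, and sample pixel below are illustrative assumptions, not values from the publication:

```python
# Minimal sketch of the pixel-to-room-coordinate conversion described above,
# assuming a straight-down top view so the mapping is a plain linear scale.

IMG_W, IMG_H = 640, 480      # assumed image resolution in pixels
ROOM_W, ROOM_H = 8.0, 6.0    # assumed room extent in metres (top view)

def pixel_to_room(px, py):
    """Map a pixel position to 2-D room coordinates in metres."""
    return (px / IMG_W * ROOM_W, py / IMG_H * ROOM_H)

# An object detected at the image centre maps to the centre of the room.
coords = pixel_to_room(320, 240)
```

A real camera would add lens distortion and perspective, which is why the patent describes a stored lookup table rather than a closed-form scale.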
- FIG. 4 is a diagram schematically showing a plurality of pieces of position information. As shown in FIG. 4, in each of the plurality of pieces of position information, first identification information of an object is associated with the position (coordinates) of the object. The position of the object is, for example, two-dimensional coordinates when the indoor space 50 is viewed from above.
- next, the communication unit 31 receives a plurality of pieces of position information from the positioning system 40 (S14). The position information acquisition unit 36 acquires the plurality of pieces of position information received by the communication unit 31 and stored in the storage unit 33 (S15), and from among them obtains (selects), as target position information, the position information corresponding to the object whose coordinates were estimated in step S13 (S16).
- the control unit 37 regards the first identification information included in the acquired target position information as the identification information of the object, and stores it in the storage unit 33 as coordinate information in association with the coordinates estimated in step S13 (S17).
- in this way, the coordinate information of all objects is stored in the storage unit 33. Note that what kind of object each piece of first identification information specifically indicates is registered in the storage unit 33 in advance.
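One natural way to pair an image-estimated coordinate with a positioning-system record is a nearest-neighbour match. The sketch below is an assumed implementation of that pairing; the identifiers and sample positions are illustrative:

```python
# Hypothetical sketch of steps S16-S17: for each coordinate estimated from
# the image, select the positioning-system record whose measured position is
# nearest, and store its identification information with the coordinates.
import math

def associate(estimated, position_records):
    """position_records: list of (first_id, (x, y)) from the positioning system.
    Returns the first_id whose measured position is closest to `estimated`."""
    best_id, best_d = None, math.inf
    for first_id, (x, y) in position_records:
        d = math.hypot(x - estimated[0], y - estimated[1])
        if d < best_d:
            best_id, best_d = first_id, d
    return best_id

# Coarse positions from the positioning system (first identification info + position).
records = [("person-A", (1.1, 0.9)), ("person-B", (3.8, 2.2)), ("chair-C", (6.0, 5.5))]

coordinate_info = {}  # what the control unit would store in the storage unit
for est in [(1.0, 1.0), (4.0, 2.0)]:   # precise coordinates estimated from the image
    coordinate_info[associate(est, records)] = est
```

The stored result keeps the precise image-derived coordinates while borrowing the identity from the coarser beacon measurement, which is the division of labour the text describes.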
- the coordinate information stored in this manner is provided by the control unit 37 to an information terminal (not shown) such as a personal computer or a smartphone, and visualized by the information terminal.
- the user viewing the display of the information terminal can easily grasp the position of the object in the indoor space 50. If the object is a tangible object, it becomes easier to locate and maintain that object.
- the coordinate information may be provided to a control device (not shown) that controls equipment such as an air conditioner.
- in this case, the control device can control the equipment based on the person's position in the indoor space 50.
- the position estimation system 10 highly accurately estimates the coordinates of an object based on image information.
- the position estimation system 10 identifies the identification information of the object using the position information acquired from the positioning system 40 .
- the position estimation system 10 can accurately estimate where and what kind of object is located in the indoor space 50 and manage where and what kind of object is located in the indoor space 50 .
- the estimation unit 35 estimates the coordinates of the object in the indoor space 50 by performing object detection processing on the image information (image) acquired by the acquisition unit 34 .
- the estimation unit 35 may perform a process of dividing the image into regions.
- the estimation unit 35 may perform segmentation using deep learning (machine learning model).
- the estimation unit 35 can estimate the coordinates of the object in the indoor space 50 based on the position of the area in the image where the object is shown.
- a machine learning model for performing segmentation is constructed, for example, using images obtained by imaging the indoor space 50 (or other indoor space) from above as learning data.
- FIG. 5 is a block diagram showing the functional configuration of the position estimation system according to Embodiment 2.
- FIG. 6 is a diagram showing an indoor space 50 to which the position estimation system according to Embodiment 2 is applied.
- the position estimation system 10a is a system that acquires temperature distribution information indicating the temperature distribution of the indoor space 50 output by the infrared sensor 21, and estimates the coordinates of an object located in the indoor space 50 based on the acquired temperature distribution information.
- the position estimation system 10a includes an infrared sensor 21, a server device 30, and a positioning system 40. That is, the position estimation system 10a includes an infrared sensor 21 instead of the camera 20.
- the position estimation system 10a may include a plurality of infrared sensors 21 .
- the components other than the infrared sensor 21 are the same as those in the first embodiment, so detailed description thereof will be omitted.
- the infrared sensor 21 is installed, for example, on the ceiling of the indoor space 50, and generates temperature distribution information (hereinafter also referred to as a thermal image) indicating the temperature distribution when the indoor space 50 is viewed from above.
- the temperature distribution information is transmitted to the server device 30 .
- the infrared sensor 21 is, for example, an infrared array sensor (thermal image sensor) configured as an array of 8 × 8 infrared detection elements. In other words, the thermal image generated by the infrared sensor 21 has 8 × 8 pixels.
- the thermal image shows the temperature distribution in the sensing range of the infrared sensor 21 at a resolution of 8 × 8.
- FIG. 7 is a diagram schematically showing a thermal image. Each of the 8 × 8 small regions in FIG. 7 represents a pixel of the thermal image.
- a numerical value in a pixel is a pixel value, and specifically indicates a temperature.
- the temperature here is the surface temperature of the indoor space 50 .
- the infrared sensor 21 is not limited to an infrared array sensor, and may be, for example, a sensor that scans the indoor space 50 with a single infrared detection element, or an infrared image sensor with relatively high resolution.
- the infrared sensor 21 is detachably connected to, for example, a power supply terminal of the lighting device 22 installed on the ceiling of the indoor space 50, and receives power from the lighting device 22 to operate.
- the power supply terminal is, for example, a USB terminal.
- the infrared sensor 21 may be fixed directly to the ceiling of the indoor space 50 without the lighting device 22 interposed therebetween. Further, the infrared sensor 21 may be fixed to a wall or the like to generate a thermal image showing the temperature distribution when the indoor space 50 is viewed from the side.
- the position estimation system 10a uses both the temperature distribution information from the infrared sensor 21 and the object position information provided by the positioning system 40 to estimate the position of the object with high accuracy.
- the object here is an object having a temperature difference from its surroundings, such as a person (living organism), but may be a tangible object (an object other than a person) having a temperature difference from its surroundings.
- An operation example of such a position estimation system 10a will be described below.
- FIG. 8 is a flowchart of an operation example of the position estimation system 10a.
- the communication unit 31 of the server device 30 receives the temperature distribution information from the infrared sensor 21 (S21).
- the received temperature distribution information is stored in the storage unit 33 by the information processing unit 32 .
- the temperature distribution information indicates, for example, the temperature distribution when the indoor space 50 is viewed from above.
- the acquisition unit 34 acquires the temperature distribution information received by the communication unit 31 and stored in the storage unit 33 (S22), and the estimation unit 35 applies a super-resolution technique to the acquired temperature distribution information to increase its resolution (S23).
- in the following, the temperature distribution information is also referred to as a thermal image.
- for example, the estimation unit 35 increases the resolution of the thermal image by applying SRGAN (Super-Resolution Generative Adversarial Network) to the thermal image.
- the method for increasing the resolution of the thermal image is not limited to SRGAN, and the estimation unit 35 may increase the resolution of the thermal image by applying SRCNN (Super-Resolution Convolutional Neural Network) to the thermal image.
- in this way, a high-resolution thermal image can be generated even from an inexpensive, low-resolution infrared sensor 21. Note that applying the super-resolution technique to the thermal image is not essential, and the processing for increasing its resolution may be omitted.
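As a rough stand-in for the learned SRGAN/SRCNN upscaling, the effect of increasing the thermal image's resolution can be sketched with plain bilinear interpolation. The image contents and scale factor below are illustrative assumptions:

```python
# Bilinear upscaling of an 8x8 thermal image: a simple, non-learned sketch
# of what the super-resolution step produces (a denser temperature grid).

def upscale(img, factor):
    """Bilinearly upscale a 2-D list of temperatures by an integer factor."""
    h, w = len(img), len(img[0])
    H, W = h * factor, w * factor
    out = []
    for Y in range(H):
        y = min(Y / factor, h - 1)          # source row (fractional)
        y0 = int(y); y1 = min(y0 + 1, h - 1); fy = y - y0
        row = []
        for X in range(W):
            x = min(X / factor, w - 1)      # source column (fractional)
            x0 = int(x); x1 = min(x0 + 1, w - 1); fx = x - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

thermal = [[20.0] * 8 for _ in range(8)]   # ambient background
thermal[3][4] = 32.0                        # a warm spot (e.g. a person)
hi_res = upscale(thermal, 4)                # 8x8 -> 32x32 grid
```

Unlike SRGAN, bilinear interpolation cannot invent plausible detail; it only smooths, which is why the patent points to learned super-resolution for better results.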
- the estimation unit 35 estimates the coordinates of the object in the indoor space 50 based on the thermal image (temperature distribution information) to which the super-resolution technology has been applied (S24).
- the estimation unit 35 estimates the position of the object in the thermal image by, for example, performing object detection processing using deep learning (a machine learning model) on the thermal image, and transforms the estimated position in the thermal image into coordinates in the indoor space 50.
- the estimation unit 35 performs object detection processing based on techniques such as R-CNN, YOLO, or SSD.
- the machine learning model for performing these object detection processes is constructed using, as learning data, thermal images obtained by imaging the indoor space 50 (or another indoor space) from above, instead of general color images.
- the storage unit 33 also stores table information indicating the correspondence between pixel positions in the thermal image and coordinates in the indoor space 50.
- using this table information, the estimation unit 35 can transform the position of the object in the thermal image into the coordinates of the object in the indoor space 50.
- the coordinates of the position of the object estimated by the estimation unit 35 are two-dimensional coordinates of the indoor space 50 as viewed from above.
- steps S25 to S28 are the same as the processes of steps S14 to S17 of the first embodiment.
- the position estimation system 10a highly accurately estimates the coordinates of the object based on the temperature distribution information.
- the position estimation system 10a identifies the identification information of the object using the position information acquired from the positioning system 40.
- the position estimation system 10a can estimate with high accuracy where and what kind of object is located in the indoor space 50 and can manage where and what kind of object is located in the indoor space 50 .
- the estimation unit 35 estimates the coordinates of the object in the indoor space 50 by performing object detection processing on the thermal image acquired by the acquisition unit 34 .
- the estimation unit 35 may perform a process of segmenting the thermal image.
- the estimation unit 35 may perform segmentation using deep learning (machine learning model).
- the estimating unit 35 can estimate the coordinates of the object in the indoor space 50 based on the position of the area in which the object is shown in the thermal image.
- a machine learning model for performing segmentation is constructed, for example, using thermal images obtained by imaging the indoor space 50 (or other indoor space) from above as learning data.
- the estimation unit 35 may estimate the coordinates of the object by performing information processing on the thermal image based on a rule-based algorithm that does not use a machine learning model. For example, the estimating unit 35 may perform a process of detecting a pixel having a maximum pixel value among a plurality of pixels included in the thermal image.
- a pixel having a maximum pixel value means a pixel whose pixel value is a local maximum in the two-dimensional arrangement of pixels; in other words, a pixel whose value, when pixel values at the same point in time are compared, is higher than the values of the surrounding pixels.
- when the estimation unit 35 detects a pixel that has a local maximum pixel value and whose pixel value is equal to or greater than a predetermined value (for example, 30°C or higher), it can estimate that an object exists in the indoor space 50. The estimation unit 35 then applies the table information to the position of that pixel, thereby estimating the coordinates of the object's position in the indoor space 50 .
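A minimal sketch of this rule-based detection, assuming a small 2-D list of temperatures in degrees Celsius (the 4-neighbour local-maximum definition and the array format are illustrative assumptions):

```python
def detect_objects(thermal, threshold=30.0):
    """Return pixel positions that are local maxima (strictly warmer
    than their 4-neighbours) and at or above `threshold` degrees C."""
    rows, cols = len(thermal), len(thermal[0])
    hits = []
    for r in range(rows):
        for c in range(cols):
            v = thermal[r][c]
            if v < threshold:
                continue  # not warm enough to be the target object
            neighbours = [
                thermal[nr][nc]
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                if 0 <= nr < rows and 0 <= nc < cols
            ]
            if all(v > n for n in neighbours):
                hits.append((r, c))
    return hits

room = [
    [20.0, 21.0, 20.5],
    [21.0, 36.5, 21.0],  # a person-warm local maximum at (1, 1)
    [20.0, 21.0, 20.0],
]
print(detect_objects(room))  # -> [(1, 1)]
```

The detected pixel position would then be fed through the table information to obtain room coordinates.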
- Another example of information processing based on a rule-based algorithm is processing that detects temporal changes in the pixel values (temperatures) of the pixels included in the thermal image. Assuming that there is no heat source other than the object in the indoor space 50, the pixel values (temperatures) of the pixels in the thermal image change only slowly over time while no object is present in the indoor space 50. When an object enters the indoor space 50 in this state, the pixel values of the pixels in the part of the thermal image where the object appears change (increase) abruptly.
- by monitoring the temporal change in the pixel value of each of the plurality of pixels, the estimation unit 35 can estimate that an object exists in the indoor space 50 when a pixel value increases sharply.
- the estimation unit 35 can estimate the coordinates of the position of the object in the indoor space 50 by applying the table information described above to the position of the pixel whose pixel value has increased sharply.
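The frame-difference idea above can be sketched as follows; the 5 °C jump threshold and the frame format are assumptions for illustration:

```python
def sudden_increases(prev_frame, curr_frame, jump=5.0):
    """Return pixel positions whose value rose by more than `jump`
    degrees between consecutive frames -- a possible object entering."""
    hits = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, v) in enumerate(zip(prev_row, curr_row)):
            if v - p > jump:
                hits.append((r, c))
    return hits

frame_t0 = [[20.0, 20.5], [20.0, 20.0]]
frame_t1 = [[20.1, 20.6], [34.0, 20.1]]  # (1, 0) jumps: object entered
print(sudden_increases(frame_t0, frame_t1))  # -> [(1, 0)]
```

As the text notes, the table information would then map the flagged pixel position to coordinates in the indoor space 50.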
- the positioning system 40 measures the position of the object holding the first communication device 41 based on the received signal strength of the beacon signal transmitted by the first communication device 41 at each of the plurality of second communication devices 42 .
- position estimation system 10 or position estimation system 10a may include another positioning system instead of positioning system 40 .
- The other positioning system measures the position of the object holding the first communication device based on, for example, the received signal strength, at the first communication device, of the beacon signals transmitted by each of the plurality of second communication devices. That is, the position estimation system 10 or the position estimation system 10a may include a positioning system in which the transmission and reception roles of the beacon signal are reversed relative to the positioning system 40 .
- FIG. 9 is a block diagram showing the functional configuration of the positioning system according to the modification.
- FIG. 10 is a diagram showing an indoor space 50 to which the positioning system according to the modification is applied.
- the positioning system 60 measures the position of an object in the indoor space 50 based on the communication state between a first communication device 61 held by the object located in the indoor space 50 and each of a plurality of second communication devices 62 installed in the indoor space 50 .
- the positioning system 60 includes a plurality of first communication devices 61 , a plurality of second communication devices 62 and a positioning server device 63 .
- the positioning system 60 may include at least one first communication device 61 .
- the first communication device 61 is a beacon receiver (scanner) that receives beacon signals transmitted by each of the plurality of second communication devices 62 .
- the first communication device 61 measures the received signal strength indicator (RSSI) of each received beacon signal, associates the measured received signal strength with the second identification information of the second communication device 62 included in the beacon signal and with the first identification information of the held object, and transmits the resulting signal strength information to the positioning server device 63 .
- the first identification information may be identification information of the first communication device 61 itself.
- the first communication device 61 is, for example, a portable information terminal (smartphone, etc.) capable of operating as a beacon receiver, but may be a dedicated beacon receiver. As shown in FIG. 10 , the first communication device 61 is held by an object (person or tangible object) located in the indoor space 50 .
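As a rough illustration of the signal strength information described above, a report combining the RSSI with the two pieces of identification information might be modeled like this (the field names and values are hypothetical, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class SignalStrengthReport:
    """One piece of signal strength information sent to the positioning
    server device (field names are illustrative assumptions)."""
    first_id: str    # identification of the held object (or the scanner itself)
    second_id: str   # identification of the beacon-transmitting device
    rssi_dbm: float  # measured received signal strength

report = SignalStrengthReport(first_id="person-A",
                              second_id="beacon-62-03",
                              rssi_dbm=-58.0)
print(report.second_id)  # -> beacon-62-03
```

One such report per received beacon gives the server the plurality of signal strength information it needs for positioning.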
- the second communication device 62 is a beacon transmitter that transmits beacon signals.
- the beacon signal includes second identification information of the second communication device 62 .
- the second communication device 62 is detachably connected to, for example, a power supply terminal of the lighting device 22 installed on the ceiling of the indoor space 50, and operates by receiving power from the lighting device 22 .
- the power supply terminal is, for example, a USB terminal.
- the second communication device 62 may be fixed directly to the ceiling of the indoor space 50 without the lighting device 22 interposed therebetween. The second communication device 62 may also be fixed to a wall or the like. Note that, as shown in FIG. 10, the plurality of second communication devices 62 are two-dimensionally distributed when viewed from above.
- the positioning server device 63 acquires, from the first communication device 61, a plurality of pieces of signal strength information corresponding to the plurality of second communication devices 62, and, based on the acquired signal strength information, measures the position of the object that holds the first communication device 61 and is located in the indoor space 50 .
- the positioning server device 63 is an edge computer provided in the facility (building) that constitutes the indoor space 50, but may be a cloud computer provided outside the facility.
- the positioning server device 63 includes a communication section 64 , an information processing section 65 and a storage section 66 .
- the communication unit 64 is a communication module (communication circuit) for the positioning server device 63 to communicate with the plurality of first communication devices 61 and the server device 30 .
- the communication unit 64 receives, for example, a plurality of pieces of signal strength information corresponding to the plurality of second communication devices 62 from each of the plurality of first communication devices 61 .
- the communication unit 64 also transmits the position information of the object located in the indoor space 50 to the server device 30 .
- the communication performed by the communication unit 64 may be wireless communication or wired communication.
- the communication standard used for communication is also not particularly limited.
- the information processing unit 65 measures the position of the object located in the indoor space 50 based on the multiple pieces of signal intensity information received by the communication unit 64, and outputs position information indicating the measured position.
- the output position information is transmitted to the server device 30 by the communication unit 64 .
- for example, the information processing unit 65 measures the position of person A based on the plurality of pieces of signal strength information, corresponding to the plurality of second communication devices 62, transmitted by the first communication device 61 possessed by person A, and on arrangement information indicating the arrangement of the plurality of second communication devices 62 in the indoor space 50, and then associates the measured position with the first identification information of person A (that is, outputs it as the position information of person A).
- Any existing algorithm may be used as a method of measuring a position based on a plurality of pieces of signal strength information and arrangement information.
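Since the patent leaves the algorithm open, one common existing choice is sketched here for illustration: an RSSI-weighted centroid over the known positions of the second communication devices 62. The weighting scheme and identifiers are assumptions:

```python
def weighted_centroid(readings, placement):
    """Estimate a 2-D position from RSSI readings.

    readings:  {second_id: rssi_dbm} from one first communication device
    placement: {second_id: (x, y)} -- the arrangement information

    A stronger (less negative) RSSI pulls the estimate toward that
    transmitter; the inverse-magnitude weight is a simplifying assumption.
    """
    weights = {sid: 1.0 / max(1.0, -rssi) for sid, rssi in readings.items()}
    total = sum(weights.values())
    x = sum(w * placement[sid][0] for sid, w in weights.items()) / total
    y = sum(w * placement[sid][1] for sid, w in weights.items()) / total
    return (x, y)

placement = {"b1": (0.0, 0.0), "b2": (4.0, 0.0), "b3": (0.0, 4.0)}
readings = {"b1": -50.0, "b2": -50.0, "b3": -90.0}  # b3 is far away
x, y = weighted_centroid(readings, placement)
print(round(x, 2), round(y, 2))
```

The weak b3 reading contributes little, so the estimate lands between b1 and b2, as expected for a receiver near that edge of the room.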
- the information processing section 65 is specifically realized by a processor or a microcomputer.
- the functions of the information processing section 65 are realized by executing a computer program stored in the storage section 66 by a processor or microcomputer constituting the information processing section 65 .
- the storage unit 66 is a storage device that stores the signal strength information received by the communication unit 64, arrangement information indicating the arrangement of the plurality of second communication devices 62, the computer program executed by the information processing unit 65, and the like.
- the storage unit 66 is specifically implemented by a semiconductor memory, HDD, or the like.
- as described above, the positioning system 60 can measure the position of the object holding the first communication device 61 based on the received signal strength, at the first communication device 61, of the beacon signal transmitted by each of the plurality of second communication devices 62 .
- the positioning system 60 can specifically measure where and what kind of object is located in the indoor space 50 .
- Position estimation system 10 or position estimation system 10 a may include positioning system 60 instead of positioning system 40 .
- the location information acquisition unit 36 of the server device 30 acquires location information from the positioning system 60 .
- as described above, the position estimation system 10 or the position estimation system 10a includes: an acquisition unit 34 that acquires either image information of an image showing the indoor space 50 where the object is located, or temperature distribution information of the indoor space 50; an estimation unit 35 that estimates the coordinates of the object in the indoor space 50 based on the acquired information; a position information acquisition unit 36 that acquires target position information, which is the position information of the object, from a positioning system that measures the position of an object based on the state of communication between a first communication device held by the object located in the indoor space 50 and each of a plurality of second communication devices installed in the indoor space 50; and a control unit 37 that stores the identification information of the object included in the acquired target position information in the storage unit 33 in association with the estimated coordinates. The positioning system here is the positioning system 40 or the positioning system 60, the first communication device is the first communication device 41 or the first communication device 61, and the second communication device is the second communication device 42 or the second communication device 62 .
- Such a position estimation system 10 or position estimation system 10a is capable of estimating where and what kind of object is located in the indoor space 50 with high accuracy.
- the position information acquisition unit 36 acquires, from the positioning system, a plurality of pieces of position information each indicating the position of an object located in the indoor space 50, and acquires, as the target position information, the position information indicating the position of the object closest to the estimated coordinates.
- Such a position estimation system 10 or position estimation system 10a can specify the identification information of the object based on the distance relationship between the estimated coordinates of the object and the position indicated by the position information.
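The nearest-position matching described above can be sketched as follows; the record layout and the identifiers are illustrative assumptions:

```python
import math

def match_target(estimated_xy, candidates):
    """Pick, from the positioning system's position information, the
    entry whose position is closest to the estimated coordinates, so
    its identification information can be attached to the detection."""
    def dist(entry):
        x, y = entry["position"]
        return math.hypot(x - estimated_xy[0], y - estimated_xy[1])
    return min(candidates, key=dist)

candidates = [
    {"id": "person-A", "position": (1.2, 3.9)},
    {"id": "person-B", "position": (6.5, 0.8)},
]
print(match_target((1.0, 4.0), candidates)["id"])  # -> person-A
```

The coarse (but identified) positioning-system fix thus labels the precise (but anonymous) coordinates from the image or thermal estimate.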
- the acquisition unit 34 acquires image information of an image when the indoor space 50 is viewed from above, and the estimation unit 35 estimates coordinates based on the acquired image information.
- the coordinates are two-dimensional coordinates when the indoor space 50 is viewed from above.
- Such a position estimation system 10 can highly accurately estimate where and what kind of object is located in the indoor space 50 based on the image information.
- the acquisition unit 34 acquires temperature distribution information indicating the temperature distribution of the indoor space 50 when the indoor space 50 is viewed from above, and the estimation unit 35 estimates the coordinates based on the acquired temperature distribution information. The coordinates are two-dimensional coordinates when the indoor space 50 is viewed from above.
- Such a position estimation system 10 can highly accurately estimate where and what kind of object is located in the indoor space 50 based on the temperature distribution information.
- the positioning system 40 measures the position of the object based on the received signal strength of the beacon signal transmitted by the first communication device 41 in each of the plurality of second communication devices 42 .
- Such a position estimation system 10 or position estimation system 10a can highly accurately estimate where and what kind of object is located in the indoor space 50 based on the position information provided by the positioning system 40.
- the positioning system 60 measures the position of an object based on the received signal strength of the beacon signal transmitted by each of the plurality of second communication devices 62 at the first communication device 61 .
- Such a position estimation system 10 or position estimation system 10a can highly accurately estimate where and what kind of object is located in the indoor space 50 based on the position information provided by the positioning system 60.
- the object is a person.
- Such a position estimation system 10 or position estimation system 10a can estimate who is located where in the indoor space 50 with high accuracy.
- the target object is an object other than a person.
- Such a position estimation system 10 or position estimation system 10a can highly accurately estimate where and what kind of tangible object is located in the indoor space 50.
- the position estimation method executed by a computer such as the position estimation system 10 or the position estimation system 10a includes acquiring either image information of an image showing the indoor space 50 where the object is located or temperature distribution information of the indoor space 50, estimating the coordinates of the object in the indoor space 50 based on the acquired information, acquiring the target position information of the object from the positioning system, and storing the identification information of the object included in the target position information in association with the estimated coordinates.
- Such a position estimation method can highly accurately estimate what kind of object is located where in the indoor space 50 .
- the position estimation system is implemented by a plurality of devices, but may be implemented as a single device.
- the position estimation system may be implemented as a single device that corresponds to the server device.
- each component included in the position estimation system may be distributed to the plurality of devices in any way.
- part or all of the functions of the positioning server device may be provided by the server device.
- processing executed by a specific processing unit may be executed by another processing unit.
- order of multiple processes may be changed, and multiple processes may be executed in parallel.
- each component may be realized by executing a software program suitable for each component.
- Each component may be realized by reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory by a program execution unit such as a CPU or processor.
- each component may be realized by hardware.
- each component may be a circuit (or integrated circuit). These circuits may form one circuit as a whole, or may be separate circuits. These circuits may be general-purpose circuits or dedicated circuits.
- the present invention may be implemented in a system, apparatus, method, integrated circuit, computer program, or recording medium such as a computer-readable CD-ROM.
- any combination of systems, devices, methods, integrated circuits, computer programs and recording media may be implemented.
- the present invention may be implemented as a computer-implemented position estimation method, such as a position estimation system.
- the present invention may be implemented as a program for causing a computer to execute the position estimation method, or as a computer-readable non-transitory recording medium storing such a program.
Description
[Configuration]
First, the configuration of the position estimation system according to Embodiment 1 will be described. FIG. 1 is a block diagram showing the functional configuration of the position estimation system according to Embodiment 1. FIG. 2 is a diagram showing the indoor space to which the position estimation system according to Embodiment 1 is applied.
As described above, the positioning system 40 can measure the position of the object holding the first communication device 41 based on the received signal strength, at each of the plurality of second communication devices 42, of the beacon signal transmitted by the first communication device 41, and on the arrangement information of the plurality of second communication devices 42. Specifically, the positioning system 40 can measure where and what kind of object is located in the indoor space 50. However, the accuracy of the position measured by the positioning system 40 may not be very high.
[Configuration]
Next, the configuration of the position estimation system according to Embodiment 2 will be described. FIG. 5 is a block diagram showing the functional configuration of the position estimation system according to Embodiment 2. FIG. 6 is a diagram showing the indoor space 50 to which the position estimation system according to Embodiment 2 is applied.
The position estimation system 10a estimates the position of the object with high accuracy by jointly using the temperature distribution information obtained by the infrared sensor 21 and the object position information provided by the positioning system 40. The object here is an object having a temperature difference from its surroundings, for example a person (living being), but it may also be a tangible object (an object other than a person) having a temperature difference from its surroundings. An operation example of such a position estimation system 10a is described below. FIG. 8 is a flowchart of an operation example of the position estimation system 10a.
The position estimation system and the position estimation method according to the embodiments have been described above; however, the present invention is not limited to these embodiments.
20 camera
21 infrared sensor
22 lighting device
30 server device
31, 44, 64 communication unit
32, 45, 65 information processing unit
33, 46, 66 storage unit
34 acquisition unit
35 estimation unit
36 position information acquisition unit
37 control unit
40, 60 positioning system
41, 61 first communication device
42, 62 second communication device
43, 63 positioning server device
50 indoor space (space)
Claims (10)
- A position estimation system comprising: an acquisition unit that acquires either image information of an image showing a space in which an object is located, or temperature distribution information of the space; an estimation unit that estimates coordinates of the object in the space based on the acquired information; a position information acquisition unit that acquires target position information, which is position information of the object, from a positioning system that measures the position of an object based on a communication state between a first communication device held by the object located in the space and each of a plurality of second communication devices installed in the space; and a control unit that stores identification information of the object included in the acquired target position information in a storage unit in association with the estimated coordinates.
- The position estimation system according to claim 1, wherein the position information acquisition unit acquires, from the positioning system, a plurality of pieces of position information each indicating a position of an object located in the space, and acquires, as the target position information, the position information indicating the position of the object closest to the estimated coordinates.
- The position estimation system according to claim 1 or 2, wherein the acquisition unit acquires the image information of the image when the space is viewed from above, the estimation unit estimates the coordinates based on the acquired image information, and the coordinates are two-dimensional coordinates when the space is viewed from above.
- The position estimation system according to claim 1 or 2, wherein the acquisition unit acquires the temperature distribution information indicating the temperature distribution of the space when the space is viewed from above, the estimation unit estimates the coordinates based on the acquired temperature distribution information, and the coordinates are two-dimensional coordinates when the space is viewed from above.
- The position estimation system according to claim 1 or 2, wherein the positioning system measures the position of the object based on the received signal strength, at each of the plurality of second communication devices, of a beacon signal transmitted by the first communication device.
- The position estimation system according to claim 1 or 2, wherein the positioning system measures the position of the object based on the received signal strength, at the first communication device, of beacon signals transmitted by each of the plurality of second communication devices.
- The position estimation system according to claim 1 or 2, wherein the object is a person.
- The position estimation system according to claim 1 or 2, wherein the object is an object other than a person.
- A position estimation method comprising: an acquisition step of acquiring either image information of an image showing a space in which an object is located, or temperature distribution information of the space; an estimation step of estimating coordinates of the object in the space based on the acquired information; a position information acquisition step of acquiring target position information, which is position information of the object, from a positioning system that measures the position of an object based on a communication state between a first communication device held by the object located in the space and each of a plurality of second communication devices installed in the space; and a storage step of storing identification information of the object included in the acquired target position information in association with the estimated coordinates.
- A program for causing a computer to execute the position estimation method according to claim 9.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22864095.9A EP4397988A1 (en) | 2021-08-31 | 2022-07-21 | Position estimation system and position estimation method |
JP2023545150A JPWO2023032507A1 (ja) | 2021-08-31 | 2022-07-21 | |
CN202280051095.2A CN117677860A (zh) | 2021-08-31 | 2022-07-21 | 位置估计系统和位置估计方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021140723 | 2021-08-31 | ||
JP2021-140723 | 2021-08-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023032507A1 true WO2023032507A1 (ja) | 2023-03-09 |
Family
ID=85412120
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/028348 WO2023032507A1 (ja) | 2021-08-31 | 2022-07-21 | 位置推定システム、及び、位置推定方法 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4397988A1 (ja) |
JP (1) | JPWO2023032507A1 (ja) |
CN (1) | CN117677860A (ja) |
WO (1) | WO2023032507A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014048416A (ja) | 2012-08-30 | 2014-03-17 | Ricoh Co Ltd | 装置、システム |
US20170228949A1 (en) * | 2016-02-04 | 2017-08-10 | Sensormatic Electronics, LLC | Access Control System with Curtain Antenna System |
JP2018204922A (ja) * | 2017-06-09 | 2018-12-27 | アズビル株式会社 | 人検知装置および方法 |
JP2020020645A (ja) * | 2018-07-31 | 2020-02-06 | 清水建設株式会社 | 位置検出システム及び位置検出方法 |
JP2020112441A (ja) * | 2019-01-11 | 2020-07-27 | 株式会社Where | 情報処理装置、位置算出システム |
JP2021169963A (ja) * | 2020-04-15 | 2021-10-28 | ダイキン工業株式会社 | 位置特定方法、及び位置特定システム |
- 2022
- 2022-07-21 CN CN202280051095.2A patent/CN117677860A/zh active Pending
- 2022-07-21 EP EP22864095.9A patent/EP4397988A1/en active Pending
- 2022-07-21 WO PCT/JP2022/028348 patent/WO2023032507A1/ja active Application Filing
- 2022-07-21 JP JP2023545150A patent/JPWO2023032507A1/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
CN117677860A (zh) | 2024-03-08 |
EP4397988A1 (en) | 2024-07-10 |
JPWO2023032507A1 (ja) | 2023-03-09 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22864095; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2023545150; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 202280051095.2; Country of ref document: CN |
| WWE | Wipo information: entry into national phase | Ref document number: 2022864095; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2022864095; Country of ref document: EP; Effective date: 20240402 |