CN102572390B - Apparatus and method for monitoring motion of monitored objects - Google Patents

Apparatus and method for monitoring motion of monitored objects

Info

Publication number
CN102572390B
CN102572390B (application CN201110397969.3A)
Authority
CN
China
Prior art keywords
action
information
cluster
candidate
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201110397969.3A
Other languages
Chinese (zh)
Other versions
CN102572390A (en)
Inventor
浅原彰规 (Akinori Asahara)
佐藤晓子 (Akiko Sato)
秋山高行 (Takayuki Akiyama)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Publication of CN102572390A publication Critical patent/CN102572390A/en
Application granted granted Critical
Publication of CN102572390B publication Critical patent/CN102572390B/en

Landscapes

  • Alarm Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present invention provides an apparatus and a method for monitoring the actions of monitored persons that can adjust, according to position information, the granularity at which those actions are analyzed. The monitoring apparatus monitors the actions of a plurality of monitored persons within a monitored area. It comprises a processor and a storage device connected to the processor, the storage device storing positioning data indicating the positions of mobile terminals carried by the monitored persons. The processor classifies the monitored persons' actions on the basis of the positioning data, selects a change candidate from the classified actions, extracts a plurality of positioning data corresponding to the action selected as the candidate, outputs information on the extracted positioning data, and, when so instructed, changes the classification of the action selected as the candidate.

Description

Apparatus and method for monitoring the actions of monitored persons
Technical field
The present invention relates to surveillance systems that use position information, and more particularly to a technique for analyzing a monitored person's actions from that person's movement route.
Background technology
Techniques have been proposed for monitoring a monitored person's actions by referring to the person's movement trajectory, that is, the movement route, together with images captured by surveillance cameras.
For example, Patent Document 1 discloses a technique for retrieving images captured by surveillance cameras using a monitored person's movement route.
Patent Document 2 discloses a technique for detecting anomalies by comparing a monitored person's position information obtained with RFID against surveillance camera images and checking the correspondence between the two.
Once a monitored person's movement route has been obtained, the person's actions can be analyzed by analyzing that route. However, the granularity required of this analysis differs depending on the purpose for which the monitored person's actions are analyzed. In other words, whether two actions need to be distinguished by the analysis depends on the purpose of the analysis.
For example, when analyzing monitored persons' use of a staircase, if it is sufficient simply to judge whether a monitored person has moved, there is no need to judge how much time the person took to pass through the staircase. The same applies to passages (corridors) other than staircases.
However, if the monitored persons' actions are analyzed, for example, for a survey concerning the structure of the staircase, it may be necessary to classify the action "passed through the staircase in a short time" and the action "passed through the staircase over a long time" as different actions. Even in this example, the time a monitored person takes to pass through a corridor need not be treated as an issue. Conventionally, all ranges were analyzed against the same criterion, so the granularity of analysis could not be set arbitrarily, according to the purpose, for each analyzed range (a corridor, a staircase, and so on).
Patent Document 1: Japanese Patent Laid-Open No. 2010-123069
Patent Document 2: Japanese Patent Laid-Open No. 2006-311111
Summary of the invention
The present invention has been made in view of the above problems, and its object is to make it possible, when analyzing monitored persons' movement routes, to extract the necessary information by specifying an arbitrary granularity for each range.
The invention provides a monitoring apparatus that monitors the actions of a plurality of monitored persons in a monitored area, the monitoring apparatus comprising a processor and a storage device connected to the processor, wherein the storage device stores positioning data indicating the positions of mobile terminals carried by the plurality of monitored persons; the processor classifies the monitored persons' actions on the basis of the positioning data; the processor selects a change candidate from the plurality of classified actions; the processor extracts a plurality of positioning data corresponding to the action selected as the candidate; the processor outputs information on the extracted positioning data; and, when an instruction to change the classification of the action selected as the candidate is input, the processor changes the classification of that action.
According to one embodiment of the invention, by adjusting the granularity of analysis for each range, the information needed to analyze monitored persons' actions can be extracted from position information.
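As a rough illustration of the claimed processing flow, the following is a minimal Python sketch. Every name in it (classify, review, the "speed" field, the label strings) is hypothetical and merely stands in for the processing units described in the embodiments below.

```python
# hypothetical stand-in for the classification unit (clustering, see Fig. 13)
def classify(records):
    return ["walk" if r["speed"] > 0.2 else "stay" for r in records]

def review(records):
    """Sketch of the claimed flow: classify actions, pick a change candidate,
    output the positioning data behind it, and reclassify on instruction."""
    labels = classify(records)
    candidate = "stay"                               # action selected as candidate
    evidence = [r for r, lab in zip(records, labels) if lab == candidate]
    print("evidence for", candidate, ":", evidence)  # information for the manager
    instruction = "idle"                             # manager's reclassification
    return [instruction if lab == candidate else lab for lab in labels]

records = [{"speed": 0.5}, {"speed": 0.0}, {"speed": 0.3}]
print(review(records))                               # ['walk', 'idle', 'walk']
```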
Brief description of the drawings
Fig. 1 is a block diagram showing the configuration of the facility monitoring system of the first embodiment of the present invention.
Fig. 2 is a block diagram showing the hardware configuration of the monitoring server of the first embodiment of the present invention.
Fig. 3 is a flowchart showing the overall operation of the facility monitoring system of the first embodiment of the present invention.
Fig. 4 is a sequence diagram showing the transmission of positioning results executed in the first embodiment of the present invention.
Fig. 5 is a sequence diagram showing the transmission of detection information executed in the first embodiment of the present invention.
Fig. 6 is an explanatory diagram of a positioning result sent from a mobile terminal or an environment-side positioning device of the first embodiment of the present invention.
Fig. 7 is an explanatory diagram of sensor information sent from a sensor of the first embodiment of the present invention.
Fig. 8 is an explanatory diagram of the sensor information stored in the sensor information DB of the first embodiment of the present invention.
Fig. 9 is an explanatory diagram of the map information stored in the indoor map DB of the first embodiment of the present invention.
Fig. 10 is an explanatory diagram of the sensor parameters stored in the indoor map DB of the first embodiment of the present invention.
Fig. 11 is a flowchart showing the movement route analysis process executed by the monitoring server of the first embodiment of the present invention.
Fig. 12A is a flowchart showing the feature computation and state label generation process executed by the monitoring server of the first embodiment of the present invention.
Fig. 12B is an explanatory diagram of the partitioning of positioning data in the first embodiment of the present invention.
Fig. 12C is an explanatory diagram of the feature computation in the first embodiment of the present invention.
Fig. 12D is an explanatory diagram of the clustering in the first embodiment of the present invention.
Fig. 13 is a flowchart showing the clustering process executed by the monitoring server of the first embodiment of the present invention.
Fig. 14 is an explanatory diagram of the state labels obtained by the monitoring server of the first embodiment of the present invention.
Fig. 15 is an explanatory diagram of the statistical model used by the monitoring server of the first embodiment of the present invention.
Fig. 16 is an explanatory diagram of the state transition extraction process executed by the monitoring server of the first embodiment of the present invention.
Fig. 17 is an explanatory diagram of the state transition model stored in the analysis information DB of the first embodiment of the present invention.
Fig. 18 is an explanatory diagram of the cluster information stored in the analysis information DB of the first embodiment of the present invention.
Fig. 19 is an explanatory diagram of the output screen of the analysis status presentation process displayed by the screen display device of the first embodiment of the present invention.
Fig. 20 is a flowchart showing the analysis condition setting screen presentation process executed by the monitoring server of the first embodiment of the present invention.
Fig. 21 is an explanatory diagram of the analysis condition acceptance screen presentation process executed by the monitoring server of the first embodiment of the present invention.
Fig. 22 is a flowchart showing the analysis condition adjustment screen presentation process executed by the monitoring server of the first embodiment of the present invention.
Fig. 23 is an explanatory diagram of the target pedestrian selection process executed by the monitoring server of the first embodiment of the present invention.
Fig. 24 is a flowchart of the sensor/positioning correspondence process executed by the monitoring server of the first embodiment of the present invention.
Fig. 25 is an explanatory diagram of the sensor information presentation image displayed by the screen display device of the first embodiment of the present invention.
Fig. 26 is a flowchart showing the analysis condition adjustment process executed by the monitoring server of the first embodiment of the present invention.
Fig. 27 is a flowchart showing the movement route analysis process executed by the monitoring server of the second embodiment of the present invention.
Fig. 28 is an explanatory diagram of the state judgment dictionary stored in the analysis information DB of the second embodiment of the present invention.
Fig. 29 is an explanatory diagram of the state judgment setting information stored in the analysis information DB of the second embodiment of the present invention.
Fig. 30A is a flowchart showing the state label estimation process based on the state judgment dictionary executed by the monitoring server of the second embodiment of the present invention.
Fig. 30B is an explanatory diagram of the process of searching the entries of the state judgment dictionary in the second embodiment of the present invention.
Fig. 30C is an explanatory diagram of the assignment of state labels in the second embodiment of the present invention.
Fig. 31A is a flowchart showing the analysis parameter adjustment candidate selection process executed by the monitoring server of the second embodiment of the present invention.
Fig. 31B is an explanatory diagram of the process of searching for intervals assigned the same state label in the second embodiment of the present invention.
Fig. 31C is an explanatory diagram of the process of determining states of finer granularity in the second embodiment of the present invention.
Fig. 31D is an explanatory diagram of the process of selecting states with a high degree of agreement in the second embodiment of the present invention.
Fig. 32 is an explanatory diagram of the sensor information presentation image displayed by the screen display device of the second embodiment of the present invention.
Fig. 33A is a flowchart showing the analysis parameter adjustment candidate selection process executed by the monitoring server of the third embodiment of the present invention.
Fig. 33B is an explanatory diagram of the process of determining states of finer granularity in the third embodiment of the present invention.
Fig. 34 is an explanatory diagram of the state judgment setting information stored in the analysis information DB of the fourth embodiment of the present invention.
Symbol description
100 monitoring server
101 sensor information management unit
102 positioning record management unit
103 sensor/positioning integration unit
104 movement route analysis unit
105 analysis condition adjustment unit
106 analysis condition setting screen generation unit
107 analysis result screen generation unit
111 sensor information database (DB)
112 positioning DB
113 indoor map DB
114 analysis information DB
115 user DB
120 screen display device
130 sensor
140 mobile terminal
150 environment-side positioning device
160A, 160B networks
Embodiment
Embodiments of the present invention will now be described with reference to the drawings.
<First Embodiment>
Fig. 1 is a block diagram showing the configuration of the facility monitoring system of the first embodiment of the present invention.
The facility monitoring system of this embodiment comprises a monitoring server 100, a screen display device 120, one or more sensors 130, one or more mobile terminals 140, one or more environment-side positioning devices 150, and networks 160A and 160B interconnecting them.
Based on the position information of the mobile terminals 140 in the monitored area, the monitoring server 100 monitors the actions of the monitored persons who carry those mobile terminals 140.
Here, the monitored area is the region monitored by the facility monitoring system of this embodiment, for example a facility such as a factory or a shop. When the monitored area is a factory, the monitored persons are, for example, the factory's staff; when the monitored area is a shop, the monitored persons are, for example, the shop's staff or its customers. This embodiment takes an indoor monitored area such as a factory as a typical example, but the present invention is also applicable to outdoor monitoring.
To monitor the monitored area, the monitoring server 100 comprises a sensor information management unit 101, a positioning record management unit 102, a sensor/positioning integration unit 103, a movement route analysis unit 104, an analysis condition adjustment unit 105, an analysis condition setting screen generation unit 106, an analysis result screen generation unit 107, a sensor information database (DB) 111, a positioning DB 112, an indoor map DB 113, an analysis information DB 114, and a user DB 115. The processing executed by these units, the data stored in each DB, and the hardware configuration that realizes them are described later.
The screen display device 120 is, for example, a CRT (Cathode Ray Tube) or a liquid crystal display device. Examples of the screens displayed on the screen display device 120 are described later.
In the example of Fig. 1, the sensors 130 are connected to the monitoring server 100 via the network 160A, and the mobile terminals 140 and the environment-side positioning devices 150 are connected to the monitoring server 100 via the network 160B. Independent networks of different kinds may be provided in this way, or a single network may be shared. These networks may be a LAN (Local Area Network), a WAN (Wide Area Network), a public wireless network, the Internet, or the like, and each may be either wired or wireless.
A mobile terminal 140 is a device carried by each monitored person. Since this embodiment uses the position information of the mobile terminals 140, at least one of the mobile terminal 140 and the environment-side positioning device 150 must have a function for measuring the position of the mobile terminal 140. The mobile terminal 140 is, for example, a portable telephone, a PHS (Personal Handy-phone System) terminal, a computer or PDA (Personal Digital Assistant) with a wireless communication function, or a wireless ID tag that at least transmits unique identification information.
When the mobile terminal 140 is, for example, a portable telephone or PHS equipped with a GPS (Global Positioning System) receiver, the mobile terminal 140 appends information identifying itself (ID information, for example a telephone number) to the position information obtained by the GPS receiver and sends the result to the monitoring server 100 via the network 160B. Instead of GPS positioning, the mobile terminal 140 may use positioning based on signals sent from base stations, in which case the base stations sending the positioning signals can serve as environment-side positioning devices 150.
When the mobile terminal 140 is, for example, a computer or PDA, it can calculate its own position from the intensity (or reception timing) of radio waves sent from wireless LAN access points or radio beacons installed in the monitored area, or of optical signals emitted by optical beacons, append its ID information to the information indicating the calculated position, and send the result to the monitoring server 100. In this case, the wireless LAN access points, radio beacons, or optical beacons can serve as environment-side positioning devices 150.
When the environment-side positioning devices 150 are wireless LAN access points, the mobile terminal 140 may transmit a positioning signal together with its ID information, and a plurality of environment-side positioning devices 150 measure the times at which they receive the positioning signal. If the position of each environment-side positioning device 150 is known in advance, the position of the mobile terminal 140 can be calculated from those positions and the differences in reception time, by a method similar to triangulation. AirLocation (registered trademark) is known as an example of this positioning technique.
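As an illustration of this reception-time-based positioning, the following is a minimal sketch assuming Python with NumPy and SciPy. The setup (anchor coordinates, propagation speed, noise-free arrival times) is entirely hypothetical; a real system additionally needs clock synchronization and error handling, and this is not the method of AirLocation itself.

```python
import numpy as np
from scipy.optimize import least_squares

C = 3.0e8  # assumed propagation speed (m/s)

def locate(anchors, t_recv):
    """Estimate the 2-D position of a tag (and its unknown emission time)
    from the times at which fixed anchors received its positioning signal."""
    def residuals(params):
        x, y, t0 = params
        dists = np.hypot(anchors[:, 0] - x, anchors[:, 1] - y)
        return dists - C * (t_recv - t0)   # range implied by each arrival time

    guess = np.array([*anchors.mean(axis=0), t_recv.min() - 1e-7])
    return least_squares(residuals, guess).x[:2]

# toy check: a tag at (12, 7) heard by four access points
anchors = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 30.0], [50.0, 30.0]])
true_pos = np.array([12.0, 7.0])
t_recv = np.hypot(*(anchors - true_pos).T) / C   # emission time taken as 0
print(locate(anchors, t_recv))                   # approximately [12. 7.]
```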
When the mobile terminal 140 is a wireless ID tag, the environment-side positioning devices 150 are transmitter/receivers that access the mobile terminal 140. For example, when the mobile terminal 140 is a so-called RFID (Radio Frequency Identification) tag, a plurality of environment-side positioning devices 150 are installed at predetermined positions in the monitored area, and when the monitored person carrying the mobile terminal 140 approaches one of them, that environment-side positioning device 150 obtains the ID information of the mobile terminal 140 by a wireless signal.
The environment-side positioning device 150 then appends, to the obtained ID information, information that identifies its own position, and sends the result to the monitoring server 100. The information identifying the position of the environment-side positioning device 150 may be, for example, the ID information of that device, or information indicating its coordinates. If the correspondence between each device's ID information and its coordinates is known in advance, the device's position can be determined from its ID information, and the approximate position of the mobile terminal 140 that approached it can be determined from the device's position.
Alternatively, the mobile terminal 140 may be a wireless ID tag that transmits a predetermined positioning signal. A plurality of environment-side positioning devices 150 measure the times at which they receive the positioning signal, and the position of the mobile terminal 140 can be determined from the differences between those times. If the positioning signal contains the ID information of the mobile terminal 140, the environment-side positioning devices 150 can send that ID information together with the measured position information to the monitoring server 100.
If the monitoring server 100 stores in advance information associating the ID information of each mobile terminal 140 with the identification information of the monitored person who carries it, the server can determine from this information which monitored person's position the received position information represents.
Specifically, information associating each monitored person's identification information (for example, the person's name or staff code) with information identifying the mobile terminal 140 that the person carries is registered in the user DB 115.
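A minimal sketch of this association, assuming a plain Python dict as the user DB; all IDs and names are made up.

```python
# hypothetical user DB 115 content: terminal ID -> monitored person
user_db = {
    "tel:+81-3-0000-0001": {"name": "Worker A", "staff_code": "F-1024"},
    "rfid:04:A2:9F":       {"name": "Worker B", "staff_code": "F-2048"},
}

def person_for(positioning_result):
    """Resolve which monitored person a received positioning result refers to."""
    return user_db.get(positioning_result["pedestrian_id"])

print(person_for({"pedestrian_id": "rfid:04:A2:9F", "x": 12.0, "y": 7.0}))
```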
A sensor 130 obtains information indicating the actions of monitored persons in the monitored area and sends the obtained information to the monitoring server 100 via the network 160A. The description below mainly takes surveillance cameras as an example of the sensors 130, but a sensor 130 may be another kind of sensor, such as a microphone, an ultrasonic sensor, or an infrared sensor, or even a vending machine with a function for sending its sales records to the monitoring server 100.
When the sensors 130 are surveillance cameras, each sensor 130 photographs, at predetermined timing (for example, periodically), the predetermined range assigned to it within the monitored area, and sends the resulting image data to the monitoring server 100. The sensor information management unit 101 stores the sent image data in the sensor information DB 111. The image data may be either still image data or moving image data. Moving images may be captured continuously at a predetermined frame rate, or captures of a fixed duration may be repeated intermittently at predetermined intervals. By referring to the images captured in this way, the actions of monitored persons within a certain range during a certain period can be grasped.
When the sensors 130 are microphones, each records, continuously or intermittently, the sound of the predetermined range assigned to it within the monitored area. The recorded audio data is sent to the monitoring server 100 and stored in the sensor information DB 111 by the sensor information management unit 101. From the stored audio data, monitored persons' actions can be detected (for example, walking, stopping, moving an object, opening or closing a door, locking or unlocking, packing or unpacking).
Fig. 2 is a block diagram showing the hardware configuration of the monitoring server 100 of the first embodiment of the present invention.
The monitoring server 100 of this embodiment is a computer comprising a processor 201, a main memory 202, an input device 203, an interface (I/F) 205, and a storage device 206, interconnected with one another.
The processor 201 executes the programs stored in the main memory 202.
The main memory 202 is, for example, a semiconductor memory, and stores the programs executed by the processor 201 and the data referenced by the processor 201. Specifically, at least part of the programs and data stored in the storage device 206 is copied into the main memory 202 as needed.
The input device 203 accepts input from the manager of the facility monitoring system (the person who monitors the monitored persons using the monitoring server 100). The input device 203 may include, for example, a keyboard and a mouse.
The I/F 205 is connected to the networks 160A and 160B and is the interface for communicating with the sensors 130, the mobile terminals 140, and the environment-side positioning devices 150. When the networks 160A and 160B are independent of each other, the monitoring server 100 has a plurality of I/Fs 205, one connected to the network 160A and another connected to the network 160B.
The storage device 206 is, for example, a hard disk drive (HDD) or a nonvolatile storage device such as a flash memory. The storage device 206 of this embodiment stores at least the sensor information management unit 101, the positioning record management unit 102, the sensor/positioning integration unit 103, the movement route analysis unit 104, the analysis condition adjustment unit 105, the analysis condition setting screen generation unit 106, the analysis result screen generation unit 107, the sensor information database (DB) 111, the positioning DB 112, the indoor map DB 113, the analysis information DB 114, and the user DB 115.
The sensor information management unit 101, the positioning record management unit 102, the sensor/positioning integration unit 103, the movement route analysis unit 104, the analysis condition adjustment unit 105, the analysis condition setting screen generation unit 106, and the analysis result screen generation unit 107 are programs executed by the processor 201. In the following description, the processing performed by these units is in fact executed by the processor 201.
The monitoring server 100 shown in Fig. 1 may consist of a single computer as shown in Fig. 2, or of a plurality of computers that can communicate with one another. For example, one computer may hold the sensor information management unit 101, the positioning record management unit 102, the sensor information DB 111, and the positioning DB 112, and another computer the remaining units; or one computer may hold the processing units such as the sensor information management unit 101 and another computer the databases such as the sensor information DB 111.
Fig. 3 is a flowchart showing the overall operation of the facility monitoring system of the first embodiment of the present invention.
The processing shown in Fig. 3 starts after the indoor map DB 113 has been prepared. The contents of the indoor map DB 113 are described later (see Fig. 9 and Fig. 10).
The facility monitoring system of this embodiment executes a data collection step 310 and a data analysis step 320.
The data collection step 310 is executed to collect detection results and positioning results.
Specifically, the sensors 130 perform detection (step 331) and send the results to the sensor information management unit 101. The mobile terminals 140 or the environment-side positioning devices 150 perform positioning (step 332) and send the results to the positioning record management unit 102. The positioning result information sent includes at least information indicating the positions of the mobile terminals 140. The details of the information sent are described later (see Fig. 6 and Fig. 7).
The sensor information management unit 101 and the positioning record management unit 102 store the received information in the sensor information DB 111 and the positioning DB 112, respectively (step 311).
The data analysis step 320 is executed to analyze the detection results and positioning results collected and stored in the databases. For example, the monitoring server 100 may collect detection results and positioning results over a predetermined period (data collection step 310) and then execute the data analysis step 320 to analyze them.
Specifically, the movement route analysis unit 104 first executes a movement route analysis process based on the position information stored in the positioning DB 112 (step 321). This process is described in detail later (see Fig. 11 and others).
Next, the analysis condition setting screen generation unit 106 executes an analysis status presentation process based on the result of the movement route analysis process (step 322). Referring to the analysis status presented by this process, the manager inputs an instruction (step 333). Based on the input instruction, the analysis condition setting screen generation unit 106 judges whether the analysis result is appropriate (step 323). If the analysis result is judged appropriate, the analysis result presentation process is executed (step 326), and the processing ends.
If the analysis result is judged inappropriate (the analysis result needs correction), the analysis condition setting screen generation unit 106 executes an analysis condition setting screen presentation process (step 324). This process is described in detail later.
Then the analysis condition adjustment unit 105 executes an analysis condition adjustment process (step 325), and the processing returns to step 321.
Fig. 4 is a sequence diagram showing the transmission of positioning results executed in the first embodiment of the present invention.
Specifically, Fig. 4 shows the processing executed to collect positioning results in the data collection step 310 of Fig. 3.
A mobile terminal 140 performs positioning and sends a positioning result containing the obtained information to the positioning record management unit 102 (step 401). The positioning record management unit 102 stores positioning data containing the received information in the positioning DB 112 (step 402). These steps are repeated (steps 405, 406), accumulating positioning data in the positioning DB 112.
Meanwhile, the environment-side positioning devices 150 also perform positioning and send the obtained information to the positioning record management unit 102 (step 403). The positioning record management unit 102 stores the received information in the positioning DB 112 (step 404). Although omitted from Fig. 4, these steps are also executed repeatedly, accumulating positioning data in the positioning DB 112.
Fig. 4 shows an example in which the facility monitoring system uses multiple positioning methods together. Usually the facility containing the monitored area is used by many monitored persons, so many mobile terminals 140 may be present in the monitored area, and they are not necessarily devices of the same kind. For example, one mobile terminal 140 may be a portable telephone with a GPS receiver while another is a wireless ID tag. In that case, positioning results containing the portable telephone's position information are sent from the portable telephone itself, as in step 401, while positioning results containing the wireless ID tag's position information are sent from an environment-side positioning device 150, as in step 403.
When only one positioning method is used, positioning results are sent only from either the mobile terminals 140 or the environment-side positioning devices 150.
Fig. 5 is a sequence diagram showing the transmission of sensor information executed in the first embodiment of the present invention.
Specifically, Fig. 5 shows the processing executed to collect detection information in the data collection step 310 of Fig. 3.
A sensor 130 performs detection and sends the obtained information to the sensor information management unit 101 (step 501). The sensor information management unit 101 stores sensor information containing the received detection result in the sensor information DB 111 (step 502). The sensor information stored in the sensor information DB 111 is described later (see Fig. 8). These steps are repeated (steps 503 to 506), accumulating detection results in the sensor information DB 111.
Next, examples of the data used in the facility monitoring system of this embodiment are described with reference to Figs. 6 to 10.
Fig. 6 is an explanatory diagram of a positioning result sent from a mobile terminal 140 or an environment-side positioning device 150 of the first embodiment of the present invention.
As described above, the mobile terminal 140 or the environment-side positioning device 150 measures the position of the mobile terminal 140 and sends the resulting position information to the monitoring server 100. Fig. 6 shows an example of the positioning result sent in this way.
A positioning result 600 contains a pedestrian ID 601, a positioning system ID 602, an X coordinate 603, a Y coordinate 604, and a time 605.
The pedestrian ID 601 is information uniquely identifying a monitored person; it may be, for example, identification information of the mobile terminal 140 the person carries, such as a telephone number, a MAC (Media Access Control) address, or the ID of an RFID tag.
The positioning system ID 602 is a code indicating the positioning means. For example, "0" may be assigned as the positioning system ID 602 of a positioning result 600 containing coordinate values obtained by GPS positioning, and "1" as that of a positioning result 600 containing coordinate values obtained by an environment-side positioning device 150.
The X coordinate 603 and Y coordinate 604 are information identifying the position of the mobile terminal 140 (that is, of the monitored person carrying it) in the monitored area as coordinate values in a two-dimensional rectangular coordinate system. A Z coordinate may be added to the X coordinate 603 and Y coordinate 604 to handle three-dimensional position information. These coordinate values are examples; a rectangular coordinate system need not be used. For example, when the mobile terminal 140 has a GPS receiver, the X coordinate 603 and Y coordinate 604 may hold information indicating longitude and latitude, and information indicating altitude may also be included if necessary.
The time 605 indicates the time at which positioning was performed (in other words, the time at which the positioning result 600 was obtained). The time 605 may also include information indicating the date of positioning if necessary.
On receiving a positioning result 600, the positioning record management unit 102 generates positioning data and stores it in the positioning DB 112. The positioning data includes at least the position information contained in the positioning result 600. However, when different coordinate systems are used by different positioning systems, the positioning record management unit 102 may convert the coordinate values contained in the positioning result 600 (the X coordinate 603 and Y coordinate 604) so that the coordinate values contained in the positioning data are unified into a single coordinate system. This conversion can be performed by known methods, so its detailed description is omitted. Similarly, when different kinds of pedestrian IDs (telephone numbers, MAC addresses, RFIDs, and so on) are used by different positioning systems, the positioning record management unit 102 may convert these IDs into unified pedestrian IDs used within the monitoring server 100.
The positioning data generated in this way may have essentially the same format as the positioning result 600, but since the conversions corresponding to each positioning system have been completed as described above, the positioning data need not contain the positioning system ID 602. One positioning result 600 as shown in Fig. 6 contains one monitored person's position information at one time, and corresponds to one set of positioning data. The positioning DB 112 stores many sets of positioning data, that is, position information at multiple times for multiple monitored persons.
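A minimal sketch of this normalization step, assuming Python; the coordinate transform, ID mapping, and units are placeholders for whatever a real deployment uses, not the patent's concrete format.

```python
from dataclasses import dataclass

@dataclass
class PositioningData:                 # unified record stored in positioning DB 112
    pedestrian_id: str
    x: float                           # site coordinate system, metres (assumed)
    y: float
    time: float                        # epoch seconds (assumed)

ID_MAP = {"rfid:04:A2:9F": "P001"}     # hypothetical per-system ID -> unified ID

def normalize(result):
    """Turn a raw positioning result 600 into unified positioning data:
    convert coordinates and IDs, then drop the positioning-system ID."""
    x, y = result["x"], result["y"]
    if result["system_id"] == 0:       # GPS: lon/lat -> site coordinates
        x, y = lonlat_to_site(x, y)
    return PositioningData(ID_MAP.get(result["pedestrian_id"],
                                      result["pedestrian_id"]), x, y, result["time"])

def lonlat_to_site(lon, lat):
    # stand-in for a real map projection; returns offsets from a site origin
    ORIGIN_LON, ORIGIN_LAT, M_PER_DEG = 139.767, 35.681, 111_000.0
    return (lon - ORIGIN_LON) * M_PER_DEG, (lat - ORIGIN_LAT) * M_PER_DEG
```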
Fig. 7 is an explanatory diagram of the sensor information sent from a sensor 130 of the first embodiment of the present invention.
As described above, a sensor 130 performs detection and sends the result (the detection result) to the monitoring server 100. Fig. 7 shows an example of the sensor information sent in this way.
Sensor information 700 contains a sensor ID 701 and data 702.
The sensor ID 701 is information uniquely identifying the sensor 130 that sent the sensor information 700.
The data 702 is the data that the sensor 130 obtained as the result of detection, for example image data or audio data. The data 702 may be data processed by a DSP (digital signal processor) or unprocessed raw data, and may be compressed or uncompressed.
The sensor 130 may simply send the image data or the like obtained as the detection result, but it may also judge whether the detection result has changed from the previous one and include the result of that judgment in the data 702, or send only the result of that judgment as the data 702.
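As an illustration of that change judgment, here is a minimal frame-differencing sketch, assuming Python with NumPy; the threshold and the use of mean absolute pixel difference are assumptions for illustration, not the patent's method.

```python
import numpy as np

CHANGE_THRESHOLD = 8.0   # assumed mean absolute pixel difference (0-255 scale)

def has_changed(prev_frame: np.ndarray, frame: np.ndarray) -> bool:
    """Judge whether a new camera frame differs from the previous one
    enough to be worth reporting as a change."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(diff.mean()) > CHANGE_THRESHOLD

prev = np.zeros((120, 160), dtype=np.uint8)        # toy greyscale frames
cur = prev.copy(); cur[40:80, 60:100] = 200        # something entered the view
print(has_changed(prev, cur))                      # True
```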
Fig. 8 is an explanatory diagram of the sensor information stored in the sensor information DB 111 of the first embodiment of the present invention.
On receiving sensor information 700, the sensor information management unit 101 applies predetermined processing to it and stores it in the sensor information DB 111. At minimum, the sensor information management unit 101 must store the sensor information 700 in association with a time.
Sensor information 800 contains a sensor ID 801, a sensor category 802, a time 803, and data 804. The sensor ID 801 and data 804 correspond to the sensor ID 701 and data 702 of Fig. 7; that is, on receiving sensor information 700, the sensor information management unit 101 stores the sensor ID 701 and data 702 contained in it in the sensor information DB 111 as the sensor ID 801 and data 804, respectively.
The sensor category 802 indicates the kind of sensor 130 that sent the sensor information 700, for example whether it is a surveillance camera, a microphone, or another sensor.
The time 803 is information identifying the time at which the detection was performed. For example, when the data 804 is still image data, the time 803 may be the time it was captured. When the data 804 is data obtained over a certain duration, such as moving image data or audio data, the time 803 may be, for example, the pair of the detection start time and the detection duration, the pair of the start and end times, a representative time of the detection, or a combination of these.
One detection result obtained by a sensor 130 and its corresponding time are stored as one set of sensor information 800. The sensor information DB 111 stores many sets of sensor information 800, that is, information indicating the detections made by multiple sensors 130 at multiple times and their results.
Next, examples of the data stored in the indoor map DB 113 are described. The indoor map DB 113 stores map information 900 and sensor parameters 1000.
Fig. 9 is an explanatory diagram of the map information 900 stored in the indoor map DB 113 of the first embodiment of the present invention.
Map information 900 contains a feature ID 901, a category code 902, and a shape 903.
The feature ID 901 is information uniquely identifying a feature present in or around the monitored area. Here, features include floors, walls, pillars, shelving, partitions, and objects hanging from the ceiling (for example, air-conditioning ducts, loudspeakers, or lighting fixtures).
The category code 902 is information indicating the kind of feature, for example which of wall, pillar, beam, shelf, door, and so on the feature is. Whether a feature obstructs detection can be judged from its category code 902.
The shape 903 is information identifying the shape and size of the feature, for example a sequence of coordinate points representing the feature's outline.
The information about one feature is stored as one set of map information 900. The indoor map DB 113 stores many sets of map information 900, that is, map information 900 for multiple features.
Fig. 10 is an explanatory diagram of the sensor parameters 1000 stored in the indoor map DB 113 of the first embodiment of the present invention.
Sensor parameters 1000 contain a sensor ID 1001, a category code 1002, an installation position 1003, and sensor parameters 1004.
The sensor ID 1001 is information uniquely identifying each sensor 130 installed in the monitored area. The category code 1002 is information indicating the kind of sensor. These may be the same information as the sensor ID 801 and sensor category 802 of the sensor information 800, respectively.
The installation position 1003 is information identifying the installation position of the sensor 130 in the monitored area, for example two- or three-dimensional coordinate values.
The sensor parameters 1004 are information identifying the region detectable by the sensor 130. For example, when the sensor 130 is a surveillance camera, the sensor parameters 1004 may include information identifying the camera's direction, angle of view, resolution, and so on. When the sensor 130 is a microphone, the sensor parameters 1004 may include information identifying the microphone's direction, directivity, sensitivity, and so on.
The information about one sensor 130 is stored as one set of sensor parameters 1000. The indoor map DB 113 stores many sets of sensor parameters 1000, that is, sensor parameters 1000 for multiple sensors 130.
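As an illustration of how such parameters can be used, the following is a minimal sketch (Python) of testing whether a positioning fix falls inside a camera's detectable region. The flat field-of-view model ignores occlusion by features, and all parameter values are assumed.

```python
import math

def in_camera_view(px, py, cam_x, cam_y, heading_deg, fov_deg, max_range):
    """Check whether point (px, py) lies inside a camera's horizontal
    field of view: within range and within +/- fov/2 of its heading."""
    dx, dy = px - cam_x, py - cam_y
    if math.hypot(dx, dy) > max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    off = (bearing - heading_deg + 180.0) % 360.0 - 180.0   # signed angle diff
    return abs(off) <= fov_deg / 2.0

# toy sensor-parameter record (all values assumed)
cam = dict(cam_x=0.0, cam_y=0.0, heading_deg=45.0, fov_deg=60.0, max_range=20.0)
print(in_camera_view(5.0, 5.0, **cam))    # True: straight ahead of the camera
print(in_camera_view(-5.0, 5.0, **cam))   # False: outside the field of view
```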
Next, the steps shown in Fig. 3 are described in detail.
Fig. 11 is a flowchart showing the movement route analysis process executed by the monitoring server 100 of the first embodiment of the present invention.
The processing shown in Fig. 11 is executed in step 321 of Fig. 3.
First, the movement route analysis unit 104 computes features of the positioning data and generates state labels (step 1101). Specifically, the movement route analysis unit 104 clusters the computed features, thereby classifying the monitored persons' actions represented by the movement routes into multiple states, and assigns labels to those states. The detailed steps are described later (see Figs. 12A to 12D and others).
Next, the movement route analysis unit 104 applies the sequence of state-label transitions to a statistical model (step 1102). The detailed steps are described later.
Then the movement route analysis unit 104 extracts information from the statistical model (step 1103). The detailed steps are described later.
This completes the movement route analysis process.
The feature computation and state label generation performed in step 1101 are described with reference to Figs. 12A to 12D.
Fig. 12A is a flowchart showing the feature computation and state label generation process executed by the monitoring server 100 of the first embodiment of the present invention.
Fig. 12B is an explanatory diagram of the partitioning of positioning data in the first embodiment of the present invention.
Fig. 12C is an explanatory diagram of the feature computation in the first embodiment of the present invention.
Fig. 12D is an explanatory diagram of the clustering in the first embodiment of the present invention.
First, the movement route analysis unit 104 partitions the positioning data along the time axis (step 1201). Specifically, to compute the features of the positioning data at a certain time, it retrieves from the positioning DB 112 the positioning data at that time (the computation target) and the positioning data within a predetermined time range containing that time. Fig. 12B shows an example of the positioning data retrieved in this way: the black circle corresponds to the position represented by the computation-target positioning data, and the white circles to the positions represented by the positioning data within the predetermined time range.
Next, the movement route analysis unit 104 computes features from the positioning data obtained in step 1201 (step 1202). The features are, for example, the position information contained in the positioning data, and the monitored person's speed, acceleration, and so on, calculated from that position information.
Specifically, since the positioning data obtained in step 1201 contains position information for multiple times, the monitored person's movement speed can be calculated from it, and acceleration can in turn be calculated from the change in that speed. The features shown in Fig. 12C, for example, are:
(1) the speed 10 seconds ago;
(2) the acceleration 10 seconds ago;
(3) the current speed;
(4) the current acceleration;
(5) the speed 10 seconds later;
(6) the acceleration 10 seconds later;
(7) the average speed;
(8) the distance between the point 10 seconds ago and the current point.
Here, "current" means the time of the measurement for which the features are being computed (that is, the time 605 corresponding to that positioning data), and "10 seconds ago" and "10 seconds later" are relative to that "current" time.
The movement route analysis unit 104 may use any one of the features computed as above as the value representing a piece of positioning data, but usually a group of features is used (for example, the group of the eight features above). Such a group of features is generally treated as a vector whose elements are the features (a feature vector). An n-dimensional feature vector (8-dimensional in the example above) can be represented as a point in n-dimensional space. The description below uses such feature vectors (groups of features); a sketch of their computation follows.
The above features are an example; other features may be computed. The eight dimensions above are also an example; feature vectors of any dimension may be used. For example, a feature indicating the current position may be added as an element of the feature vector.
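A minimal sketch of computing this kind of feature vector, assuming Python with NumPy and a track of timestamped fixes; the 10-second offsets follow the example above, while the interpolation, differencing step, and helper names are assumptions.

```python
import numpy as np

DT = 10.0  # offset used in the example features (seconds)

def speed_accel(track, t):
    """Speed and acceleration at time t from a track of (time, x, y) rows,
    using finite differences of linearly interpolated positions."""
    times, xs, ys = track[:, 0], track[:, 1], track[:, 2]
    def pos(u):
        return np.array([np.interp(u, times, xs), np.interp(u, times, ys)])
    h = 1.0                                     # assumed differencing step (s)
    v1 = np.linalg.norm(pos(t) - pos(t - h)) / h
    v2 = np.linalg.norm(pos(t + h) - pos(t)) / h
    return (v1 + v2) / 2.0, (v2 - v1) / h

def feature_vector(track, t):
    """The eight example features for the fix at time t."""
    times, xs, ys = track[:, 0], track[:, 1], track[:, 2]
    feats = []
    for u in (t - DT, t, t + DT):               # (1)-(6): speed, accel at 3 times
        v, a = speed_accel(track, u)
        feats += [v, a]
    total = np.sum(np.linalg.norm(np.diff(track[:, 1:], axis=0), axis=1))
    feats.append(total / (track[-1, 0] - track[0, 0]))       # (7) average speed
    p_then = np.array([np.interp(t - DT, times, xs), np.interp(t - DT, times, ys)])
    p_now = np.array([np.interp(t, times, xs), np.interp(t, times, ys)])
    feats.append(float(np.linalg.norm(p_now - p_then)))      # (8) distance moved
    return np.array(feats)

track = np.array([[s, 0.5 * s, 0.0] for s in range(41)])     # walking at 0.5 m/s
print(feature_vector(track, 20.0).round(2))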
Next, the movement route analysis unit 104 clusters the feature vectors, including those computed in step 1202, and generates state labels (step 1203). As described later (see Fig. 13), this clustering is performed by a known algorithm such as the k-means method.
For example, by repeating steps 1201 and 1202 for each of the multiple monitored persons and multiple times, the movement route analysis unit 104 can compute a feature vector for each time for each monitored person. The clustering takes the many feature vectors computed in this way as its input.
In the example of Fig. 12D, multiple feature vectors (plotted as black dots) in an n-dimensional space (here 2-dimensional) are classified into multiple clusters (clusters A, B, and C in the example of Fig. 12D). The "A" to "C" shown in Fig. 12D are the generated state labels.
A small distance between two feature vectors means the vectors are similar, and similarity between two feature vectors means a high likelihood that the actions of the monitored persons corresponding to those vectors are the same.
Therefore, feature vectors judged by the clustering to belong to the same cluster are, for example even when they relate to different monitored persons, likely to correspond to the same action by those persons. In other words, the clustering of this embodiment amounts to classifying the monitored persons' actions on the basis of the positioning data, and a state label is the label of an action classified in this way.
Here, the action of the monitored person corresponding to a certain feature vector is the action that the manager judges this person to have been performing when the positioning data underlying that feature vector was obtained. In this embodiment, the "same action" means actions classified as identical, not necessarily actions that are actually identical. Moreover, as noted above, the criterion for judging whether two actions are the same depends on the purpose of monitoring the monitored persons' actions. In other words, feature vectors belonging to the same cluster are mutually similar and therefore likely to correspond to the same action, but they may in fact correspond to actions that ought to be classified as different.
In this embodiment, by further adjusting the result of the clustering based on the k-means method or the like, the clusters can be made to correspond appropriately to the actions the manager wants to distinguish. This adjustment is described later.
When the feature vector contains no information indicating the current position, as in (1) to (8) above, two mutually similar feature vectors may be classified into the same cluster no matter where in the monitored area the corresponding actions took place. This means, for example, that the action "passing through a staircase" is classified as the same action regardless of which staircase in the monitored area it occurred on. Therefore, when the location of an action is part of the desired classification criterion (for example, when the action of passing a staircase at one position and the action of passing a staircase at a different position should be classified as different actions), the clustering must take the location of the action into account.
For example, the feature vector may additionally contain information indicating the current position. Alternatively, the movement route analysis unit 104 may first cluster the positioning data by the positions they represent, and then, for each spatial cluster obtained in this way, compute the feature vectors of the positioning data it contains and cluster them. This embodiment assumes clustering that takes position into account in this way.
Fig. 13 is a flowchart showing the clustering process executed by the monitoring server 100 of the first embodiment of the present invention.
The processing shown in Fig. 13 is executed in step 1203 of Fig. 12A.
First, the movement route analysis unit 104 refers to the analysis information DB 114 and judges whether cluster information related to the feature vectors to be clustered exists (that is, whether these feature vectors have already been clustered) (step 1301). When the obtained feature vectors are clustered for the first time, step 1301 judges that no cluster information exists. On the other hand, when the movement route analysis process is executed again after the parameters of a clustering result have been adjusted (see Fig. 26), as described later, step 1301 judges that cluster information exists.
If step 1301 judges that no cluster information exists, the movement route analysis unit 104 randomly determines initial values for the cluster centers (step 1302). For example, the movement route analysis unit 104 may determine initial values for the number of cluster centers specified by the manager. Alternatively, it may decide the number of clusters using a known criterion for evaluating the appropriateness of a cluster count, such as the AIC (Akaike Information Criterion), so that the criterion becomes optimal.
Next, the movement route analysis unit 104 computes the distance between the point corresponding to each feature vector and each cluster center, clusters the feature vectors on the assumption that each feature vector belongs to the cluster whose center is nearest to it, and then sets the mean of the feature vectors belonging to each cluster as the new cluster center (step 1303).
Then the movement route analysis unit 104 judges whether the cluster centers changed in step 1303, that is, whether the new cluster centers determined in step 1303 differ from the previous ones (step 1304). If the cluster centers changed, the processing returns to step 1303, performing clustering with the new cluster centers and recomputing the centers from the result.
If the cluster centers did not change, the movement route analysis unit 104 computes feature vectors from the positioning data and assigns to each feature vector the ID of its nearest cluster (step 1305). If previously computed feature vectors remain available, they may be reused. The cluster IDs assigned in this way are used as the state labels.
If step 1301 judges that cluster information exists, the movement route analysis unit 104 skips steps 1302 to 1304 and executes step 1305.
This completes the clustering process.
Steps 1302 to 1304 constitute the algorithm conventionally known as the k-means method. The clustering process of this embodiment can be performed by known methods; the k-means method is one example, and other algorithms, such as estimation of a Gaussian mixture distribution by the EM (Expectation-Maximization) method, may also be applied.
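A minimal, self-contained sketch of steps 1302 to 1305 (plain k-means), assuming Python with NumPy; the fixed cluster count, seeding, and convergence test are assumptions for illustration.

```python
import numpy as np

def kmeans(features, k, rng=np.random.default_rng(0)):
    """Steps 1302-1305 in miniature: random initial centers, alternate
    nearest-center assignment and mean update until the centers stop
    moving, then return each vector's cluster ID (its state label)."""
    centers = features[rng.choice(len(features), k, replace=False)]   # step 1302
    while True:
        dists = np.linalg.norm(features[:, None, :] - centers[None], axis=2)
        labels = dists.argmin(axis=1)                                 # step 1303
        new = np.array([features[labels == j].mean(axis=0)
                        if np.any(labels == j) else centers[j] for j in range(k)])
        if np.allclose(new, centers):                                 # step 1304
            return labels, centers                                    # step 1305
        centers = new

# toy feature vectors: two well-separated groups of points
pts = np.vstack([np.random.default_rng(1).normal(0, 0.3, (20, 2)),
                 np.random.default_rng(2).normal(5, 0.3, (20, 2))])
labels, _ = kmeans(pts, k=2)
print(labels)
```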
Fig. 14 is an explanatory diagram of the state labels obtained by the monitoring server 100 of the first embodiment of the present invention.
Fig. 14 contains a layout 1401 of the monitored area and many movement routes 1402 displayed on the layout 1401.
Fig. 14 shows a plan view of the features in the monitored area as an example of the layout 1401, but any figure from which the arrangement of features can be grasped, such as a perspective view or a bird's-eye view, may be used. The layout 1401 shown in Fig. 14 displays, as features in the monitored area, a room 1411, a corridor 1412, a wall 1413 separating the room 1411 and the corridor 1412, a doorway 1414 provided in the wall 1413, and articles 1415 arranged there (for example, goods sold in a shop or materials used in a factory). Features other than these may also be displayed (for example, traffic lights or pedestrian crossings if the monitored area is outdoors).
Each movement route 1402 is displayed by plotting, on the layout 1401, the coordinate values contained in the positioning data for a mobile terminal 140. One movement route 1402 thus corresponds to the trajectory of one monitored person. However, when one monitored person passes through the monitored area repeatedly, that person's trajectory may be displayed as multiple movement routes 1402. Fig. 14 displays many movement routes 1402 corresponding to the trajectories of multiple monitored persons.
The states 1403A to 1403L, displayed as ellipses, correspond to the clusters obtained by the clustering. In other words, each of the states 1403A to 1403L corresponds to a monitored person's action classified by the clustering.
As explained with reference to Figs. 12B to 12D, the feature vectors contained in each cluster are computed from multiple positioning data including the computation-target positioning data. Therefore, the coordinate values represented by the computation-target positioning data of the feature vectors contained in each cluster can be plotted on the layout 1401. Each ellipse shown in Fig. 14 outlines the range of the coordinate values represented by the computation-target positioning data of the feature vectors contained in one cluster.
The states 1403A to 1403L are given the identifiers "a" to "l" respectively. These identifiers are the state labels, that is, the identifiers assigned in step 1305 of Fig. 13.
Figure 15 is an explanatory diagram of the statistical model used by the monitor server 100 of the first embodiment of the present invention.
Specifically, Figure 15 shows an example of the statistical model applied to the state label transition sequences in step 1102 of Figure 11. In the present embodiment, a mixture Markov model is applied as the statistical model, but other models (such as a hidden Markov model or a Bayesian network) may also be applied.
From the result of the clustering, the action route analysis unit 104 can determine from which state to which state the locator data on each action route 1402 transitions, in other words, from which state to which state the action of the monitored object person corresponding to each action route 1402 transitions. By determining such state transitions for many action routes, the action route analysis unit 104 can compute the transition probabilities between states.
The tendency of a monitored object person's actions is generally influenced by that person's characteristics, and this tendency can sometimes be observed as a tendency of state transitions. Here, a monitored object person's characteristic is, for example, the work the person is in charge of when the person is a staff member of a factory or a shop, or the person's preference for goods when the person is a customer of a shop. A certain intention of the person (for example, an intention to abandon or to steal goods) can also be such a characteristic. Thus, by classifying monitored object persons according to the tendency of their state transitions, a group of monitored object persons having similar characteristics can sometimes be identified.
Such a characteristic is in some cases associated with an attribute of the monitored object person. Here, an attribute is, for example, the person's department, assigned work, or post when the person is a staff member of a factory or a shop, or the person's age group or sex when the person is a customer of a shop. More specifically, there may be a difference in action tendency such that female customers frequently approach a particular sales counter of a shop while male customers hardly approach it. In such a case, the classification based on the tendency of state transitions described above also makes it possible to analyze how the attributes of monitored object persons are associated with their action tendencies.
The action route analysis unit 104 of the present embodiment can classify the state transitions of multiple monitored object persons into multiple patterns.
Figure 15 shows the state transition diagram of each pattern. The state transition diagram of one pattern displays, for each time step, the transitions between the states shown in Figure 14. For example, state 1403a-1 corresponds to the state 1403a (see Figure 14) at a certain time, and state 1403a-2 corresponds to the state 1403a at the next time. Similarly, state 1403b-1 corresponds to the state 1403b at a certain time and state 1403b-2 to the state 1403b at the next time; the same relationship holds between states 1403k-1 and 1403k-2 and between states 1403l-1 and 1403l-2. In this way, each of the states 1403a–1403l shown in Figure 14 is displayed per time step, the state transitions between them are displayed by arrows, and the probability of each state transition is computed.
In addition, a state 1502S represents the moment each monitored object person enters the monitored area, and a state 1502G represents the moment each monitored object person leaves the monitored area. Below, the states 1502S and 1502G are abbreviated as state S and state G, and likewise the states 1403a–1403l are abbreviated as state a through state l.
Figure 15 shows only the state transition diagram of pattern 1501a, but the other patterns (for example patterns 1501b and 1501c) can be displayed by the same kind of diagram; the values of the state transition probabilities, however, differ from pattern to pattern. The number of patterns is not limited to three (patterns 1501a–1501c) and may be set to any number (for example k). The number of patterns may, for example, also be specified by the manager.
The transition probabilities of each pattern follow a Markov model. Let the transition probability from state μ to state ν be P(μ, ν) = ω_{μν}. The probability of, for example, an action route that moves as state S → state a → state b → state c → state e → state i → state G is computed by the product P(S,a)·P(a,b)·P(b,c)·P(c,e)·P(e,i)·P(i,G). The value L computed in this way represents the degree to which the action route fits the pattern of the probabilistic model. In general, the logarithm of the probability, Σ log P, is used.
The model of the present embodiment is expressed as the weighted sum of k Markov models, with π being the weight given to each of them. The log-likelihood is therefore obtained as log Σ_k (π_k Π P_k).
The parameters representing the state transition probabilities are computed, for example, by the EM method. The action route analysis unit 104 first sets random values as the initial values of the state transition probabilities of each pattern, then estimates, from these values, which pattern the state transitions of each action route fit (E step). Next, the action route analysis unit 104 recomputes the parameters using the result of the E step (M step), and then performs the E step again using the recomputed parameters. The E step and the M step are repeated in this way until the parameters converge.
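The following sketch shows, in the same notation, how the fit of one action route to the mixture model can be evaluated and how the E step can assign a route to patterns; the array layout (log_pi[k] for the log-weights and log_P[k, μ, ν] for the per-pattern log transition probabilities) is an assumption made for this example:

```python
import numpy as np

def route_log_likelihood(route, log_pi, log_P):
    """log sum_k pi_k * prod_t P_k(s_t, s_{t+1}), evaluated in log space.
    route is a sequence of state indices, e.g. [S, a, b, c, e, i, G]."""
    per_pattern = log_pi + sum(log_P[:, u, v] for u, v in zip(route, route[1:]))
    m = per_pattern.max()  # log-sum-exp over the k patterns for stability
    return m + np.log(np.exp(per_pattern - m).sum())

def e_step(route, log_pi, log_P):
    """E step: posterior probability that each pattern generated the route."""
    per_pattern = log_pi + sum(log_P[:, u, v] for u, v in zip(route, route[1:]))
    w = np.exp(per_pattern - per_pattern.max())
    return w / w.sum()
```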
Figure 16 is an explanatory diagram of the state transition extraction processing performed by the monitor server 100 of the first embodiment of the present invention.
Specifically, Figure 16 shows an example of the state transitions extracted in step 1103 of Figure 11.
In the example of Figure 16, an arrow 1601 pointing from state 1403a to state 1403c is displayed. This indicates that a transition from state 1403a to state 1403c exists, i.e., that the transition probability between these states is greater than 0%. The same applies to the other arrows (for example, the arrows pointing from state 1403c to each of states 1403d, 1403e and 1403f). On the other hand, no arrow pointing from state 1403a to state 1403d is displayed. This indicates that, at least in the action routes shown in Figure 14, no transition from state 1403a to state 1403d exists, i.e., that the transition probability between these states is 0%.
The information representing the state transitions extracted in this way is stored in the analytical information DB 114. Its details are explained with reference to Figures 17 and 18.
Figure 17 is an explanatory diagram of the state transition model stored in the analytical information DB 114 of the first embodiment of the present invention.
The state transition probabilities of each pattern computed by the method shown in Figure 15 are stored as shown in Figure 17. Specifically, the state transition model 1700 includes a pattern ID 1701, a start state label 1702, an end state label 1703 and a probability 1704.
The pattern ID 1701 identifies a pattern of the statistical model. For example, the value of the pattern ID 1701 corresponds to "k" shown in Figure 15.
The start state label 1702 and the end state label 1703 are the labels (i.e., IDs) of the start point and the end point of a state transition, respectively. They correspond, for example, to "S", "G" and "a"–"l" shown in Figures 14–16.
The probability 1704 is the probability that the state transition determined by the pattern ID 1701, the start state label 1702 and the end state label 1703 occurs.
One set of the information shown in Figure 17 corresponds to one state transition, for example one arrow shown in Figure 15. The analytical information DB 114 stores a set of such information for each state transition of each pattern.
Figure 18 is an explanatory diagram of the cluster information stored in the analytical information DB 114 of the first embodiment of the present invention.
The cluster information 1800 shown in Figure 18 is information about the clusters obtained by the clustering. Specifically, the cluster information 1800 includes a cluster shape 1801 and a cluster ID 1802.
The cluster shape 1801 represents the center of each cluster.
The cluster ID 1802 is information identifying each cluster. As described above, one cluster corresponds in principle to one state, so the value of the cluster ID 1802 corresponds to a state label (for example "a"–"l" shown in Figures 14–16).
One set of the cluster information 1800 shown in Figure 18 corresponds to one state, for example one ellipse shown in Figure 14. The analytical information DB 114 stores a set of such information for each state. As shown in Figure 15, even when the statistical model is classified into multiple patterns, the center of the cluster corresponding to each state does not differ between patterns; therefore, the cluster information 1800 does not include a pattern ID.
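As a purely illustrative rendering of the two record types of Figures 17 and 18 (the field names are hypothetical; the patent fixes only the numbered items), the stored information could look like this:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class StateTransitionRecord:   # one set of the information of Figure 17
    pattern_id: int            # 1701: which pattern of the statistical model
    start_state: str           # 1702: start state label, e.g. "S" or "a"
    end_state: str             # 1703: end state label, e.g. "b" or "G"
    probability: float         # 1704: probability of this state transition

@dataclass
class ClusterRecord:           # one set of the cluster information of Figure 18
    center: Tuple[float, ...]  # 1801: cluster shape, i.e. the cluster center
    cluster_id: str            # 1802: cluster ID, doubling as the state label
```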
As explained in the present embodiment and the other embodiments, the adjustment of the analysis granularity in the present invention may, as a result, divide one cluster into several, or integrate several clusters so that they correspond to one state. The results of such adjustments are all reflected in the analytical information DB 114 shown in Figures 17 and 18. The information stored in the analytical information DB 114 is read out as needed and used by the action route analysis unit 104 for the action route analysis processing, and may also be displayed by the picture display device 120 as needed.
Figure 19 is an explanatory diagram of the output screen of the analysis condition prompting processing displayed by the picture display device 120 of the first embodiment of the present invention.
The analysis condition setting screen generating unit 106 displays the screen 1900 shown in Figure 19 on the picture display device 120 in step 332 of Figure 3. The screen 1900 includes a layout 1901, an adjust button 1903, a finish button 1904 and a mode selection box 1905.
The layout 1901 is the same as the layout 1401 shown in Figure 14. On the layout 1901, the same states 1403a–1403l as in Figure 14 and the arrows 1601 representing the state transitions are displayed.
The manager can select the pattern to display by operating the mode selection box 1905. In the example of Figure 19, "Mode A" is displayed. This mode is, for example, one of the multiple patterns 1501a–1501c shown in Figure 15. The example of Figure 16 shows a transition from state 1403e to state 1403i; however, if the probability of this transition occurring in Mode A is 0%, neither state 1403i nor the arrow pointing to it is displayed. The same applies to other states and the arrows pointing to them.
When the manager judges, with reference to the state transitions displayed on the screen 1900, that the analysis conditions are appropriate (step 323), the manager operates the finish button 1904, and the processing advances to step 326. When the manager judges that the analysis conditions are inappropriate, the manager operates the adjust button 1903, and the processing advances to step 324.
For example, when one of the states 1403a–1403l displayed on the screen 1900 appears too large to the manager (i.e., the corresponding cluster is too large), the manager can judge the analysis conditions to be inappropriate. More specifically, the manager can judge the analysis conditions to be inappropriate when, for example, only one state is displayed at a place whose actions the manager wants to analyze in detail, and the manager wants this state to be divided into several.
The operations of the adjust button 1903, the finish button 1904 and the mode selection box 1905 are operations of the input device 203 performed by the manager (for example, mouse clicks).
Figure 20 is a flowchart of the analysis condition setting screen prompting processing performed by the monitor server 100 of the first embodiment of the present invention.
In step 324 of Figure 3, the analysis condition setting screen generating unit 106 performs the processing shown in Figure 20.
First, the analysis condition setting screen generating unit 106 performs analysis condition acceptance screen prompting processing (step 2001). This processing is explained later with reference to Figure 21.
Next, the analysis condition setting screen generating unit 106 performs analysis condition adjustment screen prompting processing (step 2002). This processing is explained later with reference to Figure 22 and subsequent figures.
This completes the analysis condition setting screen prompting processing.
Figure 21 is an explanatory diagram of the analysis condition acceptance screen prompting processing performed by the monitor server 100 of the first embodiment of the present invention.
In step 2001 of Figure 20, the analysis condition setting screen generating unit 106 displays the screen 2100 shown in Figure 21 on the picture display device 120.
The screen 2100 includes a layout 2101, an analysis granularity setting section 2102 and a finish button 2104.
The layout 2101 is the same as the layout 1401 shown in Figure 14. On the layout 2101, the states 1403a–1403l and the many action routes 1402 are displayed as in Figure 14. In addition, an adjustment range 2111 and a range specification cursor 2112 are displayed on the layout 2101.
By operating the range specification cursor 2112 with the input device 203, the manager can specify the adjustment range 2111 so that it contains the clusters (i.e., at least one of the states 1403a–1403l) whose analysis granularity the manager wants to adjust. The clusters corresponding to the states 1403 contained in the adjustment range 2111 are thereby designated as objects of the analysis granularity adjustment.
For example, the manager can designate as the adjustment range 2111 a region of the monitored area in which the actions of monitored object persons are to be analyzed in particular detail (for example, the sales counter of specific goods in a shop).
The analysis granularity setting section 2102 includes a granularity setting knob 2103. The manager can specify the analysis granularity by operating the granularity setting knob 2103 with the input device 203. For example, to make the analysis granularity finer, the manager may move the granularity setting knob 2103 to the left. Specifying the granularity with a knob is only one example; other methods may be used, for example specifying the granularity by operating icons corresponding to instructions to make the granularity finer or coarser.
The manager operates the finish button 2104 when the specification of the adjustment range 2111 and of the analysis granularity is finished. Step 2001 of Figure 20, i.e., the specification of the clusters that are the adjustment objects and of their analysis granularity, thereby ends.
The adjustment to the specified analysis granularity (i.e., making the granularity finer or coarser) could simply be performed for all of the clusters contained in the adjustment range 2111 specified in this way; however, for at least one of these clusters, the manager can decide whether to adjust the granularity based on sensor information. The procedure of such a judgment and of the granularity adjustment is explained below.
Figure 22 is a flowchart of the analysis condition adjustment screen prompting processing performed by the monitor server 100 of the first embodiment of the present invention.
The processing shown in Figure 22 is performed in step 2002 of Figure 20.
First, the analysis condition setting screen generating unit 106 performs target pedestrian selection processing (step 2201). This processing is explained later with reference to Figure 23. Here, "pedestrian" means a monitored object person.
Next, the sensor/positioning integration unit 103 performs sensor/positioning matching processing (step 2202). This processing is explained later with reference to Figure 24.
Next, the analysis condition setting screen generating unit 106 performs sensor information prompt image generation processing (step 2203). This processing is explained later with reference to Figure 25.
This completes the analysis condition adjustment screen prompting processing.
Figure 23 is an explanatory diagram of the target pedestrian selection processing (step 2201) performed by the monitor server 100 of the first embodiment of the present invention.
First, the analysis condition setting screen generating unit 106 selects, from the clusters contained in the adjustment range 2111, the cluster most likely to contain multiple actions (in other words, the cluster whose contained actions can most likely be classified into multiple actions) as the candidate for the change object.
Specifically, for example, one or more clusters whose size exceeds a predetermined threshold, or only the largest cluster, may be selected from the clusters contained in the adjustment range 2111. The size of a cluster is determined, for example, by the radius of the cluster or by the number of feature vectors it contains. Here the radius of a cluster is, for example, the distance between the center of the cluster and the feature vector farthest from that center; alternatively, a value based on the deviation of the coordinate values represented by the feature vectors (for example a standard deviation) may be computed and used as the radius. A cluster with a larger radius contains feature vectors that differ more from one another. Based on the assumption that the larger the difference between two feature vectors, the higher the possibility that their corresponding actions differ, the cluster with the largest radius may be selected as the candidate for the change object. Alternatively, based on the assumption that a cluster containing more feature vectors is more likely to contain multiple actions, the cluster containing the largest number of feature vectors may be selected.
Next, the analysis condition setting screen generating unit 106 selects two of the multiple feature vectors contained in the selected cluster. For example, the two feature vectors farthest from each other in the selected cluster may be selected. Alternatively, the analysis condition setting screen generating unit 106 may further cluster the feature vectors contained in the selected cluster to generate two clusters, and select the two feature vectors closest to the respective centers. A sketch of this selection follows below.
An example of selecting two feature vectors is described here, but three or more feature vectors may also be selected; for example, the analysis condition setting screen generating unit 106 may cluster the selected cluster to generate three or more clusters.
The feature vectors selected in this way are computed in the manner shown in Figure 14, so the locator data corresponding to these feature vectors can be determined, and from the determined locator data it can be determined which monitored object person was at which position at which time (see Figure 6).
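A minimal sketch of this selection, assuming for this example only that each cluster is held as a dict with a "center" array and a "members" array of feature vectors:

```python
import numpy as np

def pick_change_candidate(clusters):
    """Choose the cluster most likely to mix several actions: here simply
    the one with the largest radius (center-to-farthest-member distance)."""
    def radius(c):
        return np.linalg.norm(c["members"] - c["center"], axis=1).max()
    return max(clusters, key=radius)

def farthest_pair(members):
    """Pick the indices of the two member feature vectors farthest apart."""
    d = np.linalg.norm(members[:, None, :] - members[None, :, :], axis=2)
    return np.unravel_index(d.argmax(), d.shape)
```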
Figure 24 is a flowchart of the sensor/positioning matching processing performed by the monitor server 100 of the first embodiment of the present invention.
The processing shown in Figure 24 is performed in step 2202 of Figure 22.
First, the sensor/positioning integration unit 103 obtains the detection area of each sensor 130 installed in the monitored area (step 2401). The detection area is the region that each sensor 130 can detect; specifically, it is determined from the installation position 1003 and the sensor parameters 1004 contained in the sensor parameters 1000. More specifically, for example, when the sensor 130 is a surveillance camera, the range shot by the surveillance camera is determined, and information representing the determined range is obtained as the detection area.
Next, the sensor/positioning integration unit 103 searches for the sensors 130 corresponding to the locator data determined in step 2201 of Figure 22 (step 2402). Specifically, based on the detection areas obtained in step 2401 and the time and position represented by the locator data determined in step 2201, the sensor/positioning integration unit 103 determines the sensor IDs of the sensors 130 that could detect, at the time represented by the locator data, the position represented by that locator data.
Next, the sensor/positioning integration unit 103 obtains from the sensor information DB 111 the sensor information acquired, at the time represented by the locator data, by the sensors 130 identified by the determined sensor IDs (step 2403). For example, when the determined sensor 130 is a surveillance camera, the sensor information obtained in step 2403 is the image data shot by that surveillance camera at the time represented by the locator data.
This completes the sensor/positioning matching processing.
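The matching of steps 2401–2403 can be pictured as follows; the circular detection area and the dictionary-style sensor information DB are simplifications assumed for this sketch:

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    sensor_id: str
    x: float
    y: float
    reach: float  # crude stand-in for the range derived from items 1003/1004

    def covers(self, px, py):
        """Steps 2401/2402: does the detection area contain position (px, py)?"""
        return (px - self.x) ** 2 + (py - self.y) ** 2 <= self.reach ** 2

def sensor_data_for(locator_time, px, py, sensors, sensor_info_db):
    """Steps 2402-2403: from every sensor whose detection area contains the
    locator position, fetch the data recorded at the locator time."""
    return [sensor_info_db[(s.sensor_id, locator_time)]
            for s in sensors if s.covers(px, py)]
```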
Figure 25 is an explanatory diagram of the sensor information prompt image displayed by the picture display device 120 of the first embodiment of the present invention.
In step 2203 of Figure 22, the analysis condition setting screen generating unit 106 causes the picture display device 120 to display the sensor information prompt image 2500. The sensor information prompt image 2500 includes a first sensor information display section 2501, a first pedestrian information display section 2502, a second sensor information display section 2503, a second pedestrian information display section 2504, a "Distinct" button 2505, an "Unclear" button 2506 and a "No distinction" button 2507.
As described above, two pieces of locator data are determined in step 2201 of Figure 22, and the sensor information corresponding to each piece of locator data is obtained in step 2202. The sensor information obtained in this way (in the example of Figure 25, the images shot by the surveillance cameras) is displayed in the first sensor information display section 2501 and the second sensor information display section 2503.
The first pedestrian information display section 2502 and the second pedestrian information display section 2504 display information about the monitored object persons corresponding to the sensor information displayed in the first sensor information display section 2501 and the second sensor information display section 2503, respectively. As described above, each piece of sensor information corresponds to locator data, and it can be determined to which monitored object person each piece of locator data relates; therefore, information about each monitored object person, such as the person's sex, the time the person entered the monitored area and the time the person has stayed in the monitored area, is displayed in the first pedestrian information display section 2502 and the second pedestrian information display section 2504.
To display the sex of a monitored object person, the monitor server 100 needs to hold information associating the identifier of each monitored object person (the pedestrian ID shown in Figure 6) with that person's sex. Similarly, when information associating each monitored object person with attributes such as age, sex, assigned work or post is held, these attributes can be displayed in the first pedestrian information display section 2502 and the second pedestrian information display section 2504.
The time each monitored object person entered the monitored area and the time the person has stayed in the monitored area can be determined from the acquisition times of the locator data contained in the action route corresponding to each monitored object person.
As described above, the first sensor information display section 2501 and the second sensor information display section 2503 display the sensor information corresponding to the two pieces of locator data determined in step 2201 of Figure 22, that is, images obtained by surveillance cameras shooting, at the acquisition time of each piece of locator data, the region containing the position represented by that locator data. The monitored object person corresponding to each piece of locator data is therefore highly likely to appear in these images, and by referring to them the manager is highly likely to be able to determine what kind of action each monitored object person was performing.
The manager therefore refers to the images displayed in the first sensor information display section 2501 and the second sensor information display section 2503, and judges whether the two actions corresponding to the two feature vectors determined in step 2201 are different actions (or whether they should be classified as the same action). The manager operates the "Distinct" button 2505 when judging that they should be distinguished, and the "No distinction" button 2507 when judging that they should not.
On the other hand, when it is difficult to judge from the displayed images whether the actions should be distinguished, the manager operates the "Unclear" button 2506. In this case, the analysis condition setting screen generating unit 106 performs step 2201 again and selects two feature vectors different from the previous ones. For example, the analysis condition setting screen generating unit 106 may select, from the feature vectors contained in the selected cluster, the pair whose distance is the second largest after the previously selected pair. Steps 2202 and 2203 are then performed again, and the sensor information corresponding to the newly selected feature vectors is displayed.
In general, when a monitored object person's actions are analyzed based only on feature vectors computed from locator data, the actions may not be classified in the way the manager wishes. However, as described above, by having the manager judge, with reference to the sensor data corresponding to the locator data, whether two actions should be classified separately, an action analysis appropriate to the manager's purpose can be performed.
Figure 25 shows an example in which images shot by surveillance cameras are displayed as the sensor information, but other sensor information may also be prompted. For example, when the sensor 130 is a microphone, sound may be reproduced in step 2203 (see the second embodiment). In this case, the manager can determine the action each monitored object person performed from the reproduced sound, and can judge on that basis whether to distinguish the two actions.
Alternatively, when the sensor 130 is a vending machine that records its sales history, the sales history may be displayed in the sensor information prompt image 2500. In this case, the manager can, for example, judge from the displayed sales history whether a monitored object person bought goods, and can judge on that basis whether to distinguish the two actions.
The manager may also decide whether to distinguish two actions without using sensor information as described above. For example, the monitor server 100 may prompt the manager with information representing, for each of the two pieces of locator data determined in step 2201, which monitored object person was where at what time. The manager can then directly ask each of the identified monitored object persons what kind of action the person performed at the determined time and position, and judge on that basis whether to distinguish the two actions.
Figure 26 is a flowchart of the analysis condition adjustment processing performed by the monitor server 100 of the first embodiment of the present invention.
The analysis condition adjustment processing is performed when the "Distinct" button 2505 of the sensor information prompt image 2500 is operated.
First, the analysis condition adjustment unit 105 computes action route analysis parameters (step 2601). Specifically, the analysis condition adjustment unit 105 divides the cluster selected in step 2201 of Figure 22, determines the center of each cluster after division, and gives these clusters new identifiers (i.e., state labels).
The division of the cluster can be performed by various methods; a sketch of the first method follows below. For example, when the two feature vectors farthest from each other in the cluster were selected in step 2201, the analysis condition adjustment unit 105 may divide the cluster into two new clusters corresponding to the two selected feature vectors, classifying each remaining feature vector in the cluster into the cluster corresponding to the nearer of the two selected feature vectors. In this case, the analysis condition adjustment unit 105 computes the centers of the two new clusters and gives them state labels.
Alternatively, the action route analysis unit 104 may perform clustering on this cluster alone to divide it further into two clusters, and the analysis condition adjustment unit 105 computes the centers of the clusters after division and gives them state labels. In this case, the action route analysis unit 104 may perform the clustering using the two feature vectors selected in step 2201 as the initial cluster centers.
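A sketch of the first division method (seeding with the two feature vectors chosen in step 2201 and assigning each remaining member to the nearer seed); the array representation is again an assumption of this example:

```python
import numpy as np

def split_cluster(members, i, j):
    """Step 2601: divide one cluster into two around seed vectors i and j,
    returning the two new cluster centers."""
    to_i = np.linalg.norm(members - members[i], axis=1)
    to_j = np.linalg.norm(members - members[j], axis=1)
    part_i, part_j = members[to_i <= to_j], members[to_i > to_j]
    return part_i.mean(axis=0), part_j.mean(axis=0)
```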
Next, the analysis condition adjustment unit 105 reflects the computed action route analysis parameters in the analytical information DB 114 (step 2602). Specifically, the analysis condition adjustment unit 105 stores the centers of the new clusters computed in step 2601 and the identifiers given to them in the analytical information DB 114 as cluster information 1800. At this time, the information about the cluster that was the division object (the cluster before division) is deleted from the analytical information DB 114.
Next, the analysis condition adjustment unit 105 requests the action route analysis unit 104 to perform the action route analysis processing again (step 2603). The action route analysis unit 104, having received this request, performs the action route analysis processing (Figure 11 etc.) based on the analytical information DB 114 updated in step 2602. In this case, since the cluster information 1800 updated as described above is already stored in the analytical information DB 114, the determination in step 1301 of the clustering processing (Figure 13) is that cluster information exists. The execution of the clustering (steps 1302–1304) is therefore omitted, and the processing from step 1102 onward refers to the analytical information DB 114 updated in step 2602.
This completes the analysis condition adjustment processing.
With reference to the result of the re-executed action route analysis (Figure 19), the manager can judge whether this result is sufficient, specifically, whether the actions the manager wants to distinguish now correspond to different states (clusters). For this judgment, the screen shown in Figure 25 may again be referred to according to the procedure described above. When the manager judges that the result of the re-execution is sufficient (i.e., the actions need not be classified further), the manager operates the finish button 1904 or the "No distinction" button 2507, and the whole processing shown in Figure 3 ends.
According to the first embodiment of the present invention described above, when the actions of monitored object persons are analyzed based on their action routes, the actions are not only classified automatically by clustering; the analysis granularity (i.e., the fineness of the classification of actions) can also be adjusted according to the manager's specification. Necessary information can thereby be extracted from the positional information of the monitored object persons according to the manager's analysis purpose.
<Second Embodiment>
Next, the second embodiment of the present invention is explained. Explanations of the parts identical to the first embodiment are omitted, and only the differences are explained below.
First, an outline of the second embodiment is given.
In the first embodiment, as described above, feature vectors are computed from the locator data, the clusters (i.e., the "states" corresponding to the classified actions) are determined by clustering the feature vectors, and the transition probabilities between states are computed. A cluster can also be divided further according to the manager's specification.
In practice, however, it is sometimes undesirable to treat a specific sequence of state transitions as separate states. As an example, consider the action of a monitored object person in an outdoor monitored area passing a pedestrian crossing that has a traffic light. From the locator data corresponding to this action, the clustering described above may extract: a state a corresponding to the action of moving to one end of the crossing; a state b corresponding to the action of waiting there in a stopped state (before the light turns green); and a state c corresponding to the action of moving across the crossing to the other end. Sometimes, however, one does not need to extract a state for each action of such a series, but instead wants to extract a single state A corresponding to the overall action of "crossing the pedestrian crossing".
In the second embodiment, a single state is inferred from the multiple states extracted by clustering, based on information (namely, a state judgment dictionary) that associates a sequence of state labels corresponding to such a series of actions (for example "abc") with a single corresponding state label (for example "A").
In the following explanation, in order to distinguish states such as "a", "b" and "c" obtained as the result of clustering feature vectors, together with their labels, from a state such as "A" assigned to a sequence of them, together with its label, the former are for convenience written as "movement states" and "movement state labels", and the latter as "states" and "state labels". The "movement state" and "movement state label" correspond to the "state" and "state label" of the first embodiment. A "movement state label" to which no "state" has been assigned is treated as a "state label" in the application of the statistical model to the state label transition sequences described later (step 1102 of Figure 27) and in the extraction of information from the statistical model (step 1103 of Figure 27).
The details of the second embodiment are explained below with reference to the drawings.
Figure 27 is a flowchart of the action route analysis processing performed by the monitor server 100 of the second embodiment of the present invention.
Like the action route analysis processing (Figure 11) of the first embodiment, the processing shown in Figure 27 is performed in step 321 of Figure 3.
First, the action route analysis unit 104 computes the features of the locator data and generates movement state labels (step 1101). This step is the same as in the first embodiment (see Figure 12A etc.).
Next, the action route analysis unit 104 compares the movement state labels generated in step 1101 with the state judgment dictionary, and infers state labels from the result (step 2701).
In the analytical information DB 114 of the present embodiment, a state judgment dictionary 2800 and state judgment setting information 2900 are stored in addition to the information shown in Figures 17 and 18. The details of this information are explained later with reference to Figures 28 and 29, and the state label inference processing based on this information is explained later with reference to Figures 30A–32.
Next, the action route analysis unit 104 applies the state label transition sequences, based on the state labels inferred in step 2701, to the statistical model (step 1102), and then extracts information from the statistical model (step 1103). These steps are the same as in the first embodiment (see Figure 15).
This completes the action route analysis processing of the second embodiment.
Figure 28 is an explanatory diagram of the state judgment dictionary stored in the analytical information DB 114 of the second embodiment of the present invention.
The state judgment dictionary 2800 includes a state label 2801, a movement state label sequence 2802, a spatial condition 2803 and a condition granularity 2804.
The state label 2801 is information uniquely identifying a state. It is the state label to be assigned to a sequence of movement state labels, i.e., the state label inferred in step 2701. In the pedestrian crossing example above, "A" corresponds to the state label 2801.
The movement state label sequence 2802 is the sequence of labels of the movement states corresponding to the state identified by the state label 2801. In the pedestrian crossing example above, "abc" corresponds to the movement state label sequence 2802. More detailed examples are described later.
The spatial condition 2803 is a character string specifying conditions on the surrounding ground features. The conditions on the surrounding features include, for example, information representing the kind of ground feature near the action route compared with the movement state label sequence of this row of the state judgment dictionary 2800, the distance from that feature to the action route, and the direction from that feature to the action route. This character string may be written in any grammar; XML (Extensible Markup Language) is one example. In the pedestrian crossing example above, a character string representing that the feature is a traffic light, the distance from the traffic light, the direction away from the traffic light, and so on corresponds to the spatial condition 2803.
The condition granularity 2804 is the level of granularity (i.e., fineness) at which the condition is expressed. For example, as described above, one may wish to extract the state A corresponding to the action of "crossing the pedestrian crossing"; but one may also wish to extract states at a larger (coarser) granularity, for example a state B corresponding to the action of moving from one place to some other place (passing the pedestrian crossing on the way). In that case, the sequence of movement state labels corresponding to this action of "moving from one place to another" (a sequence containing the above "abc" but longer than it) is registered as the movement state label sequence 2802, and a value representing a granularity larger than the condition granularity 2804 corresponding to "crossing the pedestrian crossing" is registered as the corresponding condition granularity 2804.
The size of the granularity, however, does not necessarily depend on the length of the corresponding sequence of movement state labels. For example, the action of "crossing the pedestrian crossing" can sometimes be classified further into an action in which the monitored object person stops at one end of the crossing before moving to the other end, and an action of moving across without stopping. In this case, the granularity corresponding to the action "crossing the pedestrian crossing" is larger than the granularities of both the action "crossing after stopping temporarily" and the action "crossing without stopping".
The value of the condition granularity 2804 may be set manually by the manager, but may also be set automatically by the monitor server 100. When it is set automatically, for example, a value representing a finer granularity may be set the smaller the ground feature specified by the spatial condition 2803 is, and the shorter the movement state label sequence 2802 is.
As described above, for example, when a range of distances and directions relative to the ground feature "traffic light" is determined as the spatial condition 2803, and "A" and "abc" are registered as the corresponding state label 2801 and movement state label sequence 2802, then if the sequence "abc" of movement state labels is extracted from an action route whose distance and direction from a traffic light lie within the determined range, that action route is judged to correspond to the action "crossing the pedestrian crossing" of the monitored object person.
Multiple sets of the state label 2801 through the condition granularity 2804 described above can be registered in the state judgment dictionary 2800. For example, the ground feature "bookshelf" may be registered as another spatial condition 2803, together with a predetermined movement state label sequence 2802 corresponding to it, a state label 2801 corresponding to the action "taking a book from the bookshelf", and the value of the corresponding condition granularity 2804. In the following explanation, each such set is referred to as a dictionary entry.
Figure 29 is an explanatory diagram of the state judgment setting information stored in the analytical information DB 114 of the second embodiment of the present invention.
The state judgment setting information 2900 includes a spatial condition 2901 and a condition granularity 2902, which are the same as the spatial condition 2803 and the condition granularity 2804 of the state judgment dictionary 2800, respectively. The state judgment setting information 2900, however, is empty in the initial state; when a condition granularity is set, the result is stored in the state judgment setting information 2900.
Next, the processing performed in step 2701 of Figure 27 is explained. In step 2701, the state label inference processing based on the state judgment dictionary is first performed as shown in Figures 30A–30C.
Figure 30A is a flowchart of the state label inference processing based on the state judgment dictionary performed by the monitor server 100 of the second embodiment of the present invention.
Figure 30B is an explanatory diagram of the search processing for entries of the state judgment dictionary in the second embodiment of the present invention.
Figure 30C is an explanatory diagram of the assignment of state labels in the second embodiment of the present invention.
First, the action route analysis unit 104 searches the state judgment dictionary 2800 for dictionary entries whose spatial condition 2803 is satisfied by the arrangement of the ground features around the action route corresponding to the sequence of movement state labels generated in step 1101 (step 3001). For example, when the action route passes near a bookshelf, near the center of a room, and then near some other ground feature, it is judged whether the positional relationship between these features and the action route satisfies the spatial condition 2803 of each dictionary entry registered in the state judgment dictionary 2800, and the entries judged to be satisfied are obtained as the search result (see Figure 30B).
Next, the action route analysis unit 104 compares the movement state label sequence 2802 of each dictionary entry retrieved in step 3001 with the sequence of movement state labels generated in step 1101, and when they are similar, assigns the state label 2801 of that dictionary entry as the state label corresponding to the sequence of movement state labels (step 3002).
Whether the sequences are similar can be compared and judged by a known method. For example, the degree of agreement between an interval cut out of the sequence of movement state labels and the movement state label sequence 2802 may be computed, and if this degree of agreement is higher than a predetermined threshold, the state label 2801 corresponding to that movement state label sequence 2802 may be assigned to the interval (see Figure 30C). Such a comparison may be performed for all intervals cut out of the sequence of movement state labels, and the interval with the highest degree of agreement selected. The degree of agreement between two intervals is computed, for example, from the number of movement state labels that agree between them.
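Since the patent leaves the similarity measure to "a known method", the following stand-in uses simple positionwise agreement; the dictionary is assumed, for this example only, to be a list of dicts with "state" and "labels" keys:

```python
def agreement(interval, pattern):
    """Fraction of positions at which an interval of movement state labels
    (e.g. "abbbabbbaa") and a dictionary sequence 2802 carry the same label."""
    hits = sum(x == y for x, y in zip(interval, pattern))
    return hits / max(len(interval), len(pattern))

def infer_state_label(interval, dictionary, threshold=0.8):
    """Step 3002: assign the state label 2801 of the best-matching entry,
    provided the degree of agreement exceeds the threshold."""
    entry = max(dictionary, key=lambda e: agreement(interval, e["labels"]))
    if agreement(interval, entry["labels"]) > threshold:
        return entry["state"]
    return None
```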
When a granularity is specified in the condition granularity 2902 of the state judgment setting information 2900, the dictionary entries corresponding to the specified granularity are retrieved in step 3001. In the initial state, however, no granularity is specified as the condition granularity 2902; in this case, the dictionary entries with the coarsest granularity are retrieved. Consequently, the granularity of the state labels inferred by the state label inference processing of Figures 30A–30C may be larger than the manager wishes. The action route analysis unit 104 therefore performs processing that lets the manager decide whether to infer state labels of a finer granularity. This is explained with reference to Figures 31A–32.
Figure 31A is a flowchart of the analysis parameter adjustment candidate selection processing performed by the monitor server 100 of the second embodiment of the present invention.
Figure 31B is an explanatory diagram of the search processing for intervals to which the same state label has been assigned in the second embodiment of the present invention.
Figure 31C is an explanatory diagram of the determination processing of states of finer granularity in the second embodiment of the present invention.
Figure 31D is an explanatory diagram of the selection processing of the states with a high degree of agreement in the second embodiment of the present invention.
After the state label inference processing shown in Figures 30A–30C has been performed for many action routes, the analysis parameter adjustment candidate selection processing is performed.
First, in the analysis parameter adjustment candidate selection processing, the action route analysis unit 104 selects multiple intervals of sequences of movement state labels to which the same state label has been assigned (step 3101). For example, when the state label "A" has been assigned to the interval "abbbabbbaa" of the sequence of movement state labels (i.e., the movement state transition sequence) of a certain action route, and the state label "A" has also been assigned to another interval "abbabbbbaa", these intervals may be selected (see Figure 31B).
Next, for each interval selected in step 3101, the action route analysis unit 104 judges whether the interval agrees with a movement state label sequence 2802 corresponding to a condition granularity 2804 finer than the condition granularity applied in the previous state label inference processing (more precisely, whether the degree of agreement is higher than a predetermined threshold) (step 3102). In the example of Figure 31C, the movement state label sequence 2802 corresponding to the state label "C", whose condition granularity 2804 is finer than that of the state label "A", is judged to agree with the interval "abbbabbbaa", and the movement state label sequence 2802 corresponding to the state label "D" is judged to agree with the interval "abbabbbbaa".
The action route analysis unit 104 performs the same processing for, for example, the multiple intervals to which the state label "A" was assigned by the state label inference processing. Suppose the result is that several of these intervals agree with the movement state label sequence 2802 corresponding to the state label "C", several others agree with the one corresponding to the state label "D", and several more agree with the one corresponding to the state label "E". In this case, the action route analysis unit 104 selects two state labels (for example "C" and "D") in descending order of the number of intervals judged to agree, and further selects, for each of them, the interval with the highest degree of agreement with the corresponding movement state label sequence 2802 (step 3103, Figure 31D).
Next, the monitor server 100 prompts the manager with the sensor information corresponding to the selected intervals. This prompting processing is performed in the same way as in the first embodiment (see Figure 24). An example of the sensor information prompted in this way is explained with reference to Figure 32.
Figure 32 is an explanatory diagram of the sensor information prompt image displayed by the picture display device 120 of the second embodiment of the present invention.
The sensor information prompt image 3200 shown in Figure 32 includes a first sensor information display section 3201, a first pedestrian information display section 2502, a second sensor information display section 3203, a second pedestrian information display section 2504, a "Distinct" button 2505, an "Unclear" button 2506 and a "No distinction" button 2507. Of these, the first pedestrian information display section 2502, the second pedestrian information display section 2504, the "Distinct" button 2505, the "Unclear" button 2506 and the "No distinction" button 2507 are the same as those explained in the first embodiment, so their explanation is omitted.
The first sensor information display section 3201 and the second sensor information display section 3203 are the same as the first sensor information display section 2501 and the second sensor information display section 2503 of the first embodiment, except that they include a first sound reproduction button 3202 and a second sound reproduction button 3204, respectively. The first sound reproduction button 3202 and the second sound reproduction button 3204 are used when microphones are installed as the sensors 130. When the manager operates the first sound reproduction button 3202 or the second sound reproduction button 3204, the corresponding recorded sound (for example, the sound recorded at the time and position corresponding to the selected state label "C" or "D" above) is reproduced.
Referring to the prompted images or sounds, the manager decides whether, for example, to split the state A into the states C, D and so on. When the manager judges that they should be distinguished and operates the "Distinct" button 2505, the action route analysis unit 104 assigns state labels of the finer granularity according to the result of step 3102; for example, the state labels "C", "D" and "E" are assigned in place of the state label "A". In this case, the value of the condition granularity 2804 corresponding to the state label "C" etc. is registered as the condition granularity 2902 of the state judgment setting information 2900.
Sound may also be reproduced by the same method in the first embodiment.
As a result of the above processing, for example, when it is decided to distinguish and the state label "C" has been assigned to the sequence "abbbabbbaa" of movement state labels, the movement states a and b are integrated into a new state C. That is, a new cluster containing all the feature vectors of the cluster corresponding to the movement state a and all the feature vectors of the cluster corresponding to the movement state b is generated, and "C" is assigned as the state label corresponding to this cluster. The center of this cluster, the transition probabilities to and from the surrounding clusters, and so on are computed and stored in the analytical information DB 114 (see Figures 17 and 18).
According to the second embodiment of the present invention described above, a state of larger granularity corresponding to a specific sequence of movement states can be extracted automatically, and the manager can adjust this granularity to an appropriate value that is not excessive.
<Third Embodiment>
Next, the third embodiment of the present invention is explained. Explanations of the parts identical to the first or second embodiment are omitted, and only the differences are explained below.
In the first and second embodiments, the granularity for analyzing the action routes of multiple monitored object persons is adjusted. In the third embodiment, by contrast, the granularity for analyzing the action route of a single monitored object person is adjusted.
Figure 33A is a flowchart of the analysis parameter adjustment candidate selection processing performed by the monitor server 100 of the third embodiment of the present invention.
Figure 33B is an explanatory diagram of the determination processing of states of finer granularity in the third embodiment of the present invention.
When the state label (for example "A") corresponding to an interval (for example "abbbabbbaa") of a sequence of movement state labels has been determined by the same method as in the second embodiment, the action route analysis unit 104 of the third embodiment determines states of finer granularity that agree with this interval (step 3301). For example, suppose the movement state label sequences "ab", "bba" and "bbbaa" corresponding to the state labels "F", "G" and "H" are registered in the state judgment dictionary 2800; then the first two labels of the above sequence "abbbabbbaa" correspond to the state label "F", the next three to the state label "G", and the remaining five to the state label "H" (see Figure 33B).
For example, where state A corresponds to the action of "crossing the pedestrian crossing", state F may correspond to the action of "moving to one end of the crossing", state G to the action of "waiting in a stopped state until the light turns green", and state H to the action of "moving across the crossing to the other end".
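A greedy sketch of step 3301 under the dictionary assumed above ("F" → "ab", "G" → "bba", "H" → "bbbaa"); a real implementation would score partial matches rather than require exact prefixes:

```python
def segment_interval(sequence, finer_entries):
    """Cover an interval such as "abbbabbbaa" with finer dictionary
    sequences, yielding e.g. [("F", "ab"), ("G", "bba"), ("H", "bbbaa")]."""
    result, pos = [], 0
    while pos < len(sequence):
        for state, pattern in finer_entries:
            if sequence.startswith(pattern, pos):
                result.append((state, pattern))
                pos += len(pattern)
                break
        else:
            pos += 1  # no entry matches here; skip one movement state label
    return result

print(segment_interval("abbbabbbaa", [("F", "ab"), ("G", "bba"), ("H", "bbbaa")]))
```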
In this case, the action route analysis unit 104 selects two of the determined state labels (step 3302). In the example shown in Figure 33B, the agreement rate of every state is 100% (complete agreement); when the agreement rates differ, two states may be selected in descending order of agreement rate. For the two states selected in this way, sensor information is prompted to the manager by the same procedure as in the second embodiment. Referring to the prompted sensor information, the manager can decide whether to make the granularity finer.
According to the third embodiment of the present invention described above, the manager can adjust the granularity at which states are extracted from the action route of a single monitored object person. It can thereby be determined how finely to divide, for analysis, an action that one person performs continuously.
< the 4th execution mode >
Next, the fourth embodiment of the present invention is described. For the fourth embodiment, explanation of the parts identical to the first to third embodiments is omitted, and only the differences are described below.
In the first embodiment, a candidate for division is selected from the clusters (i.e., states) obtained by clustering, and when the administrator instructs division, that cluster is divided into two. In the fourth embodiment, by contrast, two clusters that are unlikely to contribute to pattern classification are merged.
For example, if every monitored person who performs a certain action at a certain place necessarily performs a certain other action afterwards, distinguishing those two actions contributes nothing to the classification of patterns. The fourth embodiment extracts such actions and merges them.
Figure 34 is an explanatory diagram of the state judgement setting information stored in the analysis information DB 114 of the fourth embodiment of the present invention.
The state judgement setting information shown in Figure 34 includes merge-target state grades 3401, which is an array of the state grades selected as targets of merging.
Next, the target pedestrian selection processing of this embodiment is described. The target pedestrian selection processing of this embodiment may be executed in step 2201 of Figure 22 in place of (or together with) the target pedestrian selection processing of the first embodiment.
The analysis condition setting screen generation unit 106 refers to the result obtained by applying the generated state-grade transition sequence to the statistical model. For example, in the statistical model shown in Figure 15, when the values of ω, which represent the probability of a specific state transition in each pattern, differ little between the patterns, when these ω are roughly 1, and when the mean positions of the states before and after this specific transition are close to each other, there is a high likelihood that distinguishing these states contributes nothing to the classification of patterns.
More specifically, the analysis condition setting screen generation unit 106 obtains the values ω_μν^(k) for a specific μ and ν over all k. When the difference (fluctuation) among these ω_μν^(k) values is at or below a predetermined threshold, the ω_μν^(k) values themselves are at or above a predetermined threshold, and the distance between the mean positions represented by the positioning data corresponding to the states before and after the state transition corresponding to these ω_μν^(k) is at or below a predetermined threshold, the monitored person closest to the centers of the clusters corresponding to these states is selected as the target pedestrian.
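As a rough illustration of this test, the following Python sketch checks the three conditions for one transition from a state μ to a state ν. The data shapes, names, and threshold values are assumptions made for illustration, not the patent's implementation.

    # Minimal sketch of the merge-candidate test; names and thresholds are assumed.
    import math

    def is_merge_candidate(omega, mean_pos_mu, mean_pos_nu,
                           max_fluctuation=0.05, min_value=0.9, max_distance=2.0):
        values = list(omega.values())            # omega_munu^(k) over all patterns k
        fluctuation = max(values) - min(values)  # variation among the patterns
        distance = math.dist(mean_pos_mu, mean_pos_nu)
        return (fluctuation <= max_fluctuation   # nearly identical in every pattern
                and min(values) >= min_value     # the transition is nearly certain
                and distance <= max_distance)    # the two states lie close together

    # The transition occurs with probability ~1 in all three patterns and the two
    # states are near each other, so distinguishing them adds little.
    omega = {0: 0.97, 1: 0.98, 2: 0.96}
    print(is_merge_candidate(omega, (10.0, 5.0), (11.0, 5.5)))  # True

When a transition passes this test, the monitored person whose positioning data lie closest to the centers of the two clusters is the one whose sensor information is presented to the administrator, as described next.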
Thereafter, for the selected target pedestrian, sensor information is presented in the same manner as in the first embodiment (see Figure 25), and when the administrator selects the "do not distinguish" button 2507, these states are merged into a single state.
According to the fourth embodiment of the present invention described above, two clusters that are unlikely to contribute to pattern classification even if distinguished are merged, so that the state transitions can be organized.

Claims (16)

1. A monitoring apparatus for monitoring actions of a plurality of monitored persons in a monitored area, characterized in that
the monitoring apparatus comprises a processor and a storage device coupled to the processor,
the storage device stores positioning data representing positions of mobile terminals carried by the plurality of monitored persons,
the processor classifies the actions of the monitored persons based on the positioning data,
the processor selects an action from the plurality of classified actions as a candidate for change,
the processor extracts a plurality of pieces of positioning data corresponding to the action selected as the candidate,
the processor outputs information related to the extracted plurality of pieces of positioning data, and
when an instruction to change the classification of the action selected as the candidate is input, the processor changes the classification of the action selected as the candidate.
2. The monitoring apparatus according to claim 1, characterized in that
the storage device stores first information including information obtained by a plurality of sensors installed in the monitored area and information representing times at which the information was obtained,
the storage device stores second information for identifying the area in which each of the sensors can obtain the first information, and
the processor outputs, based on the first information and the second information, the information obtained at the time of each piece of the extracted plurality of pieces of positioning data by the sensor whose area includes the position represented by that piece of positioning data.
3. The monitoring apparatus according to claim 2, characterized in that
the processor calculates feature quantities of the positioning data related to each of the monitored persons,
the processor generates, by clustering the feature quantities, a plurality of clusters each corresponding to one of the classified actions, and
the processor selects the candidate for change based on the sizes of the clusters.
4. The monitoring apparatus according to claim 3, characterized in that
the processor changes the classification of the action selected as the candidate by dividing the plurality of feature quantities included in the cluster corresponding to the action selected as the candidate for change into a plurality of clusters each including a feature quantity corresponding to one of the extracted pieces of positioning data.
5. The monitoring apparatus according to claim 4, characterized in that
the processor generates a plurality of divided clusters by performing clustering on only the plurality of feature quantities included in the cluster corresponding to the action selected as the candidate,
the processor extracts, as the plurality of pieces of positioning data, the positioning data corresponding to the feature quantities closest to the centers of the divided clusters, and
when an instruction to change the classification of the action selected as the candidate is input, the processor changes the classification of the action selected as the candidate by dividing the cluster corresponding to the action selected as the candidate into the plurality of clusters generated by the clustering performed on only that cluster.
6. The monitoring apparatus according to claim 4, characterized in that
the processor extracts the two most distant of the plurality of feature quantities included in the cluster corresponding to the action selected as the candidate, and extracts the positioning data corresponding to the two extracted feature quantities as the plurality of pieces of positioning data, and
when an instruction to change the classification of the action selected as the candidate is input, the processor divides the feature quantities included in the cluster corresponding to the action selected as the candidate so that each is included in a new cluster corresponding to the nearer of the two extracted feature quantities.
7. The monitoring apparatus according to claim 3, characterized in that
the processor judges a cluster to be larger the more feature quantities it includes, and
the processor selects the largest of the generated plurality of clusters as the candidate for change.
8. The monitoring apparatus according to claim 3, characterized in that
the processor judges a cluster to be larger the larger its radius is, and
the processor selects the largest of the generated plurality of clusters as the candidate for change.
9. The monitoring apparatus according to claim 2, characterized in that
the plurality of sensors are surveillance cameras that capture images of the areas or microphones that record sounds in the areas, and
the first information includes the images or sounds.
10. The monitoring apparatus according to claim 1, characterized in that
the storage device stores an action dictionary that associates a plurality of the actions with a single action encompassing the plurality of actions, and
when the degree of concordance between a plurality of actions classified based on the positioning data and the plurality of actions stored in the action dictionary is higher than a predetermined threshold, the processor changes the classification of the actions by replacing the plurality of classified actions with the single action associated with the plurality of actions stored in the action dictionary.
11. The monitoring apparatus according to claim 10, characterized in that
the action dictionary further stores information representing the granularity of each action encompassed by the plurality of actions, and
when the degree of concordance between a plurality of actions classified based on the positioning data and a plurality of actions corresponding to a first granularity stored in the action dictionary is higher than a predetermined threshold, the processor calculates the degree of concordance between a part of the plurality of classified actions and a plurality of actions corresponding to a second granularity finer than the first granularity, and, when the calculated degree of concordance is higher than a predetermined threshold, changes the classification of the actions by replacing the part of the plurality of classified actions with the action corresponding to the plurality of actions corresponding to the second granularity.
12. The monitoring apparatus according to claim 3, characterized in that
the processor calculates transition probabilities between the plurality of actions classified based on the positioning data,
the processor classifies the transition probabilities into a plurality of patterns according to a predetermined statistical model, and
when the transition probability between two specific actions in each of the patterns is larger than a predetermined threshold and the fluctuation of that transition probability among the patterns is smaller than a predetermined threshold, the processor changes the classification of the actions by combining the two actions into a single action.
13. A method by which a monitoring apparatus monitors actions of a plurality of monitored persons in a monitored area, characterized in that
the monitoring apparatus comprises a processor and a storage device coupled to the processor, and the monitoring apparatus stores positioning data representing positions of mobile terminals carried by the plurality of monitored persons,
the method comprising the following steps:
a first step in which the monitoring apparatus classifies the actions of the monitored persons based on the positioning data;
a second step in which the monitoring apparatus selects an action from the plurality of classified actions as a candidate for change, extracts a plurality of pieces of positioning data corresponding to the action selected as the candidate, and outputs information related to the extracted plurality of pieces of positioning data; and
a third step in which, when an instruction to change the classification of the action selected as the candidate is input, the monitoring apparatus changes the classification of the action selected as the candidate.
14. The method according to claim 13, characterized in that
the storage device stores first information including information obtained by a plurality of sensors installed in the monitored area and information representing times at which the information was obtained, and stores second information for identifying the area in which each of the sensors can obtain the first information, and
in the second step, the monitoring apparatus outputs, based on the first information and the second information, the information obtained at the time of each piece of the extracted plurality of pieces of positioning data by the sensor whose area includes the position represented by that piece of positioning data.
15. The method according to claim 14, characterized in that
in the first step, the monitoring apparatus calculates feature quantities of the positioning data related to each of the monitored persons and generates, by clustering the feature quantities, a plurality of clusters each corresponding to one of the classified actions, and
in the second step, the monitoring apparatus selects the candidate for change based on the sizes of the clusters.
16. The method according to claim 15, characterized in that
in the third step, the monitoring apparatus changes the classification of the action selected as the candidate by dividing the plurality of feature quantities included in the cluster corresponding to the action selected as the candidate for change into a plurality of clusters each including a feature quantity corresponding to one of the extracted pieces of positioning data.
CN201110397969.3A 2010-12-02 2011-12-02 Apparatus and method for monitoring motion of monitored objects Expired - Fee Related CN102572390B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010269032A JP5495235B2 (en) 2010-12-02 2010-12-02 Apparatus and method for monitoring the behavior of a monitored person
JP2010-269032 2010-12-02

Publications (2)

Publication Number Publication Date
CN102572390A CN102572390A (en) 2012-07-11
CN102572390B true CN102572390B (en) 2014-10-29

Family

Family ID: 46416707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110397969.3A Expired - Fee Related CN102572390B (en) 2010-12-02 2011-12-02 Apparatus and method for monitoring motion of monitored objects

Country Status (2)

Country Link
JP (1) JP5495235B2 (en)
CN (1) CN102572390B (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6103896B2 (en) * 2012-11-22 2017-03-29 株式会社アイリッジ Information providing apparatus, information output apparatus, and control program for information output apparatus
JP6150516B2 (en) * 2012-12-21 2017-06-21 アズビル株式会社 Facility management system, portable terminal, facility management apparatus, and facility management method
US9679252B2 (en) * 2013-03-15 2017-06-13 Qualcomm Incorporated Application-controlled granularity for power-efficient classification
JP2014182621A (en) * 2013-03-19 2014-09-29 Fuji Xerox Co Ltd Portable terminal equipment, portable terminal program, service management server, service management program and service providing system
GB2516865A (en) * 2013-08-02 2015-02-11 Nokia Corp Method, apparatus and computer program product for activity recognition
JP5877825B2 (en) * 2013-11-25 2016-03-08 ヤフー株式会社 Data processing apparatus and data processing method
JP5842255B2 (en) * 2013-12-12 2016-01-13 国立大学法人東京工業大学 Apparatus and method for generating logic circuit from logic circuit description in programming language
JP6321388B2 (en) * 2014-01-31 2018-05-09 株式会社野村総合研究所 Information analysis system
JP5679086B1 (en) * 2014-10-07 2015-03-04 富士ゼロックス株式会社 Information processing apparatus and information processing program
JP2017068335A (en) * 2015-09-28 2017-04-06 ルネサスエレクトロニクス株式会社 Data processing device and on-vehicle communication device
CN107438171A (en) * 2016-05-28 2017-12-05 深圳富泰宏精密工业有限公司 Monitoring system and method
JP6250852B1 (en) * 2017-03-16 2017-12-20 ヤフー株式会社 Determination program, determination apparatus, and determination method
JP6621962B2 (en) * 2017-03-21 2019-12-18 三菱電機株式会社 Monitoring screen data generation device, monitoring screen data generation method, and monitoring screen data generation program
JP7060014B2 (en) 2017-04-21 2022-04-26 ソニーグループ株式会社 Information processing equipment, information processing methods and programs
JP6560321B2 (en) * 2017-11-15 2019-08-14 ヤフー株式会社 Determination program, determination apparatus, and determination method
WO2020003951A1 (en) 2018-06-26 2020-01-02 コニカミノルタ株式会社 Program executed by computer, information processing device, and method executed by computer
JP7052604B2 (en) * 2018-07-05 2022-04-12 富士通株式会社 Business estimation method, information processing device, and business estimation program
JP7329825B2 (en) * 2018-07-25 2023-08-21 公立大学法人岩手県立大学 Information provision system, information provision method, program
JP2020119164A (en) * 2019-01-23 2020-08-06 オムロン株式会社 Motion analysis device, motion analysis method, motion analysis program and motion analysis system
JP6696016B2 (en) * 2019-02-20 2020-05-20 能美防災株式会社 Support system
CN111723617B (en) * 2019-03-20 2023-10-27 顺丰科技有限公司 Method, device, equipment and storage medium for identifying actions
JP7373187B2 (en) * 2019-09-19 2023-11-02 株式会社Local24 Flow line analysis system and flow line analysis method
JP2021148705A (en) * 2020-03-23 2021-09-27 公立大学法人岩手県立大学 Behavior estimation system, model learning system, behavior estimation method, model learning method, and program
JP7473899B2 (en) * 2021-11-16 2024-04-24 ジョージ・アンド・ショーン株式会社 Information processing device, program, and information processing system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101040554B (en) * 2004-10-14 2010-05-05 松下电器产业株式会社 Destination prediction apparatus and destination prediction method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3349033B2 (en) * 1996-03-15 2002-11-20 株式会社東芝 Purchasing behavior prediction device
WO2008146456A1 (en) * 2007-05-28 2008-12-04 Panasonic Corporation Information search support method and information search support device
JP2010049295A (en) * 2008-08-19 2010-03-04 Oki Electric Ind Co Ltd Information providing device and information providing method
EP2418849B1 (en) * 2009-04-10 2013-10-02 Omron Corporation Monitoring system, and monitoring terminal

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101040554B (en) * 2004-10-14 2010-05-05 松下电器产业株式会社 Destination prediction apparatus and destination prediction method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JP特开2010-49295A 2010.03.04
JP特开平9-251450A 1997.09.22

Also Published As

Publication number Publication date
CN102572390A (en) 2012-07-11
JP5495235B2 (en) 2014-05-21
JP2012118838A (en) 2012-06-21

Similar Documents

Publication Publication Date Title
CN102572390B (en) Apparatus and method for monitoring motion of monitored objects
CN108414971B (en) Method and apparatus for determining position information for positions in a multi-storey building
CN1578530B (en) System and methods for determining the location dynamics of a portable computing device
CN106462627B (en) Analyzing semantic places and related data from multiple location data reports
US11393212B2 (en) System for tracking and visualizing objects and a method therefor
CN105190233B (en) Position determines that processing unit, position determine that processing method, position determine processing routine, portable information processor, mobile message processing method, mobile message processing routine and storage medium
US8385943B1 (en) Method and apparatus for determining location information of a position in a multi-storey building
CN106796550B (en) Information delivery device and method
CN103488666B (en) Information processing equipment and method, electronic device and computer readable storage medium
US20100228602A1 (en) Event information tracking and communication tool
US20030048926A1 (en) Surveillance system, surveillance method and surveillance program
WO2020220629A1 (en) Method and apparatus for acquiring number of floor, and electronic device and storage medium
US20100198690A1 (en) Event information tracking and communication tool
JP2011215829A (en) Monitoring device and suspicious behavior detection method
WO2021203728A1 (en) Site selection method and apparatus for service development area, and computer device and medium
US20200074175A1 (en) Object cognitive identification solution
US20160241993A1 (en) Marker Based Activity Transition Models
JP2019174164A (en) Device, program and method for estimating terminal position using model pertaining to object recognition information and received electromagnetic wave information
US9851784B2 (en) Movement line conversion and analysis system, method and program
CN117520662A (en) Intelligent scenic spot guiding method and system based on positioning
CN114547386A (en) Positioning method and device based on Wi-Fi signal and electronic equipment
CN111523830B (en) Method, device, equipment and medium for guiding market supervision based on multi-dimensional data
US11252379B2 (en) Information processing system, information processing method, and non-transitory storage medium
EP4261760A1 (en) Hybrid segmented machine learning based method for performing highly accurate valuation of real estate
US20210124955A1 (en) Information processing system, information processing method, and non-transitory storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20141029

Termination date: 20211202