JP2008152655A - Information service provision system, object behavior estimation apparatus and object behavior estimation method


Info

Publication number: JP2008152655A
Application number: JP2006341630A
Other versions: JP4861154B2 (en)
Authority: JP (Japan)
Prior art keywords: information, movement, target, means, object
Inventor: Kayoko Fukazawa (香代子 深澤)
Original Assignee: NTT Docomo Inc (株式会社エヌ・ティ・ティ・ドコモ)
Application filed by NTT Docomo Inc
Priority to JP2006341630A
Publication of JP2008152655A
Application granted; publication of JP4861154B2
Legal status: Granted; Expired - Fee Related

Abstract

PROBLEM TO BE SOLVED: To provide an object behavior estimation apparatus that can estimate not only a user's position and means of movement but also the purpose of the user's behavior.

SOLUTION: An information service provision system comprises an object information acquisition part 3 that collects movement information, i.e. information including at least one of the position, moving speed, acceleration, moving direction, and moving time zone of an object; a movement means extraction part 202 that determines from the collected information whether the object is moving and, if so, determines the object's means of movement; a behavior information construction part 201 that estimates the purpose of the object's behavior from the determined means of movement; and an information delivery server 1 that selects information matching the estimated purpose and provides the object with a service based on the selected information.

COPYRIGHT: (C)2008, JPO&INPIT

Description

The present invention relates to an information service providing system, a target behavior estimation apparatus, and a target behavior estimation method, and in particular to an information service providing system, a target behavior estimation apparatus, and a target behavior estimation method that estimate the purpose of a target's behavior and provide information according to the estimated purpose.

Services currently exist that provide information such as map information and timetables to users via portable information terminal devices. In such services, the user's current state must be estimated as accurately as possible in order to determine the appropriate information to provide.
For example, Patent Literature 1, Patent Literature 2, and Non-Patent Literature 1 describe conventional techniques for automatically estimating a user's situation. Patent Document 1 defines the movement state of the target whose behavior is to be estimated, and defines movement as a series of operations from the target's stationary state to its moving state. Patent Document 1 also describes changing the sampling interval for position detection according to the target's movement state in order to further improve the accuracy of estimation.

In addition to the history information of each target, Patent Literature 2 estimates the behavior by comparing the movement of the target with the history information of a plurality of other targets grouped. Patent Document 2 aims to increase the accuracy of action estimation by increasing the number of history samples.
Further, Non-Patent Document 1 describes that the behavior of a target is estimated using information acquired by sensors such as an acceleration sensor and a pedometer, in addition to information related to the position of the target.

Each of the above conventional technologies determines only information such as whether the user is moving or whether the user is on a train, and selects a service that appears necessary for the user at that moment. With such conventional techniques, for example, whether the user is moving, the means of movement, and the current location can be acquired, and restaurant and map information near the current location can be provided.
JP 2001-14297 A
JP 2001-56805 A
Tamura et al.: Toward the realization of context-sensitive content formulation Project/MODE initiatives, Information Processing Society of Japan Research Report 2003-UBI-1, pp. 41-47

However, the information a user needs is not determined solely by the user's means of movement and position. For example, when the user is walking through point A for the purpose of commuting to work or school, point A is an area familiar to the user, and information such as restaurant listings is likely to be unnecessary. On another day, however, when the user passes through point A on the way to some other place, the user may need information such as a map of the destination in advance.

With the above conventional techniques, which decide whether to provide information using only the user's position or means of movement, whether the user actually needs the information may be judged incorrectly. When unnecessary information is provided because of such a misjudgment, the user has to stop the information providing service and, when the information is needed again, perform an operation to restart the stopped service. For this reason, the prior art not only falls short of the goal of automatically providing the information a user requires, but may also make the system feel troublesome to use.
The present invention has been made in view of these points, and an object of the present invention is to provide an information service providing system, a target behavior estimation apparatus, and a target behavior estimation method capable of estimating not only the user's position and the means used for movement but also the purpose of the user's behavior.

In order to solve the above problems, the information service providing system according to the first aspect of the present invention comprises: a movement information collecting apparatus that collects movement information, that is, information including at least one of the target's position, moving speed, time taken for movement, acceleration, moving direction, and moving time zone; a target behavior estimation apparatus that determines, based on the information collected by the movement information collecting apparatus, whether the target is moving, determines the target's means of movement when the target is determined to be moving, and estimates the purpose of the target's behavior based on the movement information and the determined means of movement; and a service providing apparatus that selects information according to the purpose estimated by the target behavior estimation apparatus and provides the target with a service based on the selected information.

According to such an invention, based on the collected movement information, it is possible to determine whether the target is moving or not and the moving means used by the target. Further, it is possible to estimate the purpose of the target action based on the movement information and the movement means. Furthermore, information can be selected according to the estimated purpose, and a service based on the selected information can be provided to the target.
The target behavior estimation apparatus according to claim 2 comprises: movement determining means for determining, based on movement information that includes at least one of the target's position, moving speed, time taken for movement, acceleration, moving direction, and moving time zone, whether the target is moving; moving means determining means for determining the moving target's means of movement when the movement determining means determines that the target is moving; and purpose estimating means for estimating the purpose of the target's behavior based on the movement information and the moving means information, that is, the information on the target's means of movement determined by the moving means determining means.

According to such an invention, based on the collected movement information, it is possible to determine whether the target is moving or not and the moving means used by the target. Further, it is possible to estimate the purpose of the target action based on the movement information and the movement means.
The target behavior estimation apparatus according to claim 3 is the invention according to claim 2, further comprising movement information collecting means for collecting the movement information for each specific target and supplying it to the movement determining means.

According to such an invention, it is possible to collect movement information of a moving object in real time and determine whether or not the object is moving.
According to a fourth aspect of the present invention, there is provided the target behavior estimation apparatus according to the third aspect of the present invention, wherein the movement information collecting means collects the position of the target as latitude and longitude by a communication system.

According to such an invention, the position of the target can be acquired accurately in real time.
The target behavior estimation apparatus according to claim 5 is the invention according to claim 3 or 4, wherein the movement information collecting means adjusts the time interval at which it acquires the movement information based on the movement information.

According to this invention, the movement information collecting means can, for example, acquire movement information at short intervals when the target moves at high speed and lengthen the acquisition interval when the target's moving speed is low. Such adjustment improves the accuracy of the movement information when it is needed and saves the energy required for acquiring position information when it is not.
The target behavior estimation apparatus according to claim 6 is the invention according to any one of claims 2 to 5, wherein, when the movement determining means determines that the target is moving, the moving means determining means determines the target's means of movement based on at least one of the dispersion of the target's moving speed, the intersection relationships of the movement vectors, and the manner in which the target moves from one position to another within a predetermined time.

According to such an invention, the moving means can be determined using not only the speed of movement of the object but also information representing a more detailed state of the movement. Therefore, the moving means used by the object can be determined more accurately.
The target behavior estimation apparatus according to claim 7 is the invention according to any one of claims 2 to 6, further comprising at least one of: moving means general data storage means for collectively storing the moving means determination results obtained by the moving means determining means for a plurality of targets; and moving means personal data storage means for storing the moving means determination results obtained by the moving means determining means for each target.

According to this invention, storing the moving means determination results collectively increases the number of stored determination samples, which reduces variation and increases the accuracy of determinations that subsequently use the stored results. Storing the moving means determination results for each target increases the accuracy of subsequent determinations for that specific target.
The target behavior estimation apparatus according to claim 8 is the invention according to any one of claims 2 to 7, further comprising at least one of: purpose general data storage means for collectively storing the results of estimation by the purpose estimating means for a plurality of targets; and purpose personal data storage means for storing the results of estimation by the purpose estimating means for each target.

According to this invention, storing the estimation results of the purpose estimating means collectively increases the number of stored samples, which reduces variation and increases the accuracy of determinations that subsequently use the stored results. Storing the estimation results for each target improves the accuracy of subsequent determinations for that specific target.

The target behavior estimation apparatus according to claim 9 is the invention according to any one of claims 2 to 8, wherein the purpose estimating means estimates the purpose of the target's behavior by comparing the movement information and the moving means information against a preset behavior information database.
According to such an invention, it is possible to arbitrarily set a behavior information database suitable for the setting of movement information and the actual situation of use of the apparatus.

The target behavior estimation apparatus according to claim 10 is the invention according to claim 9, wherein the behavior information database further includes environmental information including at least one of the temperature, humidity, and weather of the environment where the target exists, and, when environmental information is input, the purpose estimating means estimates the purpose of the target's behavior by comparing the input environmental information and the movement information against the behavior information database.
According to such an invention, not only the movement information but also the environment information can be used in estimating the target action purpose. For this reason, the action purpose of the target can be estimated based on more detailed information, and information according to the purpose of the target can be selected and provided.

The target behavior estimation method according to claim 11 comprises: a movement information input step of inputting movement information, that is, information including at least one of the target's position, moving speed, time taken for movement, acceleration, moving direction, and moving time zone; a movement determination step of determining, based on the movement information input in the movement information input step, whether the target is moving; a moving means determination step of determining the moving target's means of movement when it is determined in the movement determination step that the target is moving; and a purpose estimation step of estimating the purpose of the target's behavior based on the movement information and the moving means information, that is, the information on the target's means of movement determined in the moving means determination step.
According to such an invention, based on the collected movement information, it is possible to determine whether the target is moving or not and the moving means used by the target. Further, it is possible to estimate the purpose of the target action based on the movement information and the movement means.

According to the present invention described above, it is possible not only to determine, from the collected movement information, whether the target is moving and the means of movement used by the target, but also to estimate the purpose of the target's behavior. The target's state can therefore be detected in more detail, and by selecting the service to be provided based on the detection result, an information service providing system, a target behavior estimation apparatus, and a target behavior estimation method that can appropriately select and provide the information a user needs can be realized.

Embodiments 1 and 2 of an information service providing system, a target behavior estimation device, and a target behavior estimation method according to the present invention will be described below with reference to the drawings. The first embodiment will be described.
Embodiment 1
(Outline of the system)
FIG. 1 is a diagram for explaining the outline of the information service providing system according to the present invention, and together with FIG. 2 described later shows a configuration common to Embodiments 1 and 2. As shown in the figure, the information service providing system of the present invention includes an information distribution server 1, a target behavior estimation unit 2, and a target information acquisition unit 3. In the first embodiment, the information service providing system is constructed from an information terminal device and the information distribution server 1 that provides services to it, and the information terminal device includes the target behavior estimation unit 2 and the target information acquisition unit 3.

In the information service providing system using the information terminal device, the information terminal device and the user of the information terminal device are assumed to be present at the same position at the same time, and the user is set as a “target” for behavior estimation.
In the first embodiment, the target information acquisition unit 3 functions as a movement information collecting apparatus that collects movement information, that is, information including at least one of the target's position, moving speed, acceleration, moving direction, and moving time zone. The target behavior estimation unit 2 functions as a target behavior estimation apparatus that determines, based on the collected information, whether the target is moving, determines the target's means of movement when it determines that the target is moving, and estimates the purpose of the target's behavior based on the determined means of movement.

Furthermore, the information distribution server 1 functions as a service providing apparatus that selects information according to the estimated purpose and provides a service based on the selected information. The information distribution server 1 is a computer that accumulates information such as HTML documents and images and transmits the accumulated information in response to a request from client software such as a Web browser.
The target information acquisition unit 3 includes an existing GPS (Global Positioning System) terminal 303 provided in the information terminal device and a position information DB 302. The GPS terminal 303 collects target positions as latitude and longitude for each target by the communication system. In the position information DB 302, data collected by the GPS terminal 303 is accumulated.

  Further, in the first embodiment, the sensor device 301 is provided in addition to the GPS terminal 303 provided in the information terminal device. In the first embodiment, the sensor device 301 detects the moving speed and acceleration of the target. The detection of the moving speed and acceleration may be performed by directly measuring the moving speed of the information terminal device and the acceleration applied to the information terminal device using a Doppler effect, an optical filter, or the like. Further, for example, the target position obtained from the GPS terminal 303 may be recorded in association with the time, and the speed and acceleration may be calculated.

Needless to say, the sensor device 301 is not limited to a device that detects speed and acceleration; it may be, for example, a device that detects temperature, humidity, ultraviolet intensity, or the like. The weather at the current location may also be obtained by calculation based on the directly detected temperature, humidity, ultraviolet intensity, and so on.
In the first embodiment including the sensor device 301, the information detected or acquired by the sensor device 301 is sent to the position information DB 302 and the target behavior estimation unit 2 in the same manner as the information related to the position.

Note that the GPS terminal 303 basically acquires position information from GPS or the like at regular time intervals (cycles). In the first embodiment, however, the target information acquisition unit 3 can adjust the cycle at which it acquires the movement information based on the acquired position coordinates, speed, acceleration, and so on.
This adjustment shortens the acquisition cycle as the target's moving speed increases. That is, when the target's position changes greatly, the position is detected at short time intervals, which prevents the position detection accuracy from varying with the target's speed.
The above target information acquisition unit 3 functions as the movement information collection unit of the first embodiment.
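The interval adjustment described above can be pictured with a small sketch. The speed thresholds and interval values below are illustrative assumptions, not values given in this specification; only the idea that higher speeds call for shorter acquisition intervals follows the text.

# Minimal sketch of the sampling-interval adjustment described above.
# Threshold and interval values are assumed for illustration.

def acquisition_interval_seconds(speed_kmh: float) -> float:
    """Return the interval at which position data should be acquired.

    Faster movement -> shorter interval, so accuracy does not degrade
    with speed; slower movement -> longer interval, saving the energy
    needed for positioning.
    """
    if speed_kmh >= 60:      # e.g. train or car
        return 5.0
    if speed_kmh >= 10:      # e.g. bus or bicycle
        return 15.0
    if speed_kmh >= 2:       # walking
        return 30.0
    return 60.0              # effectively stationary


if __name__ == "__main__":
    for v in (0.5, 3.0, 25.0, 70.0):
        print(f"{v:5.1f} km/h -> poll every {acquisition_interval_seconds(v):.0f} s")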

The target behavior estimation unit 2 includes a moving means extraction unit 202 that determines, based on the movement information, whether the target is moving and, when it determines that the target is moving, determines the moving target's means of movement, and a behavior information construction unit 201 that estimates the purpose of the target's behavior based on the movement information and the information on the target's means of movement determined by the moving means extraction unit 202 (moving means information). The behavior information construction unit 201 outputs the result of estimating the purpose of the behavior as behavior information.

In the above configuration, the movement means extraction unit 202 functions as movement determination means and movement means determination means, and the behavior information construction unit 201 functions as purpose estimation means. Information detected and acquired by the target information acquisition unit 3 is collected and stored in the movement information collection / storage unit 204.
Information about the moving means and behavior information obtained by the moving means extraction unit 202 and the behavior information construction unit 201 are stored in the personal model DB 210 and general model DB 211 of the moving means, and the personal model DB 208 and general model DB 209 of the behavior information.

The personal model DB 210 for moving means functions as moving means personal data storage means that stores the moving means determination results for each target, and the general model DB 211 functions as moving means general data storage means that stores the moving means determination results collectively. Likewise, the personal model DB 208 for behavior information functions as purpose personal data storage means that stores the estimation results of the behavior information construction unit 201 for each target, and the general model DB 209 functions as purpose general data storage means that stores those estimation results collectively.

The target behavior estimation unit 2 further includes a moving means determination unit 213 that determines the target's means of movement by comparing the target's position information against the moving means information stored in the personal model DB 210 and general model DB 211 for moving means, a behavior information determination unit 212 that determines the purpose of the target's behavior by comparing the target's position information against the behavior information stored in the personal model DB 208 and general model DB 209 for behavior information, and a behavior estimation unit 207 that uses the determination results of the moving means determination unit 213 and the behavior information determination unit 212 to estimate the target's behavior with an arbitrary estimation model.

When the first embodiment is configured as an information service providing system, the personal model DB 210, general model DB 211, moving means determination unit 213, personal model DB 208, general model DB 209, and behavior information determination unit 212, together with the moving means extraction unit 202 and behavior information construction unit 201 described above, function as the target behavior estimation apparatus.
The target behavior estimation unit 2 described above is a part of a computer constituting the control unit of the information terminal device, and each component such as the moving means extraction unit 202 is a software program for operating the computer.

The information distribution server 1 selects information according to the purpose of the target's behavior estimated by the behavior estimation unit 207 and provides the target with a service based on the selected information. For example, when the target is moving for the purpose of commuting, it can be determined that the target knows the area well, that map and timetable information is unnecessary, and that irregular information such as today's news and train delays should be provided instead.
When the target is moving for the purpose of shopping, it can likewise be determined that the timetable of the area and information on stores are useful and should be provided, whereas information such as today's news is unnecessary.

(Processing by the target behavior estimation unit)
Hereinafter, the function and operation of the above-described configuration will be described along the procedure of action estimation processing.
FIG. 2 is a diagram for explaining the series of processing performed by each component of the target behavior estimation unit 2, up to the generation of the construction models stored in the personal model DB 210, the general model DB 211, the personal model DB 208, and the general model DB 209.

As shown in FIG. 2, the movement information collection / storage unit 204 collects, from the target information acquisition unit 3, the target position P (x, y), the detection time t, the moving speed v, the acceleration a, and other data o detected by the sensor device 301 as movement information, and outputs them to the moving means extraction unit 202.
The movement information collection / storage unit 204 manages access information indicating the external access status of each component that acquires movement information of the target information acquisition unit 3 by the DB 205. The access information is information indicating the access status between the movement information collection / storage unit 204 and the target information acquisition unit 3 or the access status to the outside of the target information acquisition unit 3, and the time and date when the access occurred, Information such as the type of information acquired by access is included.

The movement information collection / storage unit 204 uses the access information to capture the information collected by the target information acquisition unit 3 at predetermined time intervals. The captured information is output to the moving means extraction unit 202 and stored in the DB 206. At this time, the information acquired by the GPS terminal 303 is managed in time series as movement information, and information such as humidity and temperature is managed in time series as spatial information. The movement information collection / storage unit 204 may manage the movement information and spatial information for each target.
The movement means extraction unit 202 generates data Tra indicating the movement means based on the inputted movement information. The generated data Tra is stored as a construction model in the personal model DB 210 and the general model DB 211 together with the movement information, and is output to the behavior information construction unit 201.

In order to generate the data Tra, the moving means extraction unit 202 determines from the movement information or the spatial information stored by the movement information collection / storage unit 204 whether the target is moving. When it determines that the target is moving, it determines by which means of movement, such as train or bus, the target is moving.
This determination is made based on at least one of the dispersion of the movement speed of the object, the relationship of the intersection of the movement vectors, and the situation until the object moves from one position to another position within a predetermined time. Specific examples of the determination will be described below.

(Cluster reference data for movement information)
FIG. 3 shows the cluster reference data used by the moving means extraction unit 202 to determine the means of movement, which is stored in the cluster reference DB 203. The determination of the means of movement uses the target's speed, the distribution of the target positions recorded in time series in the movement information, and the acceleration. In the first embodiment, for example, a series of data in which the speed is in the range of 30 km/h to 80 km/h, the position distribution is linear, and the acceleration is 5 km/h to 7 km/h is treated as data indicating that the target is moving by train.

Similarly, for example, a series of data in which the speed is in the range of 1 km/h to 4 km/h, the position distribution is concentrated, and the acceleration is 0 km/h to 2 km/h is treated as data indicating that the target is not moving (stationary).
The distribution shown in FIG. 3 is an element indicating a situation until the object moves from one position to another position within a predetermined time. Further, the speed range indicates the dispersion of the moving speed of the object. Furthermore, in the first embodiment, it is possible to extend the plurality of vectors connecting the target positions described in time series and determine the moving means based on whether or not they intersect each other.

  If the vectors do not intersect, it is considered that the object is moving along a straight line over a long distance. In such a case, the subject may be moving by train. In addition, when vectors frequently intersect with each other, it is possible to use an element such as a possibility that the object is moving on foot or stopped, for estimation of the moving means.
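As a rough illustration of the quantities involved in this determination, the sketch below computes the speed dispersion and counts intersections between movement vectors from time-stamped positions. The Sample data format, the projection of positions into metres, and the helper names are assumptions made for illustration; they are not defined in the patent.

# Sketch of features for the moving-means determination: speed
# dispersion and crossings between consecutive movement vectors.
from dataclasses import dataclass
from statistics import pvariance
from typing import List, Tuple

Point = Tuple[float, float]


@dataclass
class Sample:
    t: float      # seconds
    pos: Point    # (x, y) in metres, e.g. projected from latitude/longitude


def speeds(samples: List[Sample]) -> List[float]:
    """Speed between consecutive samples, in m/s."""
    out = []
    for a, b in zip(samples, samples[1:]):
        dt = b.t - a.t
        dx = b.pos[0] - a.pos[0]
        dy = b.pos[1] - a.pos[1]
        out.append(((dx * dx + dy * dy) ** 0.5) / dt)
    return out


def _ccw(a: Point, b: Point, c: Point) -> bool:
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])


def segments_intersect(p1: Point, p2: Point, p3: Point, p4: Point) -> bool:
    """True if segment p1-p2 crosses segment p3-p4 (collinear cases ignored)."""
    return (_ccw(p1, p3, p4) != _ccw(p2, p3, p4)
            and _ccw(p1, p2, p3) != _ccw(p1, p2, p4))


def crossing_count(samples: List[Sample]) -> int:
    """Count pairs of non-adjacent movement vectors that cross each other.

    Few crossings over a long distance suggests straight, train-like
    movement; frequent crossings suggest walking or staying in place.
    """
    segs = [(a.pos, b.pos) for a, b in zip(samples, samples[1:])]
    n = 0
    for i in range(len(segs)):
        for j in range(i + 2, len(segs)):      # skip adjacent segments
            if segments_intersect(*segs[i], *segs[j]):
                n += 1
    return n


def movement_features(samples: List[Sample]) -> dict:
    v = speeds(samples)
    return {"mean_speed": sum(v) / len(v),
            "speed_variance": pvariance(v),
            "crossings": crossing_count(samples)}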

(Movement information clustering)
FIG. 4 illustrates movement information clustered using the cluster reference data shown in FIG. 3 as teacher data, and shows a construction model referred to in the first embodiment. In the example shown in FIG. 4, the information on position P1 (x, y) collected at times 5:44, 3:23, and 9:02 is classified as passing by train, from which the information that the target passed position P1 (x, y) by train at time t3 is extracted.
In the example shown in FIG. 4, environmental information (spatial information), which includes at least one of the temperature, humidity, and weather of the environment in which the target exists, is also acquired by the target information acquisition unit 3 and clustered.

The cluster reference data for determining the means of movement in the first embodiment is not limited to that shown in FIG. 3. For example, the system administrator can add further items of moving means, such as airplanes and motorcycles, to the cluster reference data, or delete existing items such as buses and trains.
The behavior information construction unit 201 receives the data Tra and the movement information output by the moving means extraction unit 202 and constructs behavior information Beh from the input information. The behavior information Beh is stored in the personal model DB 208 and the general model DB 209.

(Behavior information cluster reference data)
FIG. 5 is a diagram illustrating cluster reference data used by the behavior information construction unit 201 to determine a target behavior. The cluster reference data shown in FIG. 5 is stored in the cluster reference DB 203 in the same manner as the cluster reference data shown in FIG. The cluster reference data shown in FIG. 5 corresponds to the behavior information database of the first embodiment. The behavior information construction unit 201 estimates the purpose of the target behavior by comparing the movement information and the movement means information with the cluster reference data in FIG.

In the first embodiment, the cluster reference data used for estimating the purpose of behavior is set in advance by the system administrator and registered in the cluster reference DB. The system administrator can also arbitrarily set how the relationships between the data clustered by the cluster reference data are extracted and defined.
According to the cluster reference data shown in FIG. 5, for example, when the target moves by walking, then by train and bus, then by walking again between 6:00 and 10:00 or between 18:00 and 20:00, and the position distribution data recorded in time series matches the commuting behavior distribution in FIG. 5, this movement is classified as commuting behavior whose purpose is commuting.

Similarly, for example, when the target moves by walking between 10:00 and 12:00 or between 16:00 and 19:00, then stops, then moves by walking again, and the target's position distribution data matches the shopping behavior distribution in FIG. 5, this movement is classified as shopping behavior whose purpose is shopping.
If the cluster reference data includes spatial information, the behavior information construction unit 201 may compare the spatial information acquired by the target information acquisition unit 3, together with the movement information, against the cluster reference data to estimate the purpose of the target's behavior.
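A minimal sketch of this kind of matching is given below, assuming a simple dictionary layout for the cluster reference data; the time windows and moving-means sequences merely paraphrase the commuting and shopping examples above and are not taken from FIG. 5 itself.

# Sketch of matching movement information against behaviour cluster
# reference data.  Layout and values are illustrative assumptions.

CLUSTER_REFERENCE = {
    "commuting": {
        "hours": [(6, 10), (18, 20)],
        "means_sequence": ["walk", "train", "bus", "walk"],
    },
    "shopping": {
        "hours": [(10, 12), (16, 19)],
        "means_sequence": ["walk", "stop", "walk"],
    },
}


def estimate_purpose(start_hour: int, means_sequence: list) -> str:
    """Return the estimated purpose of the behaviour, or 'unknown'."""
    for purpose, ref in CLUSTER_REFERENCE.items():
        in_window = any(lo <= start_hour < hi for lo, hi in ref["hours"])
        if in_window and means_sequence == ref["means_sequence"]:
            return purpose
    return "unknown"


if __name__ == "__main__":
    print(estimate_purpose(8, ["walk", "train", "bus", "walk"]))   # commuting
    print(estimate_purpose(11, ["walk", "stop", "walk"]))          # shopping
    print(estimate_purpose(23, ["walk"]))                          # unknown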

(Generate construction model)
FIG. 6 illustrates data clustered according to the cluster reference data of FIG. 5 and shows a construction model referred to in the first embodiment. In the clustered data, the purpose of a target behavior is estimated based on the movement information and the moving means information, and the estimation result is stored in association with them. According to the example shown in FIG. 6, the behavior of the target moving from position P1 to position P2 is estimated to be commuting behavior.
The cluster reference data for estimating the purpose of behavior in the first embodiment is not limited to that shown in FIG. 5. For example, the system administrator can add further items, such as going to the gym or traveling, to the cluster reference data, or delete existing items such as commuting and attending school.

(Other clustering examples)
FIG. 7 illustrates cluster reference data for estimating the purpose of another behavior and the data clustered using it. FIG. 7A shows the cluster reference data used for clustering, and FIG. 7B shows data clustered using FIG. 7A. In the cluster reference data shown in FIG. 7A, the parameter Smn is determined by the position P (x, y) where the target is located and the time at which the target is at that position.
The target's behavior information is then determined using the parameter Smn together with the position information such as speed and acceleration and the means of movement extracted by the moving means extraction unit 202 based on the cluster reference data shown in FIG. 7A.

(Programming the target behavior estimation method)
FIG. 8 is a flowchart for explaining a program for causing a computer to execute the target behavior estimation method executed by the information service providing system or the target behavior estimation apparatus according to Embodiment 1 described above.
In the flowchart of FIG. 8, the movement information collection / storage unit 204 acquires position information and spatial information from the GPS terminal 303 and the sensor device 301 (S1), and stores them in the DB 206 (S2). Steps S1 and S2 correspond to the movement information input step of the first embodiment.

Next, the moving means extraction unit 202 determines whether the target is moving based on the movement information input in the movement information input step (S3). To make this determination, the moving means extraction unit 202 compares the speed, distribution, and acceleration information included in the target's position information against the cluster reference data shown in FIG. 3. When, as a result of the comparison, the target's movement information corresponds to a state other than the stationary state, the moving means extraction unit 202 determines that the target is moving. This determination corresponds to the movement determination step of the first embodiment.

When it is determined that the target is moving, the moving means extraction unit 202 further determines the moving means used by the target from among moving means that are on the train, on foot, or stopped.
More specifically, the moving means extraction unit 202 determines whether or not the target speed, distribution, and acceleration correspond to the data on the train (S4). If applicable (S4: Yes), the moving means extraction unit 202 determines that the target is moving using a train (S7). If it is determined that the data does not correspond to data on the train (S4: No), it is determined whether the data corresponds to data that is moving on foot (S5).

  When the speed, distribution, and acceleration of the target correspond to the walking data (S5: Yes), the moving means extraction unit 202 determines that the target is moving by walking (S7). If it is determined in step 5 that the data does not correspond to the walking data (S5: No), the moving means extraction unit 202 determines whether the target speed, distribution, and acceleration correspond to the data being stopped ( S6). If applicable (S6: Yes), the moving means extraction unit 202 determines that the object is stationary (S7). If not applicable (S6: No), it is determined that there is no applicable data, and the next movement information is acquired.

Note that the flowchart shown in FIG. 8 illustrates the determination processing for only three of the five items of the moving means shown in FIG. 3 for the sake of simplicity. In the first embodiment, as a matter of course, the moving means can be determined by comparing the target moving information with the conditions of buses and bicycles as other moving means.
The above processing is a moving means determination step for determining the moving means of the moving object in the first embodiment.
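The decision chain of steps S3 to S7 can be sketched as an ordered comparison against the cluster reference ranges, as below. The train and stationary ranges follow the examples quoted for FIG. 3; the walking range and the data layout are assumptions added for illustration.

# Sketch of steps S4-S7: compare speed, position distribution and
# acceleration against cluster reference ranges and pick the first
# moving means that matches.

CLUSTER_RANGES = [
    # (label, speed km/h, distribution, acceleration)
    ("train",      (30, 80), "linear",       (5, 7)),
    ("walk",       (2, 6),   "linear",       (0, 2)),   # assumed values
    ("stationary", (0, 4),   "concentrated", (0, 2)),
]


def _within(value, bounds):
    lo, hi = bounds
    return lo <= value <= hi


def determine_moving_means(speed, distribution, acceleration):
    """Return the first matching moving means, or None (no applicable data)."""
    for label, v_range, dist, a_range in CLUSTER_RANGES:
        if distribution == dist and _within(speed, v_range) and _within(acceleration, a_range):
            return label
    return None


if __name__ == "__main__":
    print(determine_moving_means(55, "linear", 6))        # train
    print(determine_moving_means(3, "concentrated", 1))   # stationary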

  Next, the moving means extraction unit 202 determines whether or not to store the moving means data obtained by the determinations in steps S4, S5, and S6 for each target (S8). If it is determined in step S8 that the data is stored for each object (S8: Yes), the result of the determination relating to the moving means is stored in an area corresponding to the corresponding object in the personal model DB 210 (S10). On the other hand, if it is determined in step S8 that the information on the moving means is not stored for each target (S8: No), the result of the determination on the moving means is stored in the general model DB 211 (S9).

The behavior information construction unit 201 compares the determination result of the moving means and the movement information with the cluster reference data shown in FIG. 5 stored in the cluster reference DB 203.
More specifically, the behavior information construction unit 201 determines whether the target's movement information and moving means information correspond to commuting behavior data (S11). If so (S11: Yes), the behavior information construction unit 201 determines that the target is acting for the purpose of commuting (S14). If it determines that the data does not correspond to the commuting behavior data (S11: No), it determines whether the data corresponds to attending-school behavior data (S12).

When the target's movement information and moving means information correspond to the attending-school behavior data (S12: Yes), the behavior information construction unit 201 determines that the target is acting for the purpose of attending school (S14). When it is determined in step S12 that the data does not correspond to the attending-school data (S12: No), the behavior information construction unit 201 determines whether the target's movement information and moving means information correspond to resting behavior data (S13). If so (S13: Yes), the behavior information construction unit 201 determines that the target is resting (S14). If not (S13: No), it determines that there is no applicable data, and the next movement information is acquired.

Note that the flowchart of FIG. 8 illustrates determination processing for only three of the five items of the purpose of action shown in FIG. 5 for the sake of simplicity. Naturally, the first embodiment can also determine the purpose of other actions such as shopping and work.
In the first embodiment, the above processing is a purpose estimation step for estimating the purpose of the target action based on the travel information and the travel means information that is information related to the target travel means.

Next, the behavior information construction unit 201 determines whether to store the behavior purpose data obtained by the determinations in steps S11, S12, and S13 for each target (S15). If it is determined in step S15 that the data is stored for each target (S15: Yes), the result of the determination of the behavior purpose is stored in the area of the personal model DB 208 corresponding to that target (S16). On the other hand, if it is determined in step S15 that the behavior purpose is not stored for each target (S15: No), the result of the determination of the behavior purpose is stored in the general model DB 209 (S17).

After the above processing, in the first embodiment, the process returns to step S1 to collect the next position information and space information. According to the flowchart shown in FIG. 8, the target moving means and the purpose of action can be stored in the DB in association with the position information. Since such data is based on position information obtained by actually measuring the position and speed of the target, the actual behavior of the target is accurately reflected.
This effect is enhanced when the moving means and the purpose of action are stored in the individual model DBs 208 and 210 for each target. On the other hand, when the target moving means and the purpose of the action are collectively stored in the general model DBs 209 and 211, since the number of stored data is large, it is advantageous to suppress errors due to data variation and the like.

The behavior estimation unit 207 estimates the behavior by comparing the target's movement information against the general model DB 211 and personal model DB 210 for moving means and the general model DB 209 and personal model DB 208 for behavior information constructed by the method described above.
That is, the behavior estimation unit 207 acquires the target's movement information from one or more movement information collection / storage units 204. The acquired movement information is used for behavior estimation and is sent to the behavior information determination unit 212 and the moving means determination unit 213.

The moving means determination unit 213 acquires, from the general model DB 211 or the personal model DB 210, a construction model used to estimate the target's means of movement. It then determines the target's means of movement by comparing the movement information against the acquired construction model, and sends the result of the determination to the behavior information determination unit 212.
The behavior information determination unit 212 acquires, from the personal model DB 208 or the general model DB 209, a construction model used to estimate the purpose of the target's behavior. It then determines behavior information, including the purpose of the target's behavior, by comparing the moving means determination result and the movement information against the acquired construction model.

The behavior estimation unit 207 estimates the final behavior of the target based on the determination result of the behavior information determination unit 212. The target behavior estimation by the behavior estimation unit 207 is executed by an estimation model that can be set by the system administrator for each behavior estimation application. The configuration of the first embodiment can be configured to have a function of receiving an estimation model from an external application.
The result of the behavior estimation is provided to an external application together with the behavior information obtained as a result of the determination by the behavior information determination unit 212, the moving means information obtained as a result of the determination by the moving means determination unit 213, and the position information. At this time, the probability of the estimation may also be provided, and the external application stores and records the provided information.

The above-described program causes a computer to realize: a movement information input function of inputting movement information, that is, information including at least one of the target's position, moving speed, acceleration, moving direction, and moving time zone; a movement determination function of determining, based on the movement information input by the movement information input function, whether the target is moving; a moving means determination function of determining the means of movement when the movement determination function determines that the target is moving; and a purpose estimation function of estimating the purpose of the target's behavior based on the movement information and the moving means information, that is, the information on the determined means of movement.
With such a program, it is possible to determine whether or not the target is moving based on the collected movement information and the moving means used by the target. In addition, the purpose of the target action can be estimated based on the movement information and the movement means.

Embodiment 2
Next, Embodiment 2 of the present invention will be described. The target behavior estimation apparatus according to the second embodiment, and the information service providing system including it, estimate the purpose of the target's movement based on movement information relating to the time taken for the target to move. Since they have the same configuration as that described in the first embodiment, illustration and description of the configuration are omitted.

First, the process by which the target behavior estimation apparatus according to the second embodiment generates behavior information from position information and information relating to movement will be described. The target information acquisition unit 3 acquires position information from the sensor device 301 and the GPS terminal 303 and supplies it to the movement information collection / storage unit 204. The position information referred to in the second embodiment includes the latitude and longitude obtained by the GPS terminal 303 and the target's speed, acceleration, angular acceleration, and so on obtained by the sensor device 301.
In the second embodiment, the purpose of movement of the target is estimated based on the history of the position where the target has moved from the collected position information and the movement time of the target set in advance corresponding to this history.

In the second embodiment, the position information collection interval by the sensor device 301 and the GPS terminal 303 can be adjusted by controlling an initial value specific to the object according to the movement state of the object. The acquisition interval information necessary for the adjustment is determined through negotiation between the movement information collection / storage unit 204 and the sensor device 301 in cooperation. The change of the position information acquisition interval using the acquisition interval information may be performed at the time of negotiation, or may be executed by using a change in the position information of the object as a trigger after determination by negotiation.
In addition, the acquisition interval of the target position information by the sensor device 301 and the GPS terminal 303 is held by any of the devices that collect the movement information such as the movement information collection / storage unit 204, the sensor device 301 and the GPS terminal 303, It is possible to manage.

(Move information processing)
In the second embodiment, the collected movement information is handled by two processes. The first processes the collected movement information into cluster reference data and stores it as a personal model database; this cluster reference data is the information used in the second embodiment to determine the target's movement information and behavior information. The second analyzes the target's movement and estimates behavior information.

In the first embodiment described above, information such as that shown in FIG. 5 is registered in advance, the movement information acquired by the sensor device 301 or the GPS terminal 303 is classified, and cluster reference data for determining behavior information and movement information is thereby created. The purpose of the target's behavior, such as commuting or resting, is then estimated using the created cluster reference data.
In contrast, the second embodiment determines whether the target's behavior has a purpose based on the time taken for the target's movement. In the second embodiment, the reference time used to determine whether the target is acting with a purpose can be changed according to the individuality of each target and the surrounding situation. In this respect, the second embodiment can determine more accurately whether there is a purpose in the target's search behavior.

(Generation of cluster reference data)
FIG. 9 is a diagram for explaining the generation of cluster reference data in the second embodiment. FIG. 9A explains a plurality of areas set within a predetermined range, and FIG. 9B shows how the target's movement state is estimated from the history of the areas through which the target has moved and the time taken for the movement. In the second embodiment, the target's state is classified into multiple stages according to whether the target's movement behavior (referred to as search behavior) is made with interest. Search behavior made with interest is determined to be behavior with a purpose, and search behavior not made with interest is determined to be behavior without a purpose.

The area shown in FIG. 9A is not limited to that defined on the ground. That is, in a high-rise building or the like, an upper floor area may be set on the first floor area. Therefore, in the second embodiment, a plurality of areas are set in a range represented by the same latitude and longitude, and different IDs are assigned to the plurality of areas.
An ID assigned to each area is referred to as an area ID. In addition, among the area IDs, an ID that is represented by the same latitude and longitude as the area on the ground and that is assigned to another area on the upper floor is referred to as a space area ID.

The area ID can be assigned, for example, in the movement information collection / storage unit 204. That is, the DB 206 shown in FIG. 1 stores area ID data that associates the latitude and longitude acquired by the GPS terminal 303 with area IDs. When storing position information in the DB 206, the movement information collection / storage unit 204 can also convert the position information into an area ID and store it.
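A sketch of such a conversion is shown below, assuming a simple grid over latitude and longitude and a floor suffix for space area IDs; the grid size and ID scheme are illustrative assumptions, not the patent's definition.

# Sketch of converting latitude/longitude into an area ID, with a
# separate space-area ID for floors above ground level.

def area_id(lat: float, lon: float, floor: int = 0, cell_deg: float = 0.001) -> str:
    """Map a position to an area ID; floors above ground get a space-area ID."""
    row = int(lat / cell_deg)
    col = int(lon / cell_deg)
    base = f"area_{row}_{col}"
    return base if floor == 0 else f"{base}_f{floor}"   # space area ID


if __name__ == "__main__":
    print(area_id(35.6812, 139.7671))            # ground-level area
    print(area_id(35.6812, 139.7671, floor=3))   # same lat/lon, upper floor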

(Determination of behavior information)
Next, a procedure for determining behavior information according to the second embodiment will be described.
In the second embodiment, the purpose of the target's behavior is determined based on the movement history and the time taken for the movement. The target is considered to move through the areas of the space shown in FIG. 9A in order. In the second embodiment, patterns are set in advance according to the areas through which the target has moved and the order in which it moved through them. For example, movement in the order of area IDs g → a → b → c → d → e → f shown in FIG. 9A is set as one pattern.

The order of areas in which the object moves, such as area ID g → a → b → c → d → e → f, is referred to as a movement information history. Patterns are also set for the other movement information histories, and as a result, a large number of patterns are set in the second embodiment.
In the example shown in FIG. 9B, the time taken for the movement is divided into multiple stages such as 5 minutes or less, 5 to 8 minutes, and 8 to 15 minutes. Each time stage is associated with each pattern, and for each combination of time and pattern, the purpose of the target's behavior, such as "search behavior with little interest", "hanging-around search behavior", or "search behavior with interest", or the target's means of movement is set.

Such a setting, as shown in FIG. 9B, can be said to determine movement behavior information relating to the behavior of the moving target using the target's movement history information and the elapsed time information for the movement. Each combination of time and pattern, together with the purpose of behavior and the means of movement associated with that combination, constitutes the cluster reference data of the second embodiment.
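The lookup implied by such cluster reference data can be sketched as follows. The pattern contents, time boundaries, and behaviour labels here are placeholders chosen for illustration; only the structure, a (pattern, elapsed time) pair mapped to a behaviour state, follows the text.

# Sketch of looking up a behaviour state from a movement-history
# pattern and the elapsed time for that movement.
import bisect

# Pattern 1: the area-ID order g -> a -> b -> c -> d -> e -> f.
PATTERNS = {
    "pattern1": ["g", "a", "b", "c", "d", "e", "f"],
}

# Elapsed-time boundaries (minutes) and the behaviour state chosen for
# each stage; labels are placeholders.
TABLE_A = {
    "pattern1": ([5, 10, 15],
                 ["search, little interest",
                  "search, interested",
                  "hanging around",
                  "no purpose"]),
}


def classify_behaviour(area_history, elapsed_minutes):
    """Look up the behaviour state for a movement history and its elapsed time."""
    for name, areas in PATTERNS.items():
        if area_history == areas:
            boundaries, labels = TABLE_A[name]
            return labels[bisect.bisect_right(boundaries, elapsed_minutes)]
    return "unknown pattern"


if __name__ == "__main__":
    history = ["g", "a", "b", "c", "d", "e", "f"]
    print(classify_behaviour(history, 7))    # second stage
    print(classify_behaviour(history, 20))   # last stage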

(Cluster reference data)
In the second embodiment, three methods are proposed for defining such cluster reference data. Hereinafter, three types of definition method I, definition method II, and definition method III will be described.

Definition method I
FIGS. 10A and 10B are diagrams for explaining cluster reference data definition method I. FIG. 10A shows cluster reference data for determining the purpose of behavior of a target moving on foot, and FIG. 10B shows cluster reference data for determining the target's means of movement. Both sets of cluster reference data are defined by definition method I.

The cluster reference data shown in FIG. 10A is stored as movement behavior information table A for determining the purpose of the target's behavior (the status of the search behavior). In movement behavior information table A, elapsed time information such as 3 minutes or less, 5 minutes or less, 15 minutes or less, or 20 minutes or less is associated with each pattern, such as pattern 1 and pattern 2. The elapsed time information set in multiple stages is collectively referred to as an elapsed time set. According to the illustrated example, for instance, if 5 minutes or more and 10 minutes or less elapse while the target moves through the areas in the order of pattern 1, the target's behavior is determined to correspond to the hanging-around behavior state.

  The cluster reference data shown in (b) is stored as the movement behavior information table B for determining the target's moving means (movement state). In the movement behavior information table B, elapsed time information such as 3 minutes or less, 5 minutes or less, 15 minutes or less, or 20 minutes or less is associated with each pattern, such as pattern 15 and pattern 34. According to the illustrated example, if 5 minutes or more and 10 minutes or less have elapsed while the target moves through the areas in the order of pattern 15, it is determined that the target is moving by train.
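
A minimal sketch of how the movement behavior information tables A and B could be consulted is shown below. Only the two combinations mentioned in the illustrated examples (pattern 1 and pattern 15, each with 5 to 10 minutes) are filled in; the table layout and everything else are assumptions.

    # Sketch: look up the purpose of the action (table A) and the moving means
    # (table B) from a (pattern, elapsed-minutes) combination. Only the two
    # examples mentioned in the text are filled in; everything else is assumed.

    TABLE_A = {  # (pattern, (min, max minutes)) -> purpose of the action
        (1, (5, 10)): "hanging behavior state",
    }
    TABLE_B = {  # (pattern, (min, max minutes)) -> moving means
        (15, (5, 10)): "train",
    }

    def look_up(table, pattern, minutes):
        for (pid, (lo, hi)), label in table.items():
            if pid == pattern and lo <= minutes <= hi:
                return label
        return None

    print(look_up(TABLE_A, 1, 7))    # -> "hanging behavior state"
    print(look_up(TABLE_B, 15, 9))   # -> "train"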

  The purpose of the action and the moving means corresponding to each combination of a pattern and an elapsed time are preset by the management side of the system. As a setting method, it is conceivable that the management side investigates the time required for a pedestrian or passenger to move through each area and associates the elapsed time with the purpose. Possible survey methods include, for example, conducting questionnaires for the pedestrians and train passengers who use the space, observing pedestrians and passengers in the space, or determining the times at which passengers board and alight from records such as tickets collected by the transportation operator.

It can be said that the method for determining the movement behavior information shown in FIG. 10 compares a set of target position information recorded in time series (position information series set (X)) and a set of times at which the target is located (elapsed time set (Y)) against a function, such as the tables for determining movement behavior shown in FIGS. 10A and 10B, to obtain the target's behavior information. This point is schematically shown in FIG.
Further, the cluster reference data shown in FIGS. 10A and 10B is set by the administrator. For this reason, in the present embodiment, the cluster reference data, once set, can also be adjusted arbitrarily by the user.

FIG. 12 is a diagram for illustrating the adjustment of the cluster reference data made by the user. The illustrated example shows a case in which the setting that determines movement through pattern 1 taking up to 10 minutes to be the "hanging behavior state" is adjusted so that the upper limit is extended by a further 3 minutes.
As a result of the adjustment, whereas before the adjustment the target was determined to be in the "hanging behavior state" when it moved through pattern 1 in 5 minutes or more and 10 minutes or less (and otherwise, for example, in the "search behavior state with an object of interest"), after the adjustment the target is determined to be in the "hanging behavior state" when it moves through pattern 1 in 5 minutes or more and 13 minutes or less.

With this configuration, the present embodiment can also reflect in the cluster reference data differences in the walking speed of individual targets and surrounding conditions such as the congestion status of the area. The adjusted cluster reference data can be stored in the personal model DBs 208 and 210 shown in FIG.
Note that the cluster reference data can be adjusted manually by the user using a general operation unit of the information terminal device. When the adjustment is performed automatically, the travel time can be adjusted by extracting information such as age from user information registered in advance; specifically, when the user is an elderly person, the time required for movement is adjusted to be longer, as shown in FIG. Furthermore, it is also conceivable to adjust the travel time automatically by acquiring traffic information and the like relating to the area from the information distribution server 1.
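
The manual and automatic adjustments described above can be pictured, for illustration, as a widening of the upper bound of a time range in the cluster reference data. The three-minute offset follows the FIG. 12 example, while the age threshold and the returned offset in the automatic rule are assumptions.

    # Sketch: adjust the upper bound of a time range in the cluster reference data,
    # either by a user-supplied offset (e.g. +3 minutes as in FIG. 12) or
    # automatically from registered user information (the age rule is assumed).

    def adjust_range(time_range, extra_minutes):
        lo, hi = time_range
        return (lo, hi + extra_minutes)

    def auto_offset(user_profile):
        # Assumed rule: allow elderly users more time to cover the same pattern.
        return 3 if user_profile.get("age", 0) >= 65 else 0

    base = (5, 10)                       # pattern 1: "hanging behavior state"
    print(adjust_range(base, 3))         # manual adjustment -> (5, 13)
    print(adjust_range(base, auto_offset({"age": 70})))  # automatic -> (5, 13)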

・ 2 Definition Method II
FIG. 13 is a diagram for explaining the cluster reference data definition method II. In the definition method II, when generating the cluster reference data, the moving speed of the pedestrian who actually moves in the area and the time taken from departure to arrival of the vehicle are investigated. Then, the distribution of the movement path of the target and the distribution of the time taken are recorded for each purpose of the target action and used as cluster reference data.

The example shown in FIG. 13 is cluster reference data indicating the distribution for the "search behavior state with little interest". In FIG. 13, the x-axis indicates the pattern (1, 2, ...), and the y-axis indicates time. The z-axis indicates the frequency with which movement in patterns 1, 2, ... is determined to correspond to the "search behavior state with little interest".
A curve 131 is the set of frequencies determined to be the "search behavior state with little interest" when the target moves through each of the patterns 1, 2, .... αxi represents the frequency α corresponding to the pattern xi.

A curve 132 is the set of frequencies determined to be the "search behavior state with little interest" corresponding to the time taken for the target's movement. βyi represents the frequency β corresponding to the time yi.
In the second embodiment, the value obtained by adding αxi and βyi is used as an indicator of the possibility that the action of a target that has moved through the pattern xi in the time yi is the "search behavior state with little interest".

  In the second embodiment, cluster reference data of the same kind is stored for the other action purposes, such as the "hanging behavior state" and the "behavior state with an object of interest". For every action purpose, an added value ai is calculated by adding the z-axis value of the curve 131 corresponding to the pattern through which the target moved and the z-axis value of the curve 132 corresponding to the time taken for the movement. Then, by taking the action purpose for which the largest added value ai is obtained as the purpose of the target's action, the purpose of the target's action can be estimated statistically.
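
For illustration, the calculation of the added value ai and the selection of the action purpose with the largest value might look as follows; all frequency values in the sketch are invented, and only the structure (αxi per pattern, βyi per time) follows the description above.

    # Sketch of definition method II: for each action purpose, add the frequency
    # of the observed pattern (curve 131, alpha) to the frequency of the observed
    # travel time (curve 132, beta) and pick the purpose with the largest sum.
    # All frequency values below are invented for illustration.

    CLUSTER_REFERENCE = {
        "search behavior state with little interest": {
            "alpha": {1: 0.30, 2: 0.10},             # frequency per pattern xi
            "beta":  {3: 0.25, 5: 0.20, 10: 0.05},   # frequency per time yi (minutes)
        },
        "hanging behavior state": {
            "alpha": {1: 0.15, 2: 0.20},
            "beta":  {3: 0.05, 5: 0.15, 10: 0.30},
        },
    }

    def estimate_purpose(pattern, minutes):
        added = {
            purpose: data["alpha"].get(pattern, 0.0) + data["beta"].get(minutes, 0.0)
            for purpose, data in CLUSTER_REFERENCE.items()
        }
        return max(added, key=added.get), added

    print(estimate_purpose(1, 3))   # -> the purpose with the largest added value ai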

The added value ai indicates the likelihood of an action purpose such as the "search behavior state with little interest" or the "hanging behavior state". On the assumption that actions with each action purpose can occur equally, the value C obtained by adding all ai for each action purpose is set to a constant value, as shown in FIG. 13.
Such a definition method II can reduce the amount of data to be stored compared with definition method I. It is therefore particularly advantageous when the present invention is applied to a small information terminal device that the user can easily carry.

  Further, the second embodiment is not limited to setting the cluster reference data by investigating the moving speed of pedestrians and the time taken by vehicles from departure to arrival. The distributions of FIG. 13 may instead be set using the data in the tables of (a) and (b) as teacher data. For example, for the cluster reference data of the "search behavior state with little interest", this can be realized by setting the frequencies corresponding to xi and yi so that, when xi is pattern 1 and yi is 3 minutes or 5 minutes, a larger added value is obtained than with the cluster reference data of the other action purposes.

・ 3 Definition Method III
Definition method III deals with the case in which the same added value ai is obtained from the cluster reference data of two or more action purposes in definition method II. For example, when the movement pattern of a target and the time it takes are compared against the cluster reference data, it is conceivable that the same added value ai is obtained for the two action purposes "search behavior state with little interest" and "hanging behavior state". In the second embodiment, in such a case, one of the action purposes is selected on the basis of a difference in the target's movement time.

FIG. 14 is a diagram for explaining definition method III. FIG. 14A is a diagram showing the order of the areas through which the target has moved. In the second embodiment as well, as shown in FIG. 14, the movement pattern is determined using the IDs of the moved areas as the movement history information. Then, the purpose of the target's action is determined by comparing the pattern and the time taken to move through the pattern against the cluster reference data shown in FIG.
As a result of the comparison with the cluster reference data, when, for example, the same added value ai is obtained for the "search behavior state with little interest" and the "behavior state with an object of interest", in the second embodiment the purpose of the action is determined by the graph defined by definition method III. A graph defined by definition method III is shown in FIG.

  Specifically, for the time taken for the target's movement, a time range for the "search behavior state with little interest" (indicated as "not interested" in the figure) and a time range for the "behavior state with an object of interest" (indicated as "interested" in the figure) are set in advance for each pattern. Then, if the purpose of the target's action cannot be determined by definition method II, the purpose of the action is determined according to whether the travel time falls within the time range of the "search behavior state with little interest" or that of the "behavior state with an object of interest".

  Furthermore, in the second embodiment, not only are the time ranges set in advance, but a straight line y is also set so that personal characteristics such as the target's moving speed are reflected in the determination of the action purpose. The straight line y is represented by the equation y = αt + ε, where α is the slope, t is the movement time, and ε is an error term. In addition, a value y1 of y corresponding to the boundary between the time ranges of the "search behavior state with little interest" and the "behavior state with an object of interest" is set in advance.

  In the second embodiment, the travel time is not directly compared with the graph of FIG. 14C, but is substituted for t in the expression representing y. When the value of y obtained at time t is equal to or less than y1, it is determined that the target behavior is “a search behavior state in which there is not much interest.” On the other hand, when the value of y obtained at time t is equal to or greater than y1, it is determined that the target behavior is the “behavioral state where there is a target of interest”.

In the definition method III, for example, the value of α can be increased for an object with a fast walking speed, and the value of α can be decreased for an object with a slow walking speed. Such a definition method III can adjust the cluster reference data so as to be more suitable for the characteristics of each object.
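
A minimal sketch of the comparison used in definition method III is given below; the numeric values of α, ε, and y1 are assumptions, and the per-target adjustment of α simply follows the walking-speed remark above.

    # Sketch of definition method III: evaluate y = alpha * t + epsilon for the
    # travel time t and compare it with the preset boundary value y1.
    # The numeric values of alpha, epsilon and y1 are assumptions for illustration.

    def classify_by_line(t_minutes, alpha=1.0, epsilon=0.5, y1=10.0):
        y = alpha * t_minutes + epsilon
        if y <= y1:
            return "search behavior state with little interest"
        return "behavior state with an object of interest"

    # Per-target adjustment: a larger alpha for a fast walker, a smaller one for a slow walker.
    print(classify_by_line(8.0, alpha=1.2))   # fast walker
    print(classify_by_line(8.0, alpha=0.8))   # slow walker
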
FIGS. 15A to 15D are diagrams showing the data structures of the information defined by the above-described definition method I, definition method II, and definition method III. (A) is the table of cluster reference data defined by definition method I. In the second embodiment, with such cluster reference data, an action purpose such as the "search behavior state with little interest" is determined by comparing the pattern through which the target has moved and the time taken for the movement against the table.

  FIG. 15B is a diagram for illustrating cluster reference data defined by definition method II. The cluster reference data defined by definition method II is saved by recording an expression (modeling formula) for obtaining the added value, together with values of αxi and βyi obtained by prior investigation or estimated from the tables of definition method I. FIG. 15C is a diagram for illustrating cluster reference data defined by definition method III. As shown in the figure, the data of definition method III can be defined by recording the slope α and the error ε of the straight line y used, for example, when determining whether the purpose of the target's behavior is the "search behavior state with little interest" or the "behavior state with an object of interest".
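
One possible in-memory representation of the three kinds of cluster reference data described for FIGS. 15A to 15C is sketched below; the field names are assumptions, and the sketch is not the stored format of the present embodiment.

    # Sketch: one possible representation of the three kinds of cluster reference
    # data described for FIGS. 15A to 15C (field names are assumed).

    from dataclasses import dataclass, field
    from typing import Dict, Tuple

    @dataclass
    class MethodIData:                       # FIG. 15A: lookup table
        table: Dict[Tuple[int, Tuple[int, int]], str] = field(default_factory=dict)

    @dataclass
    class MethodIIData:                      # FIG. 15B: modeling formula parameters
        alpha: Dict[int, float] = field(default_factory=dict)   # alpha_xi per pattern
        beta: Dict[int, float] = field(default_factory=dict)    # beta_yi per time
        constant_c: float = 1.0                                  # normalization constant C

    @dataclass
    class MethodIIIData:                     # FIG. 15C: straight-line parameters
        slope_alpha: float = 1.0
        error_epsilon: float = 0.0

    record = MethodIIIData(slope_alpha=1.2, error_epsilon=0.5)
    print(record)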

The cluster reference data defined by the above definition methods I, II, and III is stored in the cluster reference data DB 203. Furthermore, in the second embodiment, together with the above-mentioned cluster reference data, data for adjusting the cluster reference data according to the characteristics of the target and the time zone of movement can be stored in advance as personal cluster reference data.
FIG. 15D illustrates the personal cluster reference data. As shown in the figure, the personal cluster reference data includes data for adjusting the moving speed according to the time zone, the place, and the situation for each individual, such as individual A and individual B. When such cluster reference data is stored, in the second embodiment the target's moving speed is adjusted in accordance with the individual target, the time zone and area of the movement, and the situation. The purpose of the target's action is then determined using the adjusted value as the movement time of definition method I.

Further, for definition method II, values of αxi, βyi, and the constant C corresponding to individual targets such as individual A and individual B can be used as personal cluster reference data. For definition method III, the slope α and the error ε of the straight line y corresponding to each target, such as individual A and individual B, can be used as personal cluster reference data.
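
For illustration, the application of such personal cluster reference data before the definition-method-I determination might look as follows; the speed factors per individual, time zone, and place are assumed values.

    # Sketch: apply personal cluster reference data before the definition-method-I
    # lookup. The speed factors per individual, time zone and place are assumptions.

    PERSONAL_SPEED_FACTOR = {
        # (individual, time zone, place) -> multiplier applied to the movement time
        ("individual A", "morning", "station"): 0.9,   # moves faster in the morning
        ("individual B", "evening", "mall"): 1.2,      # moves slower in the evening
    }

    def personalized_minutes(raw_minutes, individual, time_zone, place):
        factor = PERSONAL_SPEED_FACTOR.get((individual, time_zone, place), 1.0)
        return raw_minutes * factor

    # The adjusted value is then used as the movement time of definition method I.
    print(personalized_minutes(10.0, "individual B", "evening", "mall"))  # -> 12.0
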
The purpose of the target's action determined by the second embodiment described above can also be used to select the information to be provided, as in the first embodiment. In other words, when the target is in the "search behavior state with little interest", it can be determined that it is not appropriate to provide area information to the target. Further, when the target is in the "behavior state with an object of interest", it may be determined that information about stores in the area should be provided to the target.
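
A minimal sketch of how the information distribution server 1 might map the estimated purpose to the information to be provided follows; the mapping itself is an assumption for illustration.

    # Sketch: select the information to provide according to the estimated purpose
    # of the target's action (the mapping itself is an assumption for illustration).

    PROVISION_POLICY = {
        "search behavior state with little interest": None,      # do not push area info
        "hanging behavior state": "area guide",
        "behavior state with an object of interest": "store information for the area",
    }

    def select_information(estimated_purpose):
        return PROVISION_POLICY.get(estimated_purpose)

    print(select_information("behavior state with an object of interest"))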

  By taking the purpose of the target's action into account in the information providing service, the present invention can appropriately select the information the target needs and respond flexibly to user needs.

Brief description of the drawings
FIG. 1 is a diagram for explaining the outline of the information service provision system of one embodiment of the present invention.
FIG. 2 is a diagram for explaining a series of processes performed by each component of the target action estimation part shown in FIG.
FIG. 3 is a diagram showing the cluster reference data that the moving means extraction part shown in FIG. 1 uses in order to determine a moving means.
FIG. 4 is a diagram illustrating movement information clustered using the cluster reference data shown in FIG. 3 as teacher data.
FIG. 5 is a diagram showing the cluster reference data used by the action information construction part shown in FIG. 1 to determine the target's action.
FIG. 6 is a diagram illustrating data clustered according to the cluster reference data of FIG.
FIG. 7 is a diagram illustrating cluster reference data different from the cluster reference data shown in FIG. 5 and data clustered by that cluster reference data.
FIG. 8 is a flowchart for explaining a program for causing a computer to execute the target behavior estimation method of one embodiment of the present invention.
FIG. 9 is a diagram for explaining the generation of the cluster reference data of Embodiment 2 of the present invention.
FIG. 10 is a diagram for explaining definition method I of the cluster reference data of Embodiment 2 of the present invention.
FIG. 11 is a diagram for explaining the table defined by definition method I of the cluster reference data of Embodiment 2 of the present invention.
FIG. 12 is a diagram for explaining the adjustment of the cluster reference data made by the user in Embodiment 2 of the present invention.
FIG. 13 is a diagram for explaining definition method II of the cluster reference data of Embodiment 2 of the present invention.
FIG. 14 is a diagram for explaining definition method III of the cluster reference data of Embodiment 2 of the present invention.
FIG. 15 is a diagram showing the data structures of the information defined by definition method I, definition method II, and definition method III of the cluster reference data of Embodiment 2 of the present invention.

Explanation of symbols

1 Information distribution server
2 Target action estimation part
3 Target information acquisition part
201 Action information construction part
202 Moving means extraction part
203 Cluster reference data DB
204 Movement information collection / storage unit
207 Action estimation unit
208, 210 Personal model DB
209, 211 General model DB
212 Action information determination unit
213 Moving means determination unit
301 Sensor device
303 GPS terminal

Claims (12)

  1. A movement information collection device that collects movement information, which is information including at least one of information on the position, moving speed, time taken for movement, acceleration, moving direction, and time zone of movement of a target;
    A target behavior estimation device that determines whether or not the target is moving based on the information collected by the movement information collection device, determines the moving means of the target when it is determined that the target is moving, and estimates the purpose of the target's behavior based on the movement information and the determined moving means of the target;
    A service providing device that selects information according to the purpose estimated by the target behavior estimation device and provides a service based on the selected information to the target;
    An information service providing system comprising:
  2. Movement determination means for determining whether or not a target is moving based on movement information, which is information including at least one of information on the position, moving speed, time taken for movement, acceleration, moving direction, and time zone of movement of the target;
    Moving means determination means for determining the moving means of the moving target when the movement determination means determines that the target is moving;
    Purpose estimation means for estimating the purpose of the target's behavior based on the movement information and moving means information, which is information relating to the moving means of the target determined by the moving means determination means;
    A target behavior estimation apparatus comprising:
  3.   The target behavior estimation apparatus according to claim 2, further comprising a movement information collection unit that collects the movement information for each specific target and supplies the movement information to the movement determination means.
  4.   The target behavior estimation apparatus according to claim 3, wherein the movement information collection unit collects the position of the target as latitude and longitude by GPS.
  5.   5. The target behavior estimation apparatus according to claim 3, wherein the movement information collection unit adjusts a time interval for acquiring the movement information based on the movement information.
  6.   The target behavior estimation apparatus according to any one of claims 2 to 5, wherein, when the movement determination means determines that the target is moving, the moving means determination means determines the moving means of the target based on at least one of the distribution of the moving speed of the target, the relationship between movement vectors, and the situation in which other targets move from one position to another position within a predetermined time.
  7.   The target behavior estimation apparatus according to claim 2, further comprising at least one of: general data storage means for moving means that collectively stores, for a plurality of targets, the determination results of the moving means obtained by the moving means determination means; and personal data storage means for moving means that stores, for each target, the determination results of the moving means obtained by the moving means determination means.
  8.   The target behavior estimation apparatus according to claim 2, further comprising at least one of: general data storage means for purposes that collectively stores, for a plurality of targets, the estimation results obtained by the purpose estimation means; and personal data storage means for purposes that stores, for each target, the estimation results obtained by the purpose estimation means.
  9.   The target behavior estimation apparatus according to claim 2, wherein the purpose estimation means estimates the purpose of the target's behavior by comparing the movement information and the moving means information against a preset behavior information database.
  10.   The target behavior estimation apparatus according to claim 9, wherein the behavior information database further includes environmental information including at least one of the temperature, humidity, and weather of the environment in which the target exists, and wherein, when environmental information is input, the purpose estimation means estimates the purpose of the target's behavior by referring to the behavior information database using the input environmental information together with the movement information.
  11.   The target behavior estimation apparatus according to any one of claims 2 to 10, wherein the purpose estimation means estimates the purpose of the target's behavior based on a history of the positions through which the target has moved and a movement time of the target set in advance corresponding to the history of the positions.
  12. A movement information input step for inputting movement information, which is information including at least one of the position, movement speed, time taken for movement, acceleration, movement direction, and information concerning the movement time zone;
    A movement determination step for determining whether the target is moving based on the movement information input in the movement information input step;
    When it is determined in the movement determination step that the object is moving, a moving means determination step for determining a moving means of the moving object;
    A purpose estimation step for estimating the purpose of the target action based on the movement information and the movement means information that is information relating to the target movement means determined in the movement means determination step;
    A target behavior estimation method characterized by comprising these steps.
JP2006341630A 2006-12-19 2006-12-19 Information service providing system, target behavior estimation device, target behavior estimation method Expired - Fee Related JP4861154B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006341630A JP4861154B2 (en) 2006-12-19 2006-12-19 Information service providing system, target behavior estimation device, target behavior estimation method

Publications (2)

Publication Number Publication Date
JP2008152655A true JP2008152655A (en) 2008-07-03
JP4861154B2 JP4861154B2 (en) 2012-01-25

Family

ID=39654746

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006341630A Expired - Fee Related JP4861154B2 (en) 2006-12-19 2006-12-19 Information service providing system, target behavior estimation device, target behavior estimation method

Country Status (1)

Country Link
JP (1) JP4861154B2 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10111877A (en) * 1996-10-07 1998-04-28 Casio Comput Co Ltd Action analysis system
JP2001014297A (en) * 1999-06-28 2001-01-19 Sony Corp Method and device for predicting action and providing information
JP2001101563A (en) * 1999-10-01 2001-04-13 Toshi Kotsu Keikaku Kenkyusho:Kk Data processor and recording medium storing data processing program
JP2002373222A (en) * 2001-06-15 2002-12-26 Survey Research Center Co Ltd Investigation system and program

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009190668A (en) * 2008-02-18 2009-08-27 Toyota Motor Corp Transportation information acquisition device, on-vehicle device, and transportation information providing system
JP2010014592A (en) * 2008-07-04 2010-01-21 Softbank Telecom Corp Moving means determining device and moving means determination method
JP2010019811A (en) * 2008-07-14 2010-01-28 Nippon Telegr & Teleph Corp <Ntt> Device and method for determining movement means
JP2010074278A (en) * 2008-09-16 2010-04-02 Toshiba Corp Information processing apparatus, method and program
JP2010112750A (en) * 2008-11-04 2010-05-20 Nippon Telegr & Teleph Corp <Ntt> Device, method and program for deciding means of movement, and recording medium therefor
JP2010176228A (en) * 2009-01-27 2010-08-12 Softbank Telecom Corp Server, method and program for creating user traffic line
JP2011053819A (en) * 2009-08-31 2011-03-17 Inkurimento P Kk Information processor, information processing method, and information processing program
JP2011180936A (en) * 2010-03-03 2011-09-15 Aisin Aw Co Ltd Device, method and program for determining moving method
JP2012059023A (en) * 2010-09-09 2012-03-22 Casio Comput Co Ltd Crime-preventing terminal, portable terminal with crime-preventing function, control method for crime-preventing terminal and program
CN103370249A (en) * 2010-10-05 2013-10-23 谷歌公司 System and method for predicting behaviors of detected objects
US10372129B1 (en) 2010-10-05 2019-08-06 Waymo Llc System and method of providing recommendations to users of vehicles
US8509982B2 (en) 2010-10-05 2013-08-13 Google Inc. Zone driving
WO2012047977A3 (en) * 2010-10-05 2012-06-21 Google Inc. System and method for predicting behaviors of detected objects
US10198619B1 (en) 2010-10-05 2019-02-05 Waymo Llc System and method for evaluating the perception system of an autonomous vehicle
US8634980B1 (en) 2010-10-05 2014-01-21 Google Inc. Driving pattern recognition and safety control
US8660734B2 (en) 2010-10-05 2014-02-25 Google Inc. System and method for predicting behaviors of detected objects
US8688306B1 (en) 2010-10-05 2014-04-01 Google Inc. Systems and methods for vehicles with limited destination ability
US9911030B1 (en) 2010-10-05 2018-03-06 Waymo Llc System and method for evaluating the perception system of an autonomous vehicle
US9679191B1 (en) 2010-10-05 2017-06-13 Waymo Llc System and method for evaluating the perception system of an autonomous vehicle
US9268332B2 (en) 2010-10-05 2016-02-23 Google Inc. Zone driving
US8825264B2 (en) 2010-10-05 2014-09-02 Google Inc. Zone driving
US9120484B1 (en) 2010-10-05 2015-09-01 Google Inc. Modeling behavior based on observations of objects observed in a driving environment
US8874305B2 (en) 2010-10-05 2014-10-28 Google Inc. Diagnosis and repair for autonomous vehicles
US8965621B1 (en) 2010-10-05 2015-02-24 Google Inc. Driving pattern recognition and safety control
US8948955B2 (en) 2010-10-05 2015-02-03 Google Inc. System and method for predicting behaviors of detected objects
US9122948B1 (en) 2010-10-05 2015-09-01 Google Inc. System and method for evaluating the perception system of an autonomous vehicle
US9658620B1 (en) 2010-10-05 2017-05-23 Waymo Llc System and method of providing recommendations to users of vehicles
JP2013088500A (en) * 2011-10-14 2013-05-13 Omron Corp Map creation system, map creation device, and portable terminal
US8954217B1 (en) 2012-04-11 2015-02-10 Google Inc. Determining when to drive autonomously
US8718861B1 (en) 2012-04-11 2014-05-06 Google Inc. Determining when to drive autonomously
JP2014007451A (en) * 2012-06-21 2014-01-16 Sony Corp Control apparatus, control method, program, and recording medium
US10192442B2 (en) 2012-09-27 2019-01-29 Waymo Llc Determining changes in a driving environment based on vehicle behavior
US9633564B2 (en) 2012-09-27 2017-04-25 Google Inc. Determining changes in a driving environment based on vehicle behavior
US8949016B1 (en) 2012-09-28 2015-02-03 Google Inc. Systems and methods for determining whether a driving environment has changed
JP2014119798A (en) * 2012-12-13 2014-06-30 Kddi Corp Apparatus, program and method for estimating mobile target boarded by user carrying portable terminal
CN104101349A (en) * 2013-04-09 2014-10-15 索尼公司 Navigation apparatus and storage medium
JP2014229200A (en) * 2013-05-24 2014-12-08 日本電信電話株式会社 Action purpose estimation device, action purpose estimation method, and action purpose estimation program
JP2015066625A (en) * 2013-09-27 2015-04-13 株式会社国際電気通信基礎技術研究所 Attention object estimation system, robot, and control program
JP5513669B1 (en) * 2013-11-11 2014-06-04 株式会社野村総合研究所 Content transmission / reception system, content reception method, and content transmission method
JP2015114840A (en) * 2013-12-11 2015-06-22 日本信号株式会社 Parking lot management system
WO2015177858A1 (en) * 2014-05-20 2015-11-26 株式会社日立製作所 Trip attribute estimating system, trip attribute estimating method, trip attribute estimating program, and travel behavior survey system
US10165412B2 (en) 2014-05-22 2018-12-25 Sony Corporation Information processing device and information processing method
WO2015178065A1 (en) * 2014-05-22 2015-11-26 ソニー株式会社 Information processing device and information processing method
WO2015194215A1 (en) * 2014-06-20 2015-12-23 ソニー株式会社 Information processing device, information processing method, and program
US9729712B2 (en) 2014-06-30 2017-08-08 Kabushiki Kaisha Toshiba Electronic device and method for filtering notification information
US9836052B1 (en) 2014-08-29 2017-12-05 Waymo Llc Change detection using curve alignment
US9321461B1 (en) 2014-08-29 2016-04-26 Google Inc. Change detection using curve alignment
US9669827B1 (en) 2014-10-02 2017-06-06 Google Inc. Predicting trajectories of objects based on contextual information
US9914452B1 (en) 2014-10-02 2018-03-13 Waymo Llc Predicting trajectories of objects based on contextual information
US10421453B1 (en) 2014-10-02 2019-09-24 Waymo Llc Predicting trajectories of objects based on contextual information
US9248834B1 (en) 2014-10-02 2016-02-02 Google Inc. Predicting trajectories of objects based on contextual information
CN104361205A (en) * 2014-10-21 2015-02-18 上海动盟网络技术有限公司 System and method for processing visitor information
JP2017156792A (en) * 2016-02-29 2017-09-07 株式会社日立製作所 Travel information classification device and travel information classification method

Also Published As

Publication number Publication date
JP4861154B2 (en) 2012-01-25


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20090916

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110801

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110809

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20111005

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20111101

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20111104

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20141111

Year of fee payment: 3

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: R3D04

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees