WO2017056774A1 - Information processing device, information processing method, and computer program - Google Patents


Info

Publication number
WO2017056774A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
unit
moving body
sensor
terminal device
Prior art date
Application number
PCT/JP2016/074157
Other languages
English (en)
Japanese (ja)
Inventor
呂尚 高岡
倉田 雅友
由幸 小林
Original Assignee
Sony Corporation (ソニー株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2017056774A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/005: Traffic control systems for road vehicles, including pedestrian guidance indicator

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a computer program.
  • According to an embodiment of the present disclosure, there is provided an information processing apparatus including: a correction unit that corrects first sensing data, provided by one or more sensors carried or worn by a user, using second sensing data provided by one or more sensors provided in a moving body on which the user is riding; and a processing unit that executes a process for obtaining the position of the user in the moving body using a correction result of the correction unit.
  • According to an embodiment of the present disclosure, there is also provided an information processing method including: correcting first sensing data, provided by one or more sensors carried or worn by the user, using second sensing data provided by one or more sensors provided on the moving body on which the user is riding; and executing a process for determining the position of the user in the moving body using the result of the correction.
  • Further, there is provided a computer program for causing a computer to correct first sensing data, provided by one or more sensors carried or worn by the user, using second sensing data provided by one or more sensors provided on the moving body on which the user is riding, and to execute a process for obtaining the position of the user in the moving body using the result of the correction.
  • As described above, according to the present disclosure, an information processing apparatus, an information processing method, and a computer program can be provided.
  • FIG. 2 is an explanatory diagram illustrating a configuration example of a terminal device 200 according to the first embodiment of the present disclosure.
  • FIG. 3 is an explanatory diagram illustrating a configuration example of a server device 100 according to the first embodiment of the present disclosure.
  • FIG. 4 is an explanatory diagram illustrating a configuration example of a server device 100 according to the first embodiment of the present disclosure.
  • FIG. 5 is an explanatory diagram illustrating examples of the moving body context map 120 and the moving body POI map 145.
  • FIG. 6 is an explanatory diagram illustrating a configuration example of the server device 100 according to the second embodiment of the present disclosure.
  • FIG. 1 is an explanatory diagram illustrating a configuration example of a positioning system according to the first embodiment of the present disclosure.
  • the positioning system includes a server device 100 and a terminal device 200.
  • The terminal device 200 is a device carried or worn by a user riding on a moving body inside which the user can move freely, such as a ship (particularly a large passenger ship) or a railway train.
  • the terminal device 200 can perform wireless communication with the server device 100.
  • The terminal device 200 transmits the sensor data acquired by its sensors to the server device 100, and receives from the server device 100 a positioning result computed using that sensor data.
  • The sensors included in the terminal device 200 include an acceleration sensor, a gyro sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, and the like, and detect acceleration, angular velocity, azimuth, illuminance, temperature, atmospheric pressure, and the like.
  • the various sensors described above can detect various information as information related to the user, for example, information indicating the user's movement or orientation.
  • the sensor may include a sensor that detects user's biological information such as pulse, sweat, brain wave, touch, smell, and taste.
  • The terminal device 200 may include a processing circuit that acquires information indicating the user's emotions by analyzing information detected by these sensors and/or image or sound data acquired by a camera or microphone, which will be described later.
  • The sensor may acquire an image or sound near the user or the device as data using a camera, a microphone, the various sensors described above, or the like.
  • the sensor may include a position detection function for detecting an indoor or outdoor position.
  • the position detection function may include a GNSS (Global Navigation Satellite System) receiver and / or a communication device.
  • the GNSS can include, for example, GPS (Global Positioning System), GLONASS (Global Navigation Satellite System), BDS (BeiDou Navigation Satellite System), QZSS (Quasi-Zenith Satellite Systems), or Galileo.
  • The communication device detects the position using a technology such as Wi-Fi, MIMO (Multi-Input Multi-Output), cellular communication (for example, position detection using a mobile base station or a femtocell), or short-range wireless communication (for example, BLE (Bluetooth Low Energy) or Bluetooth (registered trademark)).
  • When the sensors described above detect a user's position and situation (including biological information), the device including the sensors is, for example, carried or worn by the user. Alternatively, even when a device including the sensors is installed in the user's living environment, the user's position and situation (including biological information) may still be detectable; for example, the user's pulse can be detected by analyzing an image including the user's face acquired by a camera fixed in a room or the like.
  • The server apparatus 100 is provided inside a moving body such as an automobile, a ship, or a railway train, or outside the moving body.
  • the server device 100 is a device that measures the current position of the terminal device 200 located inside the mobile body and distributes the positioning result to the terminal device 200.
  • In the present embodiment, fingerprint-based positioning is performed using sensor data output from multiple sensors in a moving body such as a ship or train.
  • Due to acceleration and deceleration of the moving body itself and changes in its direction of movement, heading, or altitude, the sensors of the terminal device 200 carried in the moving body can output components different from the movement or action of the user. Such components become noise, and unless this noise is removed, the correct position and behavior of the user carrying the terminal device 200 within the moving body cannot be grasped.
  • For example, when a person carries a gyro sensor inside a moving body and the gyro sensor detects rotation, it cannot be distinguished whether the moving body is turning around a curve or the person is turning while walking. Likewise, when a person carries an acceleration sensor inside a moving body and the acceleration sensor detects a change in acceleration, it cannot be distinguished whether the moving body accelerated or decelerated or the person did. Similarly, when a person carries an atmospheric pressure sensor inside a moving body and the sensor detects a change in pressure, it cannot be distinguished whether the altitude of the moving body changed or the person's altitude changed within the moving body.
  • Therefore, the server device 100 corrects the sensor data transmitted from the terminal device 200 using the sensor data measured by the moving body. Using the corrected sensor data, the server device 100 then estimates the behavior of the user carrying the terminal device 200 or measures that user's position.
  • By correcting the sensor data transmitted from the terminal device 200 using the sensor data measured by the moving body, the server device 100 can more accurately estimate the behavior and position of the user carrying the terminal device 200. This correction also allows the server device 100, even inside the moving body, to generate a map (context map) in which the behavior estimation result and the current position estimation result are linked.
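The need for this correction can be pictured with hypothetical numbers: a reading observed by the terminal inside a moving body is the sum of the user's motion component and the moving body's motion component, so two very different situations can produce the same raw reading. The values below are invented for illustration.

```python
# Hypothetical gyro readings (rad/s): the terminal observes the sum of
# the user's motion and the moving body's motion.
user_angular_velocity = 0.0       # user standing still
body_angular_velocity = 0.15      # train turning a curve
terminal_reading = user_angular_velocity + body_angular_velocity

# A different scenario produces exactly the same reading, so the
# terminal alone cannot tell the two situations apart:
user_angular_velocity2 = 0.15     # user turning while walking
body_angular_velocity2 = 0.0      # train going straight
terminal_reading2 = user_angular_velocity2 + body_angular_velocity2

assert terminal_reading == terminal_reading2
```

Only by also observing the moving body's own sensors (the second sensing data) can the two components be separated.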
  • FIG. 2 is an explanatory diagram illustrating a configuration example of the terminal device 200 according to the first embodiment of the present disclosure.
  • a configuration example of the terminal device 200 according to the first embodiment of the present disclosure will be described with reference to FIG.
  • the terminal device 200 includes a sensor unit 210, an input unit 220, a control unit 230, and an output unit 240.
  • the sensor unit 210 is a device that senses the state of the terminal device 200.
  • the sensor unit 210 outputs sensor data to the input unit 220.
  • the sensor unit 210 includes a geomagnetic sensor 211, an acceleration sensor 212, a gyro sensor 213, an atmospheric pressure sensor 214, a communication device 215, a microphone 216, and a camera 217.
  • Although the communication device 215 is originally a communication device, in the present embodiment it is used as a sensor that detects the radio wave reception state.
  • the microphone 216 and the camera 217 are also used as sensors for detecting ambient sounds and environments.
  • The geomagnetic sensor 211 is a sensor that outputs the magnitude and direction of a magnetic field as sensor data.
  • the acceleration sensor 212 is a sensor that outputs acceleration information as sensor data.
  • the gyro sensor 213 is a sensor that outputs angular velocity information as sensor data.
  • the atmospheric pressure sensor 214 is a sensor that outputs atmospheric pressure information as sensor data.
  • the sensors constituting the sensor unit 210 are not limited to those shown in FIG.
  • the input unit 220 receives sensor data output from the sensor unit 210 and data transmitted from another device such as the server device 100.
  • the input unit 220 passes the sensor data output from the sensor unit 210 and data transmitted from another device such as the server device 100 to the control unit 230.
  • the control unit 230 executes various processes for controlling the operation of the terminal device 200.
  • the control unit 230 includes a processor or processing circuit such as a CPU (Central Processing Unit), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array).
  • the control unit 230 may include a memory or a storage device that temporarily or permanently stores a program executed in the processor or the processing circuit and data read / written in the processing.
  • the control unit 230 executes processing for causing the output unit 240 to output the current location of the terminal device 200, or performs processing for outputting the sensor data output by the sensor unit 210 to the server device 100 through the output unit 240.
  • The output unit 240 outputs the information provided from the control unit 230 to a user (who may be the same user as the user of the terminal device 200 or a different user), an external device, or another service.
  • the output unit 240 may include an output device, a control device, or software that provides information to an external service.
  • The output device outputs the information provided from the control unit 230 in a form perceivable by the senses of the user (who may be the same user as the user of the terminal device 200 or a different user), such as sight, hearing, touch, smell, or taste.
  • For example, the output device may be a display and output the information as an image.
  • Here, the display is not limited to a reflective or self-luminous display such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, but also includes a combination of a light source and a light guide member that guides image light to the user's eye, as used in wearable devices.
  • the output device may include a speaker and output information by voice.
  • the output device may include a projector, a vibrator, and the like.
  • the control device controls the device based on the information provided from the control unit 230.
  • the controlled device may be included in a device that realizes the output unit 240 or may be an external device. More specifically, for example, the control device includes a processor or a processing circuit that generates a control command.
  • the output unit 240 may further include a communication device that transmits a control command to the external device.
  • For example, the control device may control a printer that outputs the information provided from the control unit 230 as printed matter.
  • the control device may include a driver that controls writing of information provided from the control unit 230 to the storage device or the removable recording medium.
  • the control device may control a device other than the device that outputs or records the information provided from the control unit 230.
  • For example, the control device may control a lighting device to turn illumination on, a television to turn off its picture, an audio device to adjust its volume, or a robot to control its movement.
  • the software that provides information to the external service provides the information provided from the control unit 230 to the external service by using an API of the external service, for example.
  • the software may provide information to a server of an external service, or may provide information to application software of a service executed on the client device.
  • the provided information does not necessarily have to be immediately reflected in the external service, and may be provided as a candidate for a user to post or transmit to the external service, for example.
  • the software may provide text used as a candidate for a search keyword or URL (Uniform Resource Locator) input by the user in browser software executed on the client device.
  • the software may post text, images, videos, sounds, and the like on an external service such as social media on behalf of the user.
  • the configuration example of the terminal device 200 according to the first embodiment of the present disclosure has been described above with reference to FIG. Subsequently, a configuration example of the server device 100 according to the first embodiment of the present disclosure will be described.
  • FIG. 3 is an explanatory diagram illustrating a configuration example of the server apparatus 100 according to the first embodiment of the present disclosure.
  • FIG. 3 shows a configuration example of the server device 100 when generating a context map.
  • the server device 100 includes a processing unit 110.
  • When generating a context map, the processing unit 110 includes a sensor correction unit 111 and a context map generation unit 112.
  • the sensor correction unit 111 corrects the sensor data sent from the terminal device 200 using the sensor data sent from the moving body posture movement measurement unit 130.
  • the sensor data sent from the terminal device 200 can include acceleration data, angular velocity data, geomagnetic data, atmospheric pressure data, and the like.
  • the sensor correction unit 111 outputs the corrected sensor data to the context map generation unit 112.
  • The moving body posture movement measuring unit 130 measures the posture and movement state of the moving body itself, and is composed of, for example, various sensors such as an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, and an atmospheric pressure sensor.
  • the sensor data sent from the moving body posture movement measuring unit 130 indicates the posture and the amount of movement of the moving body.
  • The sensor data X sent from the terminal device 200 can include the posture/motion change amount Y of the terminal device 200 itself and the sensor data Z sent from the moving body posture motion measuring unit 130. Therefore, as one way of correcting the sensor data sent from the terminal device 200, the sensor correction unit 111 obtains the difference between the sensor data X sent from the terminal device 200 and the sensor data Z sent from the moving body posture motion measurement unit 130, thereby obtaining the change amount Y of the posture and movement of the terminal device 200 itself.
  • When obtaining the change amount Y of the posture and movement of the terminal device 200 itself, the sensor correction unit 111 obtains, for each type of sensor data, the difference between the sensor data X sent from the terminal device 200 and the sensor data Z sent from the moving body posture movement measurement unit 130.
  • For example, to obtain the change in the angular velocity of the terminal device 200 itself, the sensor correction unit 111 obtains the difference between the sensor data of the gyro sensor 213 sent from the terminal device 200 and the sensor data of the gyro sensor sent from the moving body posture motion measurement unit 130.
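The per-sensor difference described above can be sketched as follows; the dictionary field names and the numeric values are illustrative assumptions, not taken from the disclosure.

```python
# Sketch of the correction performed by the sensor correction unit 111:
# for each sensor type, the terminal's data X minus the moving body's
# data Z yields the change Y attributable to the terminal itself (and
# hence to the user carrying it).

def correct_sensor_data(terminal_data: dict, body_data: dict) -> dict:
    """Return per-sensor differences Y = X - Z for each sensor key."""
    return {key: x - body_data.get(key, 0.0)
            for key, x in terminal_data.items()}

X = {"angular_velocity": 0.18, "acceleration": 0.9, "pressure_hpa": 1012.8}
Z = {"angular_velocity": 0.15, "acceleration": 0.7, "pressure_hpa": 1012.5}
Y = correct_sensor_data(X, Z)
# Y["angular_velocity"] is about 0.03: rotation attributable to the user alone.
```

Real inertial data would also require attitude alignment between the two sensor frames; the element-wise subtraction here only illustrates the X, Y, Z relationship stated in the text.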
  • the moving body posture movement measuring unit 130 is configured by a sensor that measures a posture change and a movement change of a moving body on which a user who carries or wears the terminal device 200 is riding. If the server apparatus 100 is provided in the mobile body, the mobile body posture movement measurement unit 130 may be provided in the server apparatus 100 or may be connected to the server apparatus 100 by wire or wirelessly. . If the server device 100 is not provided in the moving body, the moving body posture movement measuring unit 130 is provided in the moving body and transmits sensor data to the server device 100 wirelessly.
  • the context map generation unit 112 generates or updates the moving body context map 120 using the sensor data corrected by the sensor correction unit 111.
  • “generation or update” is sometimes simply described as “generation”.
  • The context map generation unit 112 uses the positioning data from the in-mobile relative positioning unit 140 when generating the moving body context map 120.
  • the in-mobile relative positioning unit 140 is configured by a device that measures a relative position of the terminal device 200 in the mobile body.
  • The in-mobile relative positioning unit 140, for example, transmits a predetermined radio wave and measures the relative position of the terminal device 200 with respect to the in-mobile relative positioning unit 140 from the radio wave intensity at which the terminal device 200 receives that radio wave.
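One common way to turn received radio wave intensity into a distance, and hence a relative position, is the log-distance path-loss model. The disclosure does not specify the model, so the following is only an assumed sketch, and both constants are illustrative values.

```python
# Hedged sketch: estimate the distance between the in-mobile relative
# positioning unit and the terminal from received signal strength using
# the log-distance path-loss model. Both constants are assumptions.

TX_POWER_DBM_AT_1M = -59.0   # assumed received power at 1 m reference distance
PATH_LOSS_EXPONENT = 2.0     # assumed free-space-like propagation

def distance_from_rssi(rssi_dbm: float) -> float:
    """Estimate distance in metres from a received signal strength in dBm."""
    return 10 ** ((TX_POWER_DBM_AT_1M - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

print(distance_from_rssi(-59.0))  # at the reference power: 1.0 m
print(distance_from_rssi(-79.0))  # 20 dB weaker: 10.0 m
```

With distances to several fixed transmitters, a relative position inside the moving body could then be obtained by trilateration.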
  • With the configuration shown in FIG. 3, the server device 100 can correct the sensor data sent from the terminal device 200 using the sensor data sent from the moving body posture motion measuring unit 130. By correcting the sensor data in this way, the server device 100 can generate a highly accurate moving body context map 120.
  • FIG. 4 is an explanatory diagram illustrating a configuration example of the server apparatus 100 according to the first embodiment of the present disclosure.
  • FIG. 4 shows a configuration example of the server device 100 when the position of the terminal device 200 in the moving body is measured using the context map.
  • As shown in FIG. 4, the server device 100 includes a sensor correction unit 111, a context estimation unit 113, a moving body positioning unit 114, a POI extraction unit 115, and a positioning result distribution unit 116.
  • the sensor correction unit 111 corrects the sensor data sent from the terminal device 200 using the sensor data sent from the mobile body posture motion measurement unit 130, as in the sensor correction unit 111 shown in FIG.
  • the sensor data sent from the terminal device 200 or the moving body posture motion measuring unit 130 can include acceleration data, angular velocity data, geomagnetic data, atmospheric pressure data, and the like.
  • the sensor correction unit 111 outputs the corrected sensor data to the context estimation unit 113.
  • the context estimation unit 113 uses the sensor data corrected by the sensor correction unit 111 to estimate the context of the user who carries or wears the terminal device 200.
  • When the context estimation unit 113 has estimated the user's context, it outputs the estimation result to the moving body positioning unit 114.
  • The context estimation unit 113 can recognize, by action recognition, action labels such as stay, walk, run, sit, meal, sleep, jump, stairs, elevator, escalator, bicycle, bus, train, car, ship, or airplane. Since the action recognition method is described in many documents such as Japanese Patent Application Laid-Open No. 2012-8771, a detailed description thereof is omitted here.
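A real action recognizer, such as the one referenced above, learns over many sensor features. Purely to illustrate the idea of mapping sensor data windows to action labels, here is a deliberately simplified, hypothetical heuristic over accelerometer magnitudes; the thresholds are invented.

```python
import statistics

# Toy stand-in for action recognition: label a window of acceleration
# magnitudes (m/s^2) by its variance. Thresholds are illustrative only;
# an actual recognizer would use a trained model over richer features.

def recognize_action(accel_magnitudes: list[float]) -> str:
    """Assign one of three action labels to a sensor-data window."""
    variance = statistics.pvariance(accel_magnitudes)
    if variance < 0.05:
        return "stay"       # near-constant gravity reading
    elif variance < 2.0:
        return "walk"       # moderate periodic motion
    return "run"            # large motion energy

print(recognize_action([9.8, 9.81, 9.79, 9.8]))   # prints: stay
print(recognize_action([9.8, 12.0, 7.5, 11.0]))   # prints: run
```

The output of such a recognizer corresponds to the action labels the context estimation unit 113 attaches to each location.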
  • the moving body positioning unit 114 measures the position of the terminal device 200 in the moving body using the user context estimated by the context estimation unit 113 and the moving body context map 120.
  • When the moving body positioning unit 114 has measured the position of the terminal device 200 in the moving body, it outputs the positioning result to the POI extraction unit 115.
  • the POI extracting unit 115 extracts a POI (Point Of Interest) in the moving body as facility information in the moving body from the moving body POI map 145 using the positioning result of the terminal device 200 in the moving body positioning unit 114.
  • The POI extraction unit 115 outputs the positioning result of the terminal device 200 in the moving body and the extracted POI to the positioning result distribution unit 116.
  • the positioning result distribution unit 116 distributes the positioning result output from the POI extraction unit 115 in the moving body of the terminal device 200 and the extracted POI to the terminal device 200.
  • By outputting the positioning result in the moving body and the extracted POI distributed from the positioning result distribution unit 116, the terminal device 200 can present the current position in the moving body to the user of the terminal device 200.
  • FIG. 5 is an explanatory diagram showing examples of the moving body context map 120 and the moving body POI map 145.
  • For example, by having the configuration shown in FIG. 3, the server apparatus 100 can generate the moving body context map 120 shown in FIG. 5.
  • Each ellipse in the moving body context map 120 shown in FIG. 5 indicates a place where the terminal device 200 is likely to be located.
  • Each ellipse is associated with the action of the user who is carrying or wearing the terminal device 200.
  • the user's action to be linked is not limited to one. For example, information such as the probability of sitting, the probability of standing, and the probability of walking may be linked to each ellipse.
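The ellipse-with-probabilities structure described above might be represented as follows; the coordinates, radii, action names, and probabilities are invented for illustration and are not from the disclosure.

```python
# Hypothetical representation of the moving body context map 120: each
# entry is one ellipse (a likely terminal location) linked with the
# probabilities of several user actions, as described in the text.

context_map = [
    {"center": (2.0, 5.0), "radii": (1.0, 0.5),
     "actions": {"sit": 0.7, "stand": 0.2, "walk": 0.1}},
    {"center": (8.0, 5.0), "radii": (0.8, 0.8),
     "actions": {"walk": 0.6, "stand": 0.4}},
]

def most_likely_action(entry: dict) -> str:
    """Return the action with the highest linked probability."""
    return max(entry["actions"], key=entry["actions"].get)

print(most_likely_action(context_map[0]))  # prints: sit
```

Linking multiple action probabilities to each ellipse, rather than a single label, matches the remark above that the linked action is not limited to one.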
  • By referring to the moving body context map 120, the server device 100 can grasp the behavior, at each location, of the user carrying or wearing the terminal device 200. However, the server device 100 cannot grasp what each location is from the moving body context map 120 alone.
  • The server device 100 can grasp what each location is by comparing the context map with the moving body POI map 145 shown in FIG. 5.
  • In the example of FIG. 5, the moving body POI map 145 represents the seating chart of a Shinkansen car. From the positioning result of the moving body positioning unit 114, the POI extraction unit 115 can, for example, extract the information that the terminal device 200 is at seat 1E.
  • <Second Embodiment> [2.1. Server device configuration example 1] Next, a second embodiment of the present disclosure will be described.
  • In the first embodiment, the server device 100 corrects the sensor data transmitted from the terminal device 200 using the sensor data measured by the moving body. The second embodiment describes a server device 100 that estimates, from the sensor data transmitted from the terminal device 200, in what kind of moving state the user carrying or wearing the terminal device 200 is. According to the estimated moving state, the server device 100 can select, from the sensor data transmitted from the terminal device 200, the sensor data used for generating context maps and for positioning within the moving body.
  • FIG. 6 is an explanatory diagram illustrating a configuration example of the server apparatus 100 according to the second embodiment of the present disclosure.
  • FIG. 6 shows a configuration example of the server apparatus 100 when generating a context map.
  • As shown in FIG. 6, the server device 100 includes a movement mode estimation unit 117, a use sensor selection unit 118, and a context map generation unit 112.
  • the movement mode estimation unit 117 performs behavior recognition using the sensor data transmitted from the terminal device 200, and estimates the movement state of the user who carries or wears the terminal device 200.
  • the sensor data sent from the terminal device 200 can include acceleration data, angular velocity data, geomagnetic data, atmospheric pressure data, and the like.
  • the movement state of the user is also referred to as a movement mode.
  • When the movement mode estimation unit 117 has estimated the user's movement mode, it outputs the estimation result to the use sensor selection unit 118.
  • By action recognition using the sensor data, the movement mode estimation unit 117 can recognize, for example, stay, walk, run, sit, meal, sleep, jump, stairs, elevator, escalator, bicycle, bus, train, car, ship, or airplane.
  • the action recognition method using the sensor data is described in many documents such as Japanese Patent Application Laid-Open No. 2012-8771, and detailed description thereof is omitted.
  • Based on the movement mode estimated by the movement mode estimation unit 117 for the user carrying or wearing the terminal device 200, the use sensor selection unit 118 selects, from the sensor data sent from the terminal device 200, the sensor data used to generate the moving body context map 120.
  • Depending on the estimated movement mode, the use sensor selection unit 118 may determine that certain sensor data is not to be used. For example, when the movement mode estimation unit 117 estimates a movement mode in which the terminal device 200 is on a train, the use sensor selection unit 118 may decide not to use the sensor data output by the geomagnetic sensor 211. Similarly, when the movement mode estimation unit 117 estimates a movement mode in which the terminal device 200 is on a ship rolling in high waves, the use sensor selection unit 118 may decide not to use the sensor data output by the acceleration sensor 212.
  • Instead of making a binary use/non-use determination, the use sensor selection unit 118 may weight the sensor data. For example, when the movement mode estimation unit 117 estimates a movement mode in which the terminal device 200 is on a ship sailing in a sea with high waves, the use sensor selection unit 118 may decide to lighten the weight of the sensor data output from the acceleration sensor 212.
  • The use sensor selection unit 118 may use whether the estimated movement mode causes a magnetic disturbance as a criterion for deciding whether to weight. For example, when the estimated movement mode causes magnetic disturbance, such as being on a train, the use sensor selection unit 118 may make a use/non-use determination for the sensor data output by the geomagnetic sensor 211 rather than weighting it. Conversely, even on the same railway, if the vehicle is not affected by a motor, such as a trolley train, the use sensor selection unit 118 may weight the sensor data output from the geomagnetic sensor 211.
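The selection and weighting behavior described above could be captured by a per-movement-mode weight table; the mode names and weight values below are assumptions consistent with the examples in the text (geomagnetic disturbance on motorized trains, wave motion on ships), not values from the disclosure.

```python
# Hypothetical per-mode sensor weights in [0, 1] for the use sensor
# selection unit 118; a weight of 0 means "do not use" this sensor.

SENSOR_WEIGHTS = {
    "train":   {"geomagnetic": 0.0, "acceleration": 1.0, "gyro": 1.0},
    "trolley": {"geomagnetic": 1.0, "acceleration": 1.0, "gyro": 1.0},
    "ship":    {"geomagnetic": 1.0, "acceleration": 0.3, "gyro": 1.0},
}

def select_sensor_data(mode: str, sensor_data: dict) -> dict:
    """Apply the mode's weights, dropping sensors whose weight is zero."""
    weights = SENSOR_WEIGHTS.get(mode, {})
    return {name: value * weights.get(name, 1.0)
            for name, value in sensor_data.items()
            if weights.get(name, 1.0) > 0.0}

data = {"geomagnetic": 35.0, "acceleration": 0.9, "gyro": 0.15}
print(select_sensor_data("train", data))  # geomagnetic data is dropped
```

A binary use/non-use decision is the special case of weights 0 and 1; intermediate weights implement the "lighten the weight" behavior for the ship example.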
  • the context map generation unit 112 generates or updates the moving body context map 120 using the sensor data selected or weighted by the use sensor selection unit 118. As in the first embodiment, the context map generation unit 112 uses the positioning data in the moving body relative positioning unit 140 when generating or updating the moving body context map 120.
  • the in-mobile relative positioning unit 140 is configured by a device that measures a relative position of the terminal device 200 in the mobile body.
  • The in-mobile relative positioning unit 140, for example, transmits a predetermined radio wave and measures the relative position of the terminal device 200 with respect to the in-mobile relative positioning unit 140 from the radio wave intensity at which the terminal device 200 receives that radio wave.
  • the weight can be used for determination at the time of positioning with reference to a moving body context map 120 described later.
  • the server apparatus 100 can select the sensor data sent from the terminal apparatus 200 based on the movement mode by having the configuration as shown in FIG.
  • By selecting the sensor data sent from the terminal device 200 based on the movement mode, the server device 100 can use only highly accurate sensor data and can therefore generate a highly accurate moving body context map 120.
  • FIG. 7 is an explanatory diagram illustrating a configuration example of the server apparatus 100 according to the second embodiment of the present disclosure.
  • FIG. 7 shows a configuration example of the server device 100 when the position of the terminal device 200 in the moving body is measured using the context map.
  • the server device 100 includes a movement mode estimation unit 117, a use sensor selection unit 118, and a moving body positioning unit 114.
  • the functions of the movement mode estimation unit 117 and the used sensor selection unit 118 are the same as those described with reference to FIG.
  • the use sensor selection unit 118 outputs the sensor data it has selected or weighted to the in-moving-body positioning unit 114.
  • the in-moving-body positioning unit 114 measures the position of the terminal device 200 in the moving body using the sensor data selected or weighted by the use sensor selection unit 118 together with the moving body context map 120.
  • the in-moving-body positioning unit 114 then outputs the positioning result to the terminal device 200.
  • the in-moving-body positioning unit 114 may extract POIs (points of interest) in the moving body as facility information in the moving body. As described in the first embodiment, the in-moving-body positioning unit 114 may extract these POIs using its own positioning result for the terminal device 200.
  • the server apparatus 100 can select the sensor data sent from the terminal apparatus 200 based on the movement mode by having the configuration shown in FIG. 7.
  • the server device 100 can measure the position of the terminal device 200 in the moving body based on the moving body context map 120 by selecting the sensor data sent from the terminal device 200 based on the moving mode.
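One plausible way to realize positioning against a moving body context map is fingerprint matching: each candidate position in the moving body stores an expected sensor vector, and the position whose stored vector is closest to the current observation (under a weighted distance honoring the per-sensor weights chosen by the use sensor selection unit) is returned. The sketch below is an assumption introduced for illustration, not the patent's stated method.

```python
# Weighted nearest-fingerprint positioning against a context map,
# modelled here as {position_label: expected_sensor_vector}.
def locate(observation, weights, fingerprint_map):
    """Return the map position whose fingerprint best matches `observation`."""
    def wdist(fp):
        # Weighted squared Euclidean distance; a zero weight ignores a sensor.
        return sum(w * (o - f) ** 2 for w, o, f in zip(weights, observation, fp))
    return min(fingerprint_map, key=lambda pos: wdist(fingerprint_map[pos]))
```

Setting a sensor's weight to zero here reproduces the "non-use" decision made by the use sensor selection unit 118.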
  • Server device configuration example 2: Configuration example 1 above showed a server device 100 that estimates the movement mode of the user who carries or wears the terminal device 200 from the sensor data and selects or weights the sensor data to be used based on the result of that estimation. Configuration example 2, which follows, shows a server device 100 that not only selects or weights the sensor data to be used, but also selects a context map based on the estimation result of the movement mode.
  • FIG. 8 is an explanatory diagram illustrating a configuration example of the server apparatus 100 according to the second embodiment of the present disclosure.
  • FIG. 8 shows a configuration example of the server device 100 when generating a context map.
  • the server device 100 includes a movement mode estimation unit 117, a use sensor selection unit 118, a context map selection unit 119, and a context map generation unit 112.
  • the functions of the movement mode estimation unit 117 and the used sensor selection unit 118 are the same as those described with reference to FIG.
  • the context map selection unit 119 selects a context map to be generated based on the estimation result of the movement mode of the user who carries or wears the terminal device 200 by the movement mode estimation unit 117.
  • when the context map selection unit 119 selects a context map to be generated, it outputs the selection result to the context map generation unit 112.
  • FIG. 8 shows mobile context maps 120a and 120b and a global context map 121 as context maps to be selected.
  • the moving body context map 120a is, for example, a context map selected when the user who carries or wears the terminal device 200 is in a train, and the moving body context map 120b is, for example, a context map selected when the user who carries or wears the terminal device 200 is in a ship.
  • the global context map 121 is a context map that is selected when a user who carries or wears the terminal device 200 is not in a moving body.
  • the context map selection unit 119 may use position information measured by the absolute positioning unit 150 or information provided from the moving body identification unit 160.
  • the absolute positioning unit 150 measures the absolute position of the moving body and can be configured by a GNSS receiver or the like.
  • the moving body identification unit 160 provides information for identifying which part of the moving body (floor, car, or the like) the user is in; this information can be provided by, for example, sound waves or electromagnetic waves.
  • the context map selection unit 119 can determine whether the user is in a moving body inside which the user can move around (such as a train or a passenger ship).
  • when the context map selection unit 119 determines from the estimation result of the user's movement mode that the user is in a moving body, it can select non-use of the global context map 121 and use of the moving body context maps 120a and 120b. Conversely, when the context map selection unit 119 determines from the estimation result of the user's movement mode that the user is not in a moving body, it can select use of the global context map 121 and non-use of the moving body context maps 120a and 120b.
  • when the context map selection unit 119 can identify from the position information measured by the absolute positioning unit 150 that the user is on a railroad track, and determines from the estimation result of the user's movement mode that the user is on a train, it can select the moving body context map 120a.
  • when the context map selection unit 119 can identify from the position information measured by the absolute positioning unit 150 that the user is at sea, and can identify from the information provided by the moving body identification unit 160 that the user is inside a passenger ship, it can select the moving body context map 120b.
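The selection rules above can be condensed into a small decision function. This is a hedged sketch: the predicate names (`is_on_track`, `is_at_sea`, `identified_vehicle`) and the string labels for the maps are assumptions introduced purely for illustration.

```python
# Toy context-map selection combining the movement-mode estimate with
# absolute position and moving-body identification information.
def select_context_map(mode, is_on_track=False, is_at_sea=False,
                       identified_vehicle=None):
    if mode == "not_in_moving_body":
        return "global_context_map_121"          # user is not in a moving body
    if mode == "train" and is_on_track:
        return "moving_body_context_map_120a"    # train map
    if is_at_sea or identified_vehicle == "passenger_ship":
        return "moving_body_context_map_120b"    # ship map
    return "global_context_map_121"              # fallback
```

The point of the sketch is the priority order: the movement-mode estimate gates the global/moving-body choice, and positioning and identification information narrow the moving-body choice further.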
  • the context map generation unit 112 generates or updates the context map selected by the context map selection unit 119 using the sensor data selected by the use sensor selection unit 118 or weighted.
  • the server apparatus 100 can select the sensor data sent from the terminal apparatus 200 based on the movement mode by having the configuration shown in FIG. 8. Further, by having the configuration shown in FIG. 8, the server apparatus 100 can select the context map to be generated based on the movement mode, the positioning information, and the information identifying the location inside the moving body.
  • by selecting the sensor data sent from the terminal device 200 based on the movement mode, the server device 100 can pick out highly accurate sensor data from the sensor data sent from the terminal device 200 and generate a highly accurate context map. Further, when a plurality of context maps are prepared according to the type of moving body, the server device 100 can choose an appropriate one based on the movement mode.
  • FIG. 9 is an explanatory diagram illustrating a configuration example of the server apparatus 100 according to the second embodiment of the present disclosure.
  • FIG. 9 shows a configuration example of the server device 100 when the position of the terminal device 200 in the moving body is measured using the context map.
  • the server device 100 includes a movement mode estimation unit 117, a use sensor selection unit 118, a context map selection unit 119, and a moving body positioning unit 114.
  • the functions of the movement mode estimation unit 117, the use sensor selection unit 118, and the context map selection unit 119 are the same as those described with reference to FIG. 8.
  • the use sensor selection unit 118 outputs the sensor data it has selected or weighted to the in-moving-body positioning unit 114.
  • when the context map selection unit 119 selects a context map to be used at the time of positioning, it outputs the selection result to the in-moving-body positioning unit 114.
  • the in-moving-body positioning unit 114 measures the position of the terminal device 200 in the moving body using the sensor data selected or weighted by the use sensor selection unit 118 and the context map selected by the context map selection unit 119.
  • the mobile positioning unit 114 measures the position of the terminal device 200 in the mobile body and outputs the positioning result to the terminal device 200.
  • the server apparatus 100 can select the sensor data transmitted from the terminal apparatus 200 based on the movement mode by having the configuration shown in FIG. 9. Further, by having the configuration shown in FIG. 9, the server apparatus 100 can select the context map used for positioning based on the movement mode, the positioning information, and the information identifying the location inside the moving body.
  • by selecting the sensor data sent from the terminal device 200 based on the movement mode, the server device 100 can measure the position of the terminal device 200 in the moving body with reference to the moving body context map 120. Further, when a plurality of context maps are prepared according to the type of moving body, the server device 100 can choose an appropriate one based on the movement mode.
  • for example, the server device 100 can first select the global context map 121; thereafter, if the absolute position measurement indicates that the user carrying or wearing the terminal device 200 is at sea, the server device 100 can select the moving body context map 120b.
  • a server apparatus 100 combining the first embodiment and the second embodiment described above can also be realized. That is, it is possible to realize a server device 100 that first corrects the sensor data transmitted from the terminal device 200 using the sensor data measured by the moving body, and then estimates from the corrected sensor data what kind of moving state the user who carries or wears the terminal device 200 is in.
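A minimal sketch of that combined pipeline, assuming the "difference" correction described in the claims: the moving body's own measured acceleration is subtracted from the terminal's acceleration, and the residual is fed to a toy movement-state estimator. The threshold value and the state names are invented for illustration and are not taken from the patent.

```python
# Step 1: correct terminal sensor data using the moving body's sensor data
# (the correction unit's difference calculation).
def correct(terminal_accel, body_accel):
    """Remove the moving body's acceleration, leaving the user's own motion."""
    return [t - b for t, b in zip(terminal_accel, body_accel)]

# Step 2: estimate the user's moving state from the corrected residual.
def estimate_mode(corrected_accel, walk_threshold=0.5):
    """Crude illustration: residual motion above a threshold => walking."""
    magnitude = sum(a * a for a in corrected_accel) ** 0.5
    return "walking_in_vehicle" if magnitude > walk_threshold else "seated"
```

The design point is ordering: estimating the mode from the *corrected* data avoids mistaking the vehicle's acceleration for the user's own movement.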
  • the server apparatus 100 can measure the position of the terminal apparatus 200 in the moving body and provide the terminal apparatus 200 with information corresponding to that position. For example, when the user carrying the terminal device 200 is on a bullet train, the server device 100 can detect the car number and seat position on the bullet train by measuring the position of the terminal device 200 in the bullet train.
  • similarly, the server device 100 can detect the user's detailed position in a passenger ship (for example, a restaurant, pool, casino, or the like) by measuring the position of the terminal device 200 in the passenger ship.
  • the server device 100 may change the information provided to the terminal device 200 according to the position of the user carrying the terminal device 200. For example, when the user carrying the terminal device 200 is on a sightseeing trolley train, the server device 100 may change the information it provides, for example information on the landscape seen from the car window, depending on whether the user is sitting on the right side or the left side of the train.
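Changing the provided information by in-vehicle position could look like the following sketch, where the terminal's lateral offset from the car's centerline selects which car-window landscape guide to serve. The coordinate convention and the guide labels are assumptions introduced for illustration.

```python
# Pick content by which side of the car the user is seated on.
def landscape_guide(lateral_offset_m):
    """Positive offset = right side of the car, negative = left side."""
    if lateral_offset_m > 0:
        return "guide_for_right_window"
    if lateral_offset_m < 0:
        return "guide_for_left_window"
    return "guide_for_aisle"
```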
  • the position information in the moving body detected by the server apparatus 100 may be shared with another terminal apparatus; in that case, the server apparatus 100 can provide the other terminal apparatus with the in-moving-body position information of the user carrying the terminal apparatus 200.
  • FIG. 10 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
  • the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
  • the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
  • the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
  • the information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array) instead of or in addition to the CPU 901.
  • the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
  • the ROM 903 stores programs and calculation parameters used by the CPU 901.
  • the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
  • the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
  • the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
  • the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
  • the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
  • the output device 917 is configured by a device capable of notifying the acquired information to the user using a sense such as vision, hearing, or touch.
  • the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a vibrator.
  • the output device 917 outputs the result obtained by the processing of the information processing device 900 as video such as text or an image, as sound such as voice or audio, or as vibration.
  • the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
  • the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 919 stores, for example, programs executed by the CPU 901 and various data, and various data acquired from the outside.
  • the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
  • the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
  • the drive 921 also writes records to the attached removable recording medium 927.
  • the connection port 923 is a port for connecting a device to the information processing apparatus 900.
  • the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
  • the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
  • the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
  • the communication device 925 can be, for example, a communication card for LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, or WUSB (Wireless USB).
  • the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
  • the communication device 925 transmits and receives signals and the like to and from the Internet and other communication devices using a predetermined protocol such as TCP/IP, for example.
  • the communication network 931 connected to the communication device 925 is a network connected by wire or wireless, and may include, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
  • the imaging device 933 is an apparatus that images a real space and generates a captured image using various members such as an imaging element, for example a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor, and a lens for controlling the formation of a subject image on the imaging element.
  • the imaging device 933 may capture a still image or may capture a moving image.
  • the sensor 935 is various sensors such as an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, or a sound sensor (microphone).
  • the sensor 935 acquires, for example, information about the state of the information processing apparatus 900 itself, such as the posture of the information processing apparatus 900, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around the information processing apparatus 900.
  • the sensor 935 may include a GPS receiver that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the device.
  • Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
  • as described above, there is provided a server device 100 that can correct the sensor data sent from the terminal device 200 using the sensor data sent from the moving body posture motion measuring unit 130.
  • by correcting the sensor data sent from the terminal device 200 using the sensor data sent from the moving body posture motion measuring unit 130, the server device 100 can generate a highly accurate moving body context map 120, and can perform highly accurate positioning in the moving body by referring to the moving body context map 120.
  • there is also provided a server device 100 that can select the sensor data transmitted from the terminal device 200 based on the movement mode.
  • by selecting the sensor data sent from the terminal device 200 based on the movement mode, the server device 100 can pick out highly accurate sensor data from the sensor data sent from the terminal device 200, generate a highly accurate moving body context map 120, and perform highly accurate positioning in the moving body by referring to the moving body context map 120.
  • Embodiments of the present disclosure include, for example, an information processing apparatus, a system, an information processing method executed by the information processing apparatus or system, a computer program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the computer program is recorded.
  • the software that implements the user interface and application shown in the above embodiment may be realized as a web application used via a network such as the Internet.
  • the web application may be realized by, for example, a markup language such as HTML (HyperText Markup Language), SGML (Standard Generalized Markup Language), or XML (Extensible Markup Language).
  • (1) An information processing apparatus including: a correction unit that corrects first sensing data provided by one or more sensors carried or worn by a user, using second sensing data provided by one or more sensors provided on a moving body on which the user is riding; and a processing unit that executes a process for obtaining the position of the user in the moving body using the correction result of the correction unit.
  • (2) The information processing apparatus according to (1), wherein the correction unit calculates a difference between the first sensing data and the second sensing data.
  • (3) The information processing apparatus according to any one of the above, wherein the processing unit executes a positioning process in the moving body for the user who provides the first sensing data, using map information generated using a correction result of the correction unit.
  • the first sensing data includes any of acceleration data, angular velocity data, and geomagnetic data.
  • the moving body is any one of a train, an automobile, and a ship.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention aims to provide an information processing device that, when position measurement technology using sensors is applied to a large moving body, can improve the accuracy of detecting a position within the moving body. To this end, the invention concerns an information processing device that includes: a correction unit that uses second sensing data, provided by one or more sensors arranged in the moving body in which a user is riding, to correct first sensing data provided by one or more sensors carried or worn by the user; and a processing unit that uses the correction result of the correction unit to execute a process for calculating the position of the user in the moving body.
PCT/JP2016/074157 2015-09-28 2016-08-18 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme d'ordinateur WO2017056774A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-189383 2015-09-28
JP2015189383A JP2017067468A (ja) 2015-09-28 2015-09-28 情報処理装置、情報処理方法およびコンピュータプログラム

Publications (1)

Publication Number Publication Date
WO2017056774A1 true WO2017056774A1 (fr) 2017-04-06

Family

ID=58427459

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/074157 WO2017056774A1 (fr) 2015-09-28 2016-08-18 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme d'ordinateur

Country Status (2)

Country Link
JP (1) JP2017067468A (fr)
WO (1) WO2017056774A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110502980A (zh) * 2019-07-11 2019-11-26 武汉大学 一种行人边过马路边玩手机情景行为的识别方法

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6776287B2 (ja) * 2018-02-16 2020-10-28 Kddi株式会社 移動体で使用可能な装置並びに当該装置の制御プログラム及び方法

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002148071A (ja) * 2000-11-10 2002-05-22 Fuji Xerox Co Ltd 乗 物
JP2014142345A (ja) * 2010-04-05 2014-08-07 Qualcomm Inc 関連情報の表示方法、及び携帯通信端末

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002148071A (ja) * 2000-11-10 2002-05-22 Fuji Xerox Co Ltd 乗 物
JP2014142345A (ja) * 2010-04-05 2014-08-07 Qualcomm Inc 関連情報の表示方法、及び携帯通信端末

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110502980A (zh) * 2019-07-11 2019-11-26 武汉大学 一种行人边过马路边玩手机情景行为的识别方法
CN110502980B (zh) * 2019-07-11 2021-12-03 武汉大学 一种行人边过马路边玩手机情景行为的识别方法

Also Published As

Publication number Publication date
JP2017067468A (ja) 2017-04-06

Similar Documents

Publication Publication Date Title
WO2017056777A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme informatique
WO2016098457A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
US10867195B2 (en) Systems and methods for monitoring driver state
EP3252432A1 (fr) Système d'obtention d'informations basé sur la surveillance d'un occupant
US20190383620A1 (en) Information processing apparatus, information processing method, and program
KR101730534B1 (ko) 네비게이션을 위한 카메라 인에이블 헤드셋
JP6311478B2 (ja) 情報処理装置、情報処理方法およびプログラム
US9870535B2 (en) Method and apparatus for determining probabilistic context awareness of a mobile device user using a single sensor and/or multi-sensor data fusion
US20150354951A1 (en) Method and Apparatus for Determination of Misalignment Between Device and Pedestrian
US11181376B2 (en) Information processing device and information processing method
JP2007164441A (ja) 移動体方位決定装置、移動体方位決定方法、ナビゲーション装置、および移動可能端末装置
JP5870817B2 (ja) 情報処理装置、情報処理方法、及びプログラム
JP2017520762A (ja) 推定された軌道の潜在的な妨害の尺度に基づく、移動体デバイスのポジションの不確実性
WO2017056774A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme d'ordinateur
US20200292349A1 (en) Audio information providing system, control method, and non-transitory computer readable medium
Zaib et al. Smartphone based indoor navigation for blind persons using user profile and simplified building information model
Mahida et al. Indoor positioning framework for visually impaired people using Internet of Things
WO2015194270A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US20190205580A1 (en) Information processing apparatus, information processing method, and computer program
US20220146662A1 (en) Information processing apparatus and information processing method
WO2022029894A1 (fr) Dispositif de traitement d'informations, système de traitement d'informations, procédé de traitement d'informations et programme
WO2015194269A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2014115769A (ja) 情報提供装置、情報提供方法及びプログラム
JP2019049486A (ja) 表示制御装置、表示制御方法およびプログラム
JP2008304400A (ja) ナビゲーション装置、ナビゲーション方法およびナビゲーションプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16850950

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16850950

Country of ref document: EP

Kind code of ref document: A1