CN114205408B - Co-ride support device, co-ride support method, and storage medium - Google Patents

Co-ride support device, co-ride support method, and storage medium

Info

Publication number
CN114205408B
Authority
CN
China
Prior art keywords
user
information
point
terminal device
unit
Prior art date
Legal status
Active
Application number
CN202110972783.XA
Other languages
Chinese (zh)
Other versions
CN114205408A (en)
Inventor
弥永惠
仓持俊克
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of CN114205408A
Application granted
Publication of CN114205408B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H04W4/021: Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • H04W4/029: Location-based management or tracking services

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a co-ride support device, a co-ride support method, and a storage medium that enable co-riding in a mobile body owned by an individual to be carried out appropriately. The co-ride support device includes: an acquisition unit that acquires first information on a first user who owns a mobile body and second information on a second user who is a candidate for riding together in the mobile body; a determination unit that extracts, based on the first information and the second information, a second user who will ride together with the first user in the mobile body, and determines a merging point at which the second user joins the first user and a break-away point at which they part; and a providing unit that calculates a merging period for meeting at the merging point determined by the determination unit, and transmits at least the calculated merging period and information on the merging point and the break-away point determined by the determination unit to the terminal device of the first user and the terminal device of the second user using a communication device.

Description

Co-ride support device, co-ride support method, and storage medium
Technical Field
The present invention relates to a co-ride support device, a co-ride support method, and a storage medium.
Background
Conventionally, a technique related to ride-sharing, in which a plurality of users ride together in a moving body, has been disclosed (for example, see Japanese Patent Application Laid-Open No. 2019-211279). A technique related to an information processing apparatus has also been disclosed in which, in order to improve convenience for a user who uses a mobile body, the position where the user boards the mobile body is determined based on position information of the user (for example, see Japanese Patent Application Laid-Open No. 2019-095942).
However, the conventional techniques are directed at vehicles operated by ride-service operators such as buses and taxis, and co-riding in a mobile body owned by an individual has not been sufficiently considered.
Disclosure of Invention
The present invention has been made in view of the above problem, and an object thereof is to provide a co-ride support device, a co-ride support method, and a storage medium that enable co-riding in a mobile body owned by an individual to be carried out appropriately.
Means for solving the problems
The co-ride support device, co-ride support method, and storage medium of the present invention employ the following configurations.
(1): A co-ride support device according to an aspect of the present invention includes: an acquisition unit that acquires first information on a first user who owns a mobile body and second information on a second user who is a candidate for riding together in the mobile body; a determination unit that extracts, based on the first information and the second information, a second user who will ride together with the first user in the mobile body, and determines a merging point at which the second user joins the first user and a break-away point at which they part; and a providing unit that calculates a merging period for meeting at the merging point determined by the determination unit, and transmits at least the calculated merging period and information on the merging point and the break-away point determined by the determination unit to the terminal device of the first user and the terminal device of the second user using a communication device.
(2): In the aspect of (1) above, the determination unit determines, as the merging point, a point at which an index value obtained based on environmental information of each point satisfies a criterion.
(3): In the aspect of (1) or (2) above, the determination unit may determine, as the break-away point, a point at which an index value obtained based on environmental information of each point satisfies a criterion.
(4): In the aspect of (3) above, the index value is a traffic volume of traffic participants, and the determination unit regards a point at which the traffic volume is equal to or greater than a predetermined volume as a point at which the index value satisfies the criterion.
(5): In any one of the aspects (1) to (4) above, the acquisition unit acquires the movement status of the first user as the first information and the movement status of the second user as the second information, and the providing unit provides information to the terminal device of the first user, the second user, or another user using the communication device in accordance with the movement status of the first user and/or the movement status of the second user.
(6): In the aspect of (5) above, the providing unit calculates, based on the movement status of the first user and/or the movement status of the second user, an estimated arrival time at which the mobile body carrying the second user arrives at the break-away point, and provides information on the calculated estimated arrival time, using the communication device, to at least a terminal device of another user associated with the second user.
(7): In the aspect of (5) or (6) above, the providing unit may determine, based on the movement status of the first user and/or the movement status of the second user, whether the mobile body carrying the second user is within a predetermined range of the break-away point, calculate an estimated arrival time at which the mobile body arrives at the break-away point when it is determined that the mobile body is within the predetermined range, and provide information on the calculated estimated arrival time, using the communication device, to at least a terminal device of another user associated with the second user.
(8): In any one of the aspects (5) to (7) above, the providing unit calculates estimated times at which the first user and the second user are expected to arrive home after the mobile body carrying the second user reaches the break-away point and each user heads to his or her respective destination, and provides information on the calculated estimated home-arrival times, using the communication device, to the terminal device of the first user, the second user, or another user.
(9): In the aspect of (8) above, the providing unit calculates the estimated home-arrival time of each user based on information on a geofence set between the break-away point and the destination.
(10): In any one of the aspects (5) to (9) above, the providing unit calculates a first expected time at which the first user arrives at a first destination after the mobile body carrying the second user reaches the break-away point, and outputs confirmation information to the terminal device of the first user using the communication device at a timing obtained based on the position information of the first user and the first expected time.
(11): In any one of the aspects (5) to (10) above, the providing unit calculates a second expected time at which the second user arrives at a second destination after the mobile body carrying the second user reaches the break-away point, and outputs confirmation information to the terminal device of the second user using the communication device at a timing obtained based on the position information of the second user and the second expected time.
(12): In the aspect of (11) above, the providing unit provides the second expected time, using the communication device, to a terminal device of another user associated with the second user.
(13): In any one of the aspects (10) to (12) above, when no response to the confirmation information has been obtained after a predetermined time has elapsed since the confirmation information was output, the providing unit provides information on the user from whom no response has been obtained, using the communication device, to the terminal devices of the first user, the second user, and other users.
(14): In any one of the aspects (1) to (13) above, the acquisition unit further acquires operation information related to operation of the mobile body; when the acquisition unit acquires operation information indicating that operation of the mobile body has started, the determination unit extracts the second user and determines the merging point and the break-away point, and the providing unit transmits at least the calculated merging period and the information on the merging point and the break-away point determined by the determination unit to the terminal device of the first user and the terminal device of the second user using the communication device.
(15): In the aspect of (14) above, the operation information is information indicating that operation of a drive unit for driving the mobile body has started.
(16): A co-ride support method according to an aspect of the present invention is a method performed by a computer, including: acquiring first information on a first user who owns a mobile body and second information on a second user who is a candidate for riding together in the mobile body; extracting, based on the first information and the second information, a second user who will ride together with the first user in the mobile body, and determining a merging point at which the second user joins the first user and a break-away point at which they part; and calculating a merging period for meeting at the merging point, and transmitting at least the calculated merging period and the information on the determined merging point and break-away point to the terminal device of the first user and the terminal device of the second user using a communication device.
(17): A storage medium according to an aspect of the present invention stores a program that causes a computer to: acquire first information on a first user who owns a mobile body and second information on a second user who is a candidate for riding together in the mobile body; extract, based on the first information and the second information, a second user who will ride together with the first user in the mobile body, and determine a merging point at which the second user joins the first user and a break-away point at which they part; and calculate a merging period for meeting at the merging point, and transmit at least the calculated merging period and the information on the determined merging point and break-away point to the terminal device of the first user and the terminal device of the second user using a communication device.
Effects of the invention
According to the above aspects (1) to (17), co-riding in a mobile body owned by an individual can be carried out appropriately.
Drawings
Fig. 1 is a diagram showing an example of the configuration and usage environment of the co-ride support device according to the embodiment.
Fig. 2 is a diagram showing an example of vehicle owner information in the co-ride support device according to the embodiment.
Fig. 3 is a diagram showing an example of co-rider information in the co-ride support device according to the embodiment.
Fig. 4 is a diagram showing an example of matching information in the co-ride support device according to the embodiment.
Fig. 5 is a diagram schematically showing an example of the movement of matched users in the co-ride support device according to the embodiment.
Fig. 6 is a sequence diagram showing an example of the flow of processing performed by the co-ride support device according to the embodiment.
Fig. 7 is a diagram showing an example of an information providing screen used in the processing of the co-ride support device according to the embodiment.
Fig. 8 is a diagram showing an example of an information providing screen used in the processing of the co-ride support device according to the embodiment.
Fig. 9 is a diagram showing an example of an information providing screen used in the processing of the co-ride support device according to the embodiment.
Fig. 10 is a diagram showing an example of an information providing screen used in the processing of the co-ride support device according to the embodiment.
Fig. 11 is a diagram showing an example of an information providing screen used in the processing of the co-ride support device according to the embodiment.
Fig. 12 is a diagram showing an example of an information providing screen used in the processing of the co-ride support device according to the embodiment.
Detailed Description
Embodiments of a co-ride support device, a co-ride support method, and a storage medium according to the present invention will be described below with reference to the accompanying drawings. The co-ride support device is a device that supports a service (hereinafter simply referred to as the "service") for matching a user who owns a mobile body with a user who wishes to ride in that mobile body. Examples of the mobile body used in the service include two-wheeled, three-wheeled, and four-wheeled vehicles. The vehicle may be one that travels under driving operations by the user who owns it, or a so-called automated driving vehicle that travels without requiring driving operations by the owner. In the following description, the vehicle is assumed to be a motorcycle. The vehicle is an example of the "mobile body" in the claims.
[ Usage environment of the co-ride support device ]
Fig. 1 is a diagram showing an example of the configuration and usage environment of the co-ride support device according to the embodiment. The co-ride support device 100 communicates with one or more terminal devices T1 and one or more terminal devices T2 via a network NW. The network NW includes, for example, the Internet, a WAN (Wide Area Network), a LAN (Local Area Network), provider devices, wireless base stations, and the like.
The co-ride support device 100 performs, for example, processing for receiving a co-ride consent for the vehicle M from the user U1 who owns the vehicle M and a co-ride request from another user U2 who wishes to ride in a vehicle, and for having the user U2 ride in the vehicle. The co-ride consent from the user U1 and the co-ride request from the user U2 are not limited to immediate use and may include a reservation for use at a later time. The request from the user U2 may or may not include information specifying the type of vehicle to be shared, such as "only willing to co-ride in a four-wheeled vehicle". The user U1 is an example of the "first user" in the claims, and the user U2 is an example of the "second user" in the claims.
The processing for having the user U2 ride in the vehicle includes, for example, processing for extracting a user U2 who will ride in the vehicle M of the user U1 and processing for detecting the movement statuses of the user U1 and the user U2. The co-ride support device 100 extracts, for example, a user U2 who wishes to ride toward a destination (for example, the user U2's own home) that lies in the same direction as the destination of the user U1's homeward trip (that is, the user U1's own home). The home of the user U1 is an example of the "first destination" in the claims, and the home of the user U2 is an example of the "second destination" in the claims.
The terminal devices T1 and T2 are mobile terminal devices, such as smartphones or tablet terminals, used by the user U1 and the user U2 (including family members of each user). The terminal device T1 is an example of the "terminal device of the first user" in the claims, and the terminal device T2 is an example of the "terminal device of the second user" in the claims. In the following description, when the terminal device T1 and the terminal device T2 are not distinguished, they are referred to as the "terminal device T", and when the user U1 and the user U2 are not distinguished, they are referred to as the "user U". A person related to the user U2 (for example, a family member or neighbor) is referred to as the "user UR", and the terminal device used by the user UR is referred to as the "terminal device TR".
The terminal devices T1 and T2 carry out procedures for using the service in response to operations by the user U who carries them. An application or the like for using the service is executed on the terminal devices T1 and T2. The application, for example, transmits to the co-ride support device 100 the information input by the user U for joining (registering with) or using the service, together with the position information of the terminal device T (hereinafter referred to as "terminal position information"), and presents information and notifications transmitted from the co-ride support device 100 to the user U by displaying images on a display device and by emitting sound from a speaker. The terminal position information is, for example, information indicating the current position of the terminal device T acquired from a sensor such as a GPS (Global Positioning System) receiver provided in the terminal device T.
The vehicle M may have a communication function for communicating with the co-ride support device 100 via the network NW. In this case, the vehicle M transmits position information indicating the position of the vehicle M (hereinafter referred to as "vehicle position information") to the co-ride support device 100. The vehicle position information is, for example, information acquired from a navigation device (not shown) provided in the vehicle M. The vehicle position information may instead be position information acquired from a sensor such as GPS provided in the vehicle M, independently of the navigation device. The vehicle M may also transmit information on the destination set in the navigation device by the user U1 to the co-ride support device 100. The information on the destination includes, for example, information indicating the name and position of the set destination, information indicating the route to the destination guided by the navigation device, and information indicating the estimated arrival time at the set destination based on the guided route.
[ Configuration of the co-ride support device 100 ]
The co-ride support device 100 includes, for example, a storage unit 110, a communication unit 120, an information acquisition unit 130, a matching processing unit 140, and an information providing unit 150.
The information acquisition unit 130, the matching processing unit 140, and the information providing unit 150 are each realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or by cooperation of software and hardware. Some or all of the functions of these components may be realized by a dedicated LSI. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD (Hard Disk Drive) or flash memory provided in the co-ride support device 100, or may be stored in a removable storage medium (non-transitory storage medium) such as a DVD or CD-ROM and installed in the HDD or flash memory of the co-ride support device 100 by mounting the storage medium in a drive device of the co-ride support device 100. The co-ride support device 100 may be realized by a server device or a storage device incorporated in a cloud computing system. In this case, the functions of the co-ride support device 100 are realized by a plurality of server devices and storage devices in the cloud computing system.
The storage unit 110 stores, for example, vehicle owner information 112, co-rider information 114, matching information 116, and map information 118. The vehicle owner information 112, the co-rider information 114, and the matching information 116 will be described later. The map information 118 is, for example, information expressing road shapes by links representing roads and nodes connected by the links.
The communication unit 120 is a communication interface, such as a network card, for connecting to the network NW. The communication unit 120 communicates with the terminal devices T1 and T2 (including the terminal device TR) via the network NW. The communication unit 120 is an example of the "communication device" in the claims.
The information acquisition unit 130 performs processing for registering the user U with the service based on information acquired from the terminal devices T1 and T2 via the communication unit 120. The information acquisition unit 130 registers the information of a user U for whom registration processing has been performed as the vehicle owner information 112 or the co-rider information 114 and stores it in the storage unit 110. The information of the user U subject to registration processing includes information identifying the user U, such as name, sex, and date of birth, and information specifying the destination, such as the user's home address or the address of a place or facility near the user's home. This information is transmitted from the application to the co-ride support device 100 when the user U operates the application executed on the terminal device T and inputs it. The information may also include, for example, information on a facility the user stops at on the way to the destination such as home. When registering the information in the registration processing, the information acquisition unit 130 may register it separately as the vehicle owner information 112 or the co-rider information 114, or may register it in a unified manner.
The information acquisition unit 130 may also perform processing for receiving the co-ride consent for the vehicle M from the user U1 and the co-ride request from the user U2, acquired via the communication unit 120. In this case, the information acquisition unit 130 adds information indicating that a co-ride consent has been received to the vehicle owner information 112 of the user U1 for whom the consent reception processing has been performed, and adds information indicating that a co-ride request has been received to the co-rider information 114 of the user U2 for whom the request reception processing has been performed. The co-ride consent and the co-ride request are also transmitted to the co-ride support device 100 when the user U operates the application executed on the terminal device T and inputs them. The application may transmit the terminal position information from the time the co-ride consent or the co-ride request is transmitted to the co-ride support device 100, or from the time the user U starts moving homeward. The information acquisition unit 130 is an example of the "acquisition unit" in the claims. The vehicle owner information 112 is an example of the "first information" in the claims, and the co-rider information 114 is an example of the "second information" in the claims.
Fig. 2 is a diagram showing an example of the vehicle owner information 112 in the co-ride support device 100 according to the embodiment. The vehicle owner information 112 is information on users U who own a vehicle (hereinafter also referred to as "vehicle owners"). The vehicle owner information 112 may be limited to users U1 who have given a co-ride consent. The vehicle owner information 112 is, for example, information that associates a destination, a co-ride consent, a current position, and a mobile body position with a vehicle owner ID identifying the vehicle owner. The destination is information indicating the location of the home registered by the vehicle owner as the destination. The co-ride consent is information indicating whether the vehicle owner consents to co-riding in the vehicle he or she owns; it is recorded as either "Y" (consent given at the current point in time) or "N" (no consent at the current point in time). The current position is information indicating where the vehicle owner is currently located; it is transmitted from the vehicle owner's terminal device T (here, the terminal device T1) and is successively updated by the information acquisition unit 130 each time the communication unit 120 receives it. The mobile body position is information indicating the current position of the vehicle owned by the vehicle owner; it is transmitted by the vehicle and is successively updated by the information acquisition unit 130 each time the communication unit 120 receives it. When the vehicle owned by the vehicle owner has no communication function, the mobile body position is omitted. In this case, the mobile body position may be regarded as the same as the current position and updated at the same timing as the current position, with the mobile body position field recorded as "same".
The vehicle owner information 112 is not limited to the information shown in Fig. 2, and may include various other information such as information indicating the type of vehicle owned by the vehicle owner, information on the homeward route the vehicle owner takes to the destination (home), and information indicating the positions of merging point candidates along the homeward route. The homeward route may be registered in advance by the vehicle owner, or may be set by the co-ride support device 100 based on a history of routes the vehicle owner has taken in the past. There may be one homeward route or a plurality of them. When there are a plurality of homeward routes, merging point candidates may or may not be recorded for each homeward route.
Fig. 3 is a diagram showing an example of the co-rider information 114 in the co-ride support device 100 according to the embodiment. The co-rider information 114 is information on users U who wish to ride together in a vehicle (hereinafter also referred to as "co-riders"). The co-rider information 114 may be limited to users U2 who have made a co-ride request. The co-rider information 114 is, for example, information that associates a destination, a co-ride request, and a current position with a co-rider ID identifying the co-rider. The destination is information indicating the location of the home registered by the co-rider as the destination. The co-ride request is information indicating whether the co-rider requests to ride together in a vehicle; it is recorded as either "Y" (requested at the current point in time) or "N" (not requested at the current point in time). The current position is information indicating where the co-rider is currently located; it is transmitted from the co-rider's terminal device T (here, the terminal device T2) and is successively updated by the information acquisition unit 130 each time the communication unit 120 receives it.
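The record layouts described for Fig. 2 and Fig. 3 can be summarized as simple data structures. The following is a minimal sketch in Python; the class and field names (OwnerRecord, CoRiderRecord, and so on) are assumptions that do not appear in the patent, and the sketch only illustrates the associations between IDs, destinations, consent/request flags, and positions described above.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# A latitude/longitude pair; the patent only calls this "position information".
LatLon = Tuple[float, float]

@dataclass
class OwnerRecord:            # one row of the vehicle owner information 112 (Fig. 2)
    owner_id: str             # vehicle owner ID
    destination: LatLon       # registered home location
    consent: bool             # co-ride consent: True = "Y", False = "N"
    current_position: LatLon  # updated from terminal device T1
    vehicle_position: Optional[LatLon] = None  # None when the vehicle has no communication function

@dataclass
class CoRiderRecord:          # one row of the co-rider information 114 (Fig. 3)
    corider_id: str           # co-rider ID
    destination: LatLon       # registered home location
    request: bool             # co-ride request: True = "Y", False = "N"
    current_position: LatLon  # updated from terminal device T2
```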
The co-rider information 114 is not limited to the information shown in Fig. 3, and may include various other information such as information on the homeward route the co-rider takes when not riding in any vehicle and information indicating the positions of merging point candidates along that homeward route. The homeward route may be registered in advance by the co-rider, or may be set by the co-ride support device 100 based on a history of routes the co-rider has taken in the past. There may be one homeward route or a plurality of them. When there are a plurality of homeward routes, merging point candidates may or may not be recorded for each homeward route.
Returning to Fig. 1, the matching processing unit 140 extracts a user U2 who will ride together with the user U1 in the vehicle M, based on the information acquired from the terminal devices T1 and T2 via the communication unit 120 and on the vehicle owner information 112 and the co-rider information 114 stored in the storage unit 110, and determines a merging point at which the user U1 and the extracted user U2 meet and a break-away point at which they part after riding together in the vehicle M. The matching processing unit 140 is an example of the "determination unit" in the claims. The matching processing unit 140 includes, for example, a co-rider extraction unit 142 and a merging/break-away point determination unit 144.
The co-rider extraction unit 142 extracts, based on the co-ride consents and co-ride requests acquired via the communication unit 120 and with reference to the vehicle owner information 112 and the co-rider information 114 stored in the storage unit 110, a co-rider who has made a co-ride request to ride in a vehicle owned by a vehicle owner who has given a co-ride consent. The co-rider extraction unit 142 extracts, for example, a co-rider whose position is within a predetermined range centered on the vehicle owner's current position or on the parking lot where the vehicle is parked (for example, within 1 km of the vehicle owner's current position), based on the terminal position information (which may include vehicle position information) of each user U acquired via the communication unit 120. The co-rider extraction unit 142 may also estimate the vehicle owner's homeward route by referring to the map information 118, based on the destination (that is, the vehicle owner's home) recorded in the vehicle owner information 112 and the terminal position information (which may include vehicle position information), and extract a co-rider whose destination lies within a predetermined range of the estimated route (for example, within 1 km of the route).
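As one possible reading of this extraction step, the following sketch filters co-riders by straight-line distance from the owner's current position and from the owner's estimated homeward route, reusing the LatLon, OwnerRecord, and CoRiderRecord definitions from the sketch above. The helper names and the simple equirectangular distance are assumptions for illustration; the patent only specifies "within a predetermined range (for example, within 1 km)".

```python
import math
from typing import List

def distance_km(a: LatLon, b: LatLon) -> float:
    """Approximate distance between two positions (equirectangular approximation)."""
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * 6371.0  # Earth radius in km

def extract_coriders(owner: OwnerRecord,
                     candidates: List[CoRiderRecord],
                     owner_route: List[LatLon],
                     radius_km: float = 1.0) -> List[CoRiderRecord]:
    """Pick co-riders near the owner's current position whose destination
    lies near the owner's estimated homeward route."""
    result = []
    for c in candidates:
        if not (owner.consent and c.request):
            continue  # only match a consenting owner with a requesting co-rider
        near_owner = distance_km(c.current_position, owner.current_position) <= radius_km
        near_route = any(distance_km(c.destination, p) <= radius_km for p in owner_route)
        if near_owner and near_route:
            result.append(c)
    return result
```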
Here, the co-rider extraction unit 142 extracts the user U2 as the person who will ride together in the vehicle M of the user U1. The co-rider extraction unit 142 outputs information associating the user U1 with the extracted user U2 to the merging/break-away point determination unit 144. The co-rider extraction unit 142 may, for example, pick out and collect the information related to the user U1 and the user U2 from the vehicle owner information 112 and the co-rider information 114 and store it in the storage unit 110 as the matching information 116. In this case, the co-rider extraction unit 142 may output information indicating that the matching information 116 has been stored in the storage unit 110 to the merging/break-away point determination unit 144, instead of the information associating the user U1 with the user U2.
When the information associating the user U1 with the user U2 is output from the co-rider extraction unit 142, the merging/break-away point determination unit 144 determines a merging point and/or a break-away point based on the vehicle owner information 112 and the co-rider information 114 stored in the storage unit 110. The merging/break-away point determination unit 144 determines, as the merging point, for example, a point that lies between the vehicle owner's current position (or the parking lot where the vehicle is parked) and the co-rider's current position and within a predetermined range, based on the terminal position information (which may include vehicle position information) of each terminal device T acquired via the communication unit 120. The merging/break-away point determination unit 144 determines, as the break-away point, for example, a point that lies between the destinations of the respective users U and within a predetermined range, based on the destination recorded in the vehicle owner information 112 (that is, the vehicle owner's home) and the destination recorded in the co-rider information 114 (that is, the co-rider's home). The merging/break-away point determination unit 144 may also estimate the route along which each user U moves based on the terminal position information (which may include vehicle position information) of each terminal device T and the destination information of each user U, and determine a point within a predetermined range of the estimated route as the merging point or the break-away point by referring to the map information 118.
The merging/break-away point determination unit 144 determines, as the merging point and/or the break-away point, a point at which an index value obtained based on environmental information of the point satisfies a predetermined criterion, from among the points included in the map information 118 that can serve as merging points and break-away points. The index value obtained based on the environmental information is a value for evaluating the safety of the point (in particular, the safety of the user U2) based on, for example, how busy the facilities at and around the point are, that is, whether many people other than the user U1 and the user U2 have eyes on the point. For example, the index value is higher at a point watched by many people other than the user U1 and the user U2 (indicating that the point is suitable), because an unexpected situation is less likely to occur at a point where many other people are watching. The index value is, for example, a value obtained based on the traffic volume of traffic participants, such as other pedestrians and other vehicles (for example, bicycles, motorcycles, and automobiles), at and around the point (for example, the traffic volume itself, or the logarithm of the traffic volume). The merging/break-away point determination unit 144 regards, for example, points at which the index value is equal to or greater than a threshold, that is, points at which the traffic volume of traffic participants is equal to or greater than a predetermined amount, as points satisfying the criterion, and determines such points as the merging point and/or the break-away point. Candidates for the merging point and the break-away point include facilities such as shopping malls, supermarkets, and convenience stores (and possibly restaurants, accommodation facilities, various daily-life facilities, hospitals, government offices, and the like), and parking lots attached to these facilities. A parking lot attached to a facility is a parking lot provided on the premises of the facility, a parking lot (including a dedicated parking lot) preferentially available to users U1 and others visiting the facility, or the like.
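A minimal sketch of this point-selection rule follows, reusing LatLon and distance_km from the sketches above. The CandidatePoint structure, the concrete threshold, and the tie-breaking by distance to the midpoint of the two users are illustrative assumptions; the only logic taken from the description is "index value = traffic volume of traffic participants, accept points at or above a predetermined amount".

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CandidatePoint:          # hypothetical structure for a point in the map information 118
    name: str                  # e.g. a supermarket or convenience store parking lot
    position: LatLon
    traffic_volume: float      # traffic participants per unit time (environmental information)

def choose_point(candidates: List[CandidatePoint],
                 pos_a: LatLon, pos_b: LatLon,
                 min_traffic: float = 100.0) -> Optional[CandidatePoint]:
    """Keep only points whose index value (here: the raw traffic volume) meets the
    criterion, then pick the one closest to the midpoint of the two positions
    (the closeness rule is an illustrative assumption, not from the patent)."""
    midpoint = ((pos_a[0] + pos_b[0]) / 2, (pos_a[1] + pos_b[1]) / 2)
    safe = [c for c in candidates if c.traffic_volume >= min_traffic]
    if not safe:
        return None
    return min(safe, key=lambda c: distance_km(c.position, midpoint))
```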
The merging/break-away point determination unit 144 outputs information on each of the determined merging point and break-away point to the information providing unit 150. When the co-rider extraction unit 142 has output information indicating that the matching information 116 is stored in the storage unit 110, the merging/break-away point determination unit 144 may update the matching information 116 stored in the storage unit 110 by adding to it the information on each of the determined merging point and break-away point. In this case, instead of the information on each of the determined merging point and break-away point, the merging/break-away point determination unit 144 may output information indicating that the matching information 116 has been updated to the information providing unit 150.
The information providing unit 150 predicts the times at which the user U1 and the user U2 will each move, based on the terminal position information of each terminal device T acquired via the communication unit 120, the vehicle owner information 112 and the co-rider information 114 stored in the storage unit 110, and the information on the merging point and the break-away point output from the matching processing unit 140 (more specifically, the merging/break-away point determination unit 144). The information providing unit 150 transmits information on each estimated time to the terminal devices T1 and T2 via the communication unit 120. The information providing unit 150 is an example of the "providing unit" in the claims. The information providing unit 150 includes, for example, a time estimation unit 152 and a notification unit 154.
When the information on the merging point and the break-away point is input from the matching processing unit 140, the time estimation unit 152 calculates various times related to the movement of the user U1 and the user U2 based on the vehicle owner information 112 and the co-rider information 114 stored in the storage unit 110, the terminal position information of each terminal device T acquired via the communication unit 120, and the current time. The time estimation unit 152 calculates a merging period during which the user U1 and the user U2 meet at the merging point, based on the information on the merging point and the terminal position information of each terminal device T. The merging period is a period that includes the estimated time at which the user U1 arrives at the merging point and the estimated time at which the user U2 arrives at the merging point. After the user U1 and the user U2 have met at the merging point and boarded the vehicle M, the time estimation unit 152 calculates the estimated time at which they arrive at the break-away point, based on the break-away point, the terminal position information, and the vehicle position information. Here, the position information used by the time estimation unit 152 for this calculation may be any of the terminal position information of the terminal device T1, the terminal position information of the terminal device T2, and the vehicle position information of the vehicle M, because the user U1 and the user U2 are traveling in the same vehicle M and each piece of position information therefore indicates the same position. The time estimation unit 152 may calculate the estimated arrival time at the break-away point when the position of the vehicle M carrying the user U1 and the user U2 comes within a predetermined range of the break-away point (for example, within 10 km of the break-away point). The time estimation unit 152 also calculates, based on the information on the break-away point and on the destination of each of the user U1 and the user U2, the time at which each user U arrives at his or her destination after parting, that is, the estimated time at which each user U returns home. In this case, when information on a facility to be stopped at on the way to the destination is registered in the vehicle owner information 112 and/or the co-rider information 114, the time estimation unit 152 may calculate the estimated home-arrival time taking the time spent there into account. The estimated home-arrival time of the user U1 is an example of the "first expected time" in the claims, and the estimated home-arrival time of the user U2 is an example of the "second expected time" in the claims.
The time estimation unit 152 calculates these various times by estimating a route from the current position of the user U1 and/or the user U2 to the point in question, based on the terminal position information (which may include vehicle position information) of each terminal device T acquired via the communication unit 120 and with reference to the map information 118, and by computing each time on the assumption that each user U moves along the estimated route. The time estimation unit 152 outputs the calculated information on each time to the notification unit 154. When the matching processing unit 140 (more specifically, the merging/break-away point determination unit 144) has output information indicating that the matching information 116 has been updated, the time estimation unit 152 may calculate the various times based on the matching information 116 stored in the storage unit 110 and update the matching information 116 by adding the calculated information on each time to it. In this case, the time estimation unit 152 may output information indicating that the times have been added to the matching information 116 to the notification unit 154, instead of the calculated information on each time.
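As an illustration of these time calculations, the following sketch estimates the merging period and downstream arrival times from route lengths and assumed average speeds, reusing LatLon and distance_km from the sketches above. The speeds, function names, and the use of segment-wise straight-line route lengths are assumptions; the patent only states that times are computed by moving each user along a route estimated from the map information 118.

```python
from datetime import datetime, timedelta
from typing import List, Tuple

def route_length_km(route: List[LatLon]) -> float:
    """Sum of segment lengths along an estimated route."""
    return sum(distance_km(a, b) for a, b in zip(route, route[1:]))

def eta(now: datetime, route: List[LatLon], speed_kmh: float) -> datetime:
    """Estimated arrival time when moving along `route` at `speed_kmh`."""
    return now + timedelta(hours=route_length_km(route) / speed_kmh)

def merging_period(now: datetime,
                   owner_route_to_mp: List[LatLon],
                   corider_route_to_mp: List[LatLon]) -> Tuple[datetime, datetime]:
    """Period covering both users' estimated arrivals at the merging point.
    Assumed speeds: the owner drives (about 30 km/h in town), the co-rider walks (about 4 km/h)."""
    owner_eta = eta(now, owner_route_to_mp, speed_kmh=30.0)
    corider_eta = eta(now, corider_route_to_mp, speed_kmh=4.0)
    return min(owner_eta, corider_eta), max(owner_eta, corider_eta)
```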
Fig. 4 is a diagram showing an example of the matching information 116 in the co-ride support device 100 according to the embodiment. The matching information 116 shown in Fig. 4 has been updated by the time estimation unit 152. The matching information 116 is, for example, information that associates, with the vehicle owner ID of the vehicle owner, the vehicle owner's current position, the mobile body position, the co-rider ID, the co-rider's current position, the merging point, the estimated time at which the vehicle owner arrives at the merging point, the estimated time at which the co-rider arrives at the merging point, the break-away point, the estimated time of arrival at the break-away point, the vehicle owner's estimated home-arrival time, and the co-rider's estimated home-arrival time. The vehicle owner's current position is the current position picked out of the vehicle owner information 112 by the co-rider extraction unit 142. The mobile body position is the mobile body position picked out of the vehicle owner information 112 by the co-rider extraction unit 142; it may be omitted as described above. The co-rider ID is the co-rider ID of the co-rider extracted (determined) by the co-rider extraction unit 142. The co-rider's current position is the current position picked out (determined) from the co-rider information 114 by the co-rider extraction unit 142. The merging point is information indicating the position of the merging point determined by the merging/break-away point determination unit 144. The vehicle owner's estimated arrival time is the estimated time, calculated by the time estimation unit 152, at which the vehicle owner arrives at the merging point, and the co-rider's estimated arrival time is the estimated time, calculated by the time estimation unit 152, at which the co-rider arrives at the merging point. The break-away point is information indicating the position of the break-away point determined by the merging/break-away point determination unit 144. The estimated time of arrival at the break-away point is the estimated time, calculated by the time estimation unit 152, at which the vehicle owned by the vehicle owner, carrying the co-rider, arrives at the break-away point. The vehicle owner's estimated home-arrival time is the estimated time, calculated by the time estimation unit 152, at which the vehicle owner arrives at the destination after parting at the break-away point, and the co-rider's estimated home-arrival time is the estimated time, calculated by the time estimation unit 152, at which the co-rider arrives at the destination after parting at the break-away point. The notification unit 154 performs various notifications using this information.
The matching information 116 is not limited to the information shown in Fig. 4, and may include various other information, such as information on the vehicle owner's route to the merging point, information on the co-rider's route to the merging point, information on the shared route along which the vehicle owner and the co-rider travel together from the merging point to the break-away point, information on the vehicle owner's homeward route from the break-away point to the destination, and information on the co-rider's homeward route from the break-away point to the destination.
Returning to Fig. 1, the notification unit 154 transmits various information to the terminal device T1 and/or the terminal device T2 at appropriate timings, using the communication unit 120, based on the current time and the terminal position information (which may include vehicle position information) of each terminal device T acquired via the communication unit 120. The application executed on each terminal device T thereby notifies the user U1 and the user U2 of the information from the co-ride support device 100. The notification unit 154 notifies the user U1 and the user U2 of at least information such as the merging point and/or break-away point and the merging period included in the matching information 116. Before notifying the user U1 and the user U2 of the merging point, the break-away point, and the merging period, the notification unit 154 may, for example, transmit to the terminal device T1 information indicating that there is a user U2 who wishes to ride together in the vehicle M, thereby notifying the user U1. In this case, when the user U1 consents to the co-riding of the user U2 in response to this notification, the notification unit 154 may, for example, transmit to the terminal device T2 information indicating that the user U2 can ride in the vehicle M, thereby notifying the user U2. The notification unit 154 may also use the map information 118 to transmit, to the terminal device T1 and the terminal device T2, information for guiding the user U1 and the user U2 along the routes to the merging point, the break-away point, or the destinations. In this case, the application uses the navigation function provided in each terminal device T to display the routes to the merging point and the break-away point on the terminal device T and/or to output them as sound.
The notification unit 154 notifies the user U1 and the user U2 of the estimated arrival times and the estimated home-arrival times of each user U included in the matching information 116. The notification unit 154 may also cause the communication unit 120 to transmit the information on the estimated arrival times and the estimated home-arrival times, via the network NW, to the terminal devices of persons related to each user U, such as the family of the user U1 and the family of the user U2. The notification unit 154 may transmit the information on the estimated arrival time and the estimated home-arrival time to at least the terminal device TR of the user UR, who is a person related to the user U2; that is, the notification unit 154 need not transmit this information to the terminal devices of persons related to the user U1. This is because the following is assumed: while the user U1 continues traveling (returns home) in the vehicle M after the break-away point, the user U2 gets out of the vehicle M at the break-away point and returns home, for example, on foot, so a family member of the user U2 may come to meet the user U2 at the break-away point.
After the users U part at the break-away point, the notification unit 154 may transmit to each terminal device T instruction information for activating a voice communication function, such as the telephone function, provided in the terminal device T of each user U, and cause the applications to carry out voice communication between the terminal device T1 and the terminal device T2. The voice communication may be a so-called voice call realized by the telephone function, or may be voice communication in which the sound collected by one party is also transmitted to a security organization, such as a security company, that monitors the safety of each user U. In this way, even without the users U conversing, the sound around each user U until reaching the destination is shared between the users U, and if something seems wrong with one user U, the other user U can contact the security company or the like. Alternatively, a monitor belonging to a security company or the like can listen to the sound around each user U until the destination is reached, and thereby watch for situations that the user U has not noticed. In this case, instead of the instruction information for voice communication between the terminal device T1 and the terminal device T2, the notification unit 154 may transmit to each terminal device T instruction information for activating a sound collection function, such as a microphone, provided in the terminal device T of each user U, and transmit the sound collected by each terminal device T to the security company or the like. The monitoring of the sound by the security company or the like can also be performed by automatic recognition using, for example, a sound analysis system. The sound analysis system determines whether an unexpected situation has occurred based on, for example, the frequency, the volume, and the pattern of change in volume of the received sound. The sound analysis system may also determine whether an unexpected situation has occurred by inputting the sound signal into a trained model.
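A toy version of the rule-based check mentioned above (volume and volume-change pattern) could look like the following sketch; the thresholds, frame length, and feature choices are purely illustrative assumptions, and a trained model would replace this heuristic in the alternative the description mentions.

```python
import numpy as np

def unexpected_situation(samples: np.ndarray, sample_rate: int = 16000,
                         loud_rms: float = 0.3, jump_ratio: float = 5.0) -> bool:
    """Flag sound that is either very loud overall or shows an abrupt jump in
    volume between consecutive short frames (e.g. a scream or impact).
    Thresholds are illustrative, not from the patent."""
    frame = sample_rate // 10  # 100 ms frames
    n = len(samples) // frame
    if n < 2:
        return False
    frames = samples[:n * frame].reshape(n, frame)
    rms = np.sqrt(np.mean(frames ** 2, axis=1)) + 1e-9  # per-frame volume
    too_loud = rms.max() > loud_rms
    abrupt_jump = np.any(rms[1:] / rms[:-1] > jump_ratio)
    return bool(too_loud or abrupt_jump)
```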
Based on the information on each user U's estimated home-arrival time included in the matching information 116, the notification unit 154 transmits to the terminal device T1 and the terminal device T2, for example at the timings when the user U1 and the user U2 are each expected to have returned (arrived) home, information for confirming whether the user U1 and the user U2 have arrived safely at their destinations (their homes). The information for confirming whether each user U has arrived at the destination is, for example, information that causes the terminal device T to display a report screen prompting an operation for reporting that the destination has been reached. The notification unit 154 confirms that each user U has arrived safely at the destination by acquiring, via the communication unit 120, the report operation performed by each user U on the report screen displayed on the terminal device T. The notification unit 154 may transmit to each other's terminal devices T a notification that the operation reporting arrival at the destination has been performed. This notification may also be transmitted to the terminal devices of persons related to each user U. In this way, each user U and the persons related to each user U can confirm that the user U with whom they parted at the break-away point has arrived safely at the destination. The notification unit 154 may also report the case where, after the information for displaying the report screen has been transmitted to each terminal device T, one or both of the users U have not reported arrival at the destination by the time the user U is considered to have returned home, or by the time a predetermined time (for example, 15 minutes) has elapsed from that time. This report is transmitted, for example, to the terminal device T of the user U who has performed the arrival-report operation and to the terminal device of a person related to the user U who has not performed the arrival-report operation. The notification unit 154 may also report this to a security company or the like.
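The confirmation-and-escalation behavior described here can be sketched as a small check run periodically. The function names, the report store, and the escalation targets are illustrative assumptions; the only fixed idea taken from the description is a grace period (for example, 15 minutes) after the expected home-arrival time.

```python
from datetime import datetime, timedelta
from typing import Dict, Callable

def check_arrival_reports(now: datetime,
                          expected_home: Dict[str, datetime],   # user ID -> estimated home-arrival time
                          reported: Dict[str, bool],            # user ID -> arrival report received?
                          notify: Callable[[str, str], None],   # (recipient, message) -> None
                          grace: timedelta = timedelta(minutes=15)) -> None:
    """Send the report-screen prompt at the expected time, and escalate to the
    other user and to the non-reporting user's related person after the grace
    period. For brevity, this sketch does not track which prompts were already sent."""
    for user_id, t_home in expected_home.items():
        if now >= t_home and not reported.get(user_id, False):
            notify(user_id, "Please confirm that you have arrived home.")
        if now >= t_home + grace and not reported.get(user_id, False):
            for other_id in expected_home:
                if other_id != user_id and reported.get(other_id, False):
                    notify(other_id, f"No arrival report from {user_id} yet.")
            notify(f"related:{user_id}", f"No arrival report from {user_id} yet.")
```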
After the users U part at the break-away point, the notification unit 154 may also check the movement status of each user U toward the destination (home) based on the terminal position information transmitted from each terminal device T. When it is confirmed that a user U has deviated greatly from the route to the destination, the notification unit 154 may notify the user U, a person related to the user U, a security company, or the like that the user has deviated from the homeward route. However, a user U may also stop at some facility before reaching the destination. Therefore, when information on a facility the user U stops at is set (registered) in advance in the vehicle owner information 112 or the co-rider information 114 of the user U, the notification unit 154 sets a geofence that includes that facility. A geofence is a virtual boundary set to correspond to real space. Based on the information on the set geofence and the terminal position information of the terminal device T, the notification unit 154 does not issue a deviation notification as long as it is confirmed that the user U has not gone beyond the geofence. In this case, the time estimation unit 152 may recalculate the user U's estimated home-arrival time taking this into account.
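A minimal sketch of such a geofence check, reusing LatLon, List, and distance_km from the sketches above and assuming a simple circular fence around the registered stop-by facility (the patent does not specify the fence shape), could be:

```python
def within_geofence(position: LatLon, fence_center: LatLon, fence_radius_km: float) -> bool:
    """True while the user's reported position stays inside the circular geofence
    set around a registered stop-by facility."""
    return distance_km(position, fence_center) <= fence_radius_km

def should_notify_deviation(position: LatLon, home_route: List[LatLon],
                            fence_center: LatLon, fence_radius_km: float = 0.5,
                            route_tolerance_km: float = 1.0) -> bool:
    """Notify only when the user is both far from the homeward route and outside
    the geofence; the tolerances are illustrative assumptions."""
    off_route = all(distance_km(position, p) > route_tolerance_km for p in home_route)
    return off_route and not within_geofence(position, fence_center, fence_radius_km)
```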
[ Flow of overall processing ]
Next, an example of the overall processing in the co-ride support device 100 will be described. Fig. 5 is a diagram schematically showing an example of the movement of matched users U in the co-ride support device 100 according to the embodiment. Fig. 5 shows, for example, a case where the user U1, who owns the vehicle M and is returning from the workplace WD to the home HD as the destination, lets the user U2, who is returning from the workplace WP to the home HP as the destination, ride together in a section of the homeward route. In the following description, it is assumed that the user U1 has operated the terminal device T1 (application) to give a co-ride consent for the vehicle M, and that the user U2 has operated the terminal device T2 (application) to make a co-ride request for riding in a vehicle.
The co-ride support device 100 extracts, from among the users U who have made co-ride promises and the users U who have made co-ride requests, the user U2 as a co-rider to ride together on the vehicle M. The co-ride support device 100 then starts processing for having the user U2 ride on the vehicle M. First, the co-ride support device 100 determines the merging point MP and the break-up point DP, and notifies the user U1 and the user U2 of the determined merging point MP and break-up point DP, respectively. The user U1 then drives the vehicle M to the merging point MP, the user U2 goes to the merging point MP on foot or the like, and the two meet at the merging point MP.
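As an illustration of these extraction and point-determination steps, the sketch below shows one way a server might pick the co-rider whose route overlaps the owner's route the most and then choose the merging point MP and break-up point DP from shared points whose safety index satisfies a criterion (compare the index value based on environmental information mentioned in the claims). The data structures, the overlap measure, and the 0.7 threshold are assumptions for illustration, not part of the patent.

    from dataclasses import dataclass

    @dataclass
    class Owner:
        user_id: str
        route: list        # ordered point ids on the owner's return route
        seats_free: int

    @dataclass
    class Candidate:
        user_id: str
        route: list        # ordered point ids on the candidate's route home
        request_active: bool

    def extract_co_rider(owner, candidates):
        # Pick the candidate whose route shares the most points with the owner's route.
        def overlap(c):
            return len(set(owner.route) & set(c.route))
        eligible = [c for c in candidates if c.request_active and overlap(c) > 0]
        return max(eligible, key=overlap, default=None)

    def choose_points(owner, rider, safety_index, threshold=0.7):
        # MP = first shared point whose safety index meets the criterion,
        # DP = last such shared point along the owner's route.
        shared = [p for p in owner.route if p in set(rider.route)]
        safe = [p for p in shared if safety_index.get(p, 0.0) >= threshold]
        return (safe[0], safe[-1]) if len(safe) >= 2 else None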
The user U1 then lets the user U2 ride on the vehicle M and heads for the break-up point DP. On reaching the break-up point DP, the user U1 lets the user U2 get off the vehicle M and returns to the home HD. The user U2, for example, returns to the home HP on foot or the like.
In this way, the user U1 and the user U2 ride on the same vehicle M in the section from the merging point MP to the break-up point DP notified by the co-ride support device 100. In this case, the user U2 can pay a use fee corresponding to the section over which the vehicle M was shared, and the user U1 can collect the use fee corresponding to the section over which the user U2 rode. Payment and collection of the use fee may be managed by a company operating the service.
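The text only states that the use fee corresponds to the section ridden together; as a hedged example, a simple distance-proportional charge might look like the following, where the base fee and per-kilometer rate are invented purely for illustration.

    def shared_section_fee(distance_km, base_fee=100, per_km=30):
        # Fee charged to the co-rider (and credited to the owner) for the MP-to-DP section.
        return int(round(base_fee + per_km * distance_km))

    print(shared_section_fee(4.2))  # 226 for a 4.2 km shared section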
[ Processing by the co-ride support device 100 ]
Next, an example of the processing in the co-ride support device 100 will be described. Fig. 6 is a timing chart showing an example of the flow of processing performed by the co-ride support device 100 according to the embodiment. Figs. 7 to 12 are diagrams showing examples of the information screens through which information is provided in the processing of the co-ride support device 100 according to the embodiment, that is, examples of the images that the co-ride support device 100 causes to be displayed in each scene in order to provide information to each user U. In the following description of the processing shown in Fig. 6, the images of the information providing screens shown in Figs. 7 to 12 are referred to as appropriate.
In the following description, it is assumed that the terminal device T1 and the terminal device T2 each execute the application, and that the co-ride support device 100 notifies the user U1 and the user U2 by transmitting the information of the images to be displayed to the terminal device T1 and the terminal device T2, respectively. In the co-ride support device 100, as described above, the information acquisition unit 130, the matching processing unit 140 (the co-rider extraction unit 142 and the merging/break-up point determination unit 144), and the information providing unit 150 (the time estimation unit 152 and the notification unit 154) exchange information via the communication unit 120 and the network NW, but for ease of explanation the components of the co-ride support device 100 are assumed to exchange information directly.
The user U2 operates the terminal device T2 (application) to carry out the procedure for a co-ride request for riding on a vehicle. The application executed on the terminal device T2 transmits the co-ride request of the user U2 to the co-ride support device 100 (step S100). The processing of step S100 may be performed before the user U2 reaches the workplace WP, after the user U2 reaches the workplace WP, or when the user U2 leaves the workplace WP. The information acquisition unit 130 receives the co-ride request transmitted from the terminal device T2 in step S100 (step S102).
Meanwhile, the user U1 operates the terminal device T1 (application) to carry out the procedure for a co-ride promise for the vehicle M. The application executed on the terminal device T1 transmits the co-ride promise of the user U1 for the vehicle M to the co-ride support device 100 (step S200). The processing of step S200 may be performed before the user U1 reaches the workplace WD, after the user U1 reaches the workplace WD, or when the user U1 leaves the workplace WD. The information acquisition unit 130 receives the co-ride promise for the vehicle M transmitted from the terminal device T1 in step S200 (step S202).
In the co-ride support device 100, the processing of steps S100 and S102 and the processing of steps S200 and S202 are not limited to the order shown in Fig. 6, and may be performed simultaneously or in reverse order.
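For concreteness, the co-ride promise and co-ride request received in steps S200 and S100 might carry payloads along the following lines; the patent does not define these fields, so every name here is an assumption.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class CoRidePromise:                       # sent from the owner's terminal device T1
        user_id: str
        vehicle_id: str
        departure_point: Tuple[float, float]   # e.g. the workplace WD
        destination: Tuple[float, float]       # e.g. the home HD
        seats_free: int = 1

    @dataclass
    class CoRideRequest:                       # sent from the co-rider's terminal device T2
        user_id: str
        departure_point: Tuple[float, float]   # e.g. the workplace WP
        destination: Tuple[float, float]       # e.g. the home HP
        related_person_contact: Optional[str] = None  # e.g. the user UR's terminal, if registered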
When the co-ride request is received in step S102 and the co-ride promise is received in step S202, the co-rider extraction unit 142 extracts, based on the co-ride promise and the co-ride request output from the information acquisition unit 130, the user U2 who has made a co-ride request as a co-rider for the vehicle M of the user U1 who made the co-ride promise (step S300). The merging/break-up point determination unit 144 determines the merging point and the break-up point for the user U1 and the user U2 (step S302). The time estimation unit 152 calculates the merging period in which the user U1 and the user U2 join, based on the information of the merging point and the terminal position information of each terminal device T (step S304). The timing at which the time estimation unit 152 calculates the merging period is, for example, the timing at which it is confirmed that the user U1 has left a predetermined range around the current position (here, the workplace WD); in other words, the timing at which the user U1 sets off on the return trip. The confirmation that the user U1 has left the predetermined range may be performed by the co-ride support device 100 based on the terminal position information of the terminal device T1, or may be performed when, for example, a communication function provided in the vehicle M transmits information indicating that the engine of the vehicle M has been started to the co-ride support device 100, the communication unit 120 receives this information, and the information acquisition unit 130 acquires it. In the latter case, even when the parking lot where the vehicle M is parked is located away from the current position of the user U1 (for example, the workplace WD), the time estimation unit 152 can calculate the merging period more accurately based on the time at which the engine of the vehicle M is started. The engine is an example of the "driving unit" in the present invention, and the information indicating the start of the engine is an example of the "operation information" in the present invention.
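A minimal sketch of the merging-period calculation and its two triggers follows, assuming constant travel speeds and a 200 m departure range; these values and names are not from the patent, which only specifies the trigger conditions (the terminal leaving a predetermined range, or receipt of engine-start operation information from the vehicle M).

    import datetime as dt

    WALK_SPEED_MPS = 1.3    # assumed walking speed of the co-rider
    DRIVE_SPEED_MPS = 8.0   # assumed average driving speed of the vehicle M

    def estimate_merging_time(now, owner_pos, rider_pos, merging_point, dist_fn):
        # Merging period = the later of the two estimated arrivals at MP.
        owner_eta = now + dt.timedelta(seconds=dist_fn(owner_pos, merging_point) / DRIVE_SPEED_MPS)
        rider_eta = now + dt.timedelta(seconds=dist_fn(rider_pos, merging_point) / WALK_SPEED_MPS)
        return max(owner_eta, rider_eta)

    def on_departure_trigger(trigger, state, dist_fn, departure_range_m=200.0):
        # Recalculate the merging period once the owner has set off: either the
        # terminal device T1 left the predetermined range of its starting point,
        # or engine-start operation information arrived from the vehicle M.
        if trigger["type"] == "position":
            if dist_fn(trigger["position"], state["owner_start_pos"]) <= departure_range_m:
                return None   # the owner has not left the predetermined range yet
            owner_pos = trigger["position"]
        else:                 # "engine_start": use the registered parking position
            owner_pos = state["vehicle_parking_pos"]
        return estimate_merging_time(dt.datetime.now(), owner_pos,
                                     state["rider_pos"], state["merging_point"], dist_fn)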
The notification unit 154 transmits information on the merging point, the break-up point, and the merging period to the terminal device T1 to notify the user U1 (step S306). The notification unit 154 likewise transmits information on the merging point, the break-up point, and the merging period to the terminal device T2 to notify the user U2 (step S308). Fig. 7 shows an example of the notification screen displayed on the terminal device T1 to notify the user U1 in step S306, and Fig. 8 shows an example of the notification screen displayed on the terminal device T2 to notify the user U2 in step S308. As shown in Fig. 7, the terminal device T1 displays an image including the information on the merging point MP, the break-up point DP, and the merging period. The image may be displayed automatically by the push notification function of the application, or may be displayed when the user U1 performs a viewing operation after being informed that a notification has been received from the co-ride support device 100. As shown in Fig. 8, the terminal device T2 displays an image including the information on the merging point MP, the break-up point DP, and the merging period. As shown in Fig. 8, this image may also include information on the vehicle M. The image may be displayed by the push notification function of the application or by a viewing operation performed by the user U2 in response to the notification. Based on these notifications, the user U1 and the user U2 head for the merging point MP.
Returning to Fig. 6, when the user U1 and the user U2 each reach the merging point MP, the user U2 gets on the vehicle M at the merging point MP and heads for the break-up point DP together with the user U1. The notification unit 154 confirms with either or both of the user U1 and the user U2 that the user U2 has gotten on the vehicle M (step S400). Fig. 9 shows an example of the confirmation screen displayed on the terminal device T2 in step S400 to confirm with the user U2 whether the user U2 was able to ride on the vehicle M. As shown in Fig. 9, the terminal device T2 displays an image including information for confirming with the user U2 whether the vehicle M could be boarded. The image may be displayed by the push notification function of the application or the like. As shown in Fig. 9, this image includes two answer buttons: the answer button BY for answering that the user U2 was able to ride on the vehicle M, and the answer button BN for answering that the user U2 was not able to ride on the vehicle M. The user U2 can answer the notification unit 154 as to whether the vehicle M could be boarded by touching one of the answer button BY and the answer button BN displayed at the lower part of the confirmation screen. Fig. 9 shows an example of the state in which the user U2 has touched the answer button BY to answer that the vehicle M could be boarded. When the answer button BY is touched, the terminal device T2 transmits information indicating that the user U2 was able to ride on the vehicle M to the co-ride support device 100 (more specifically, to the notification unit 154). When the answer button BN is touched, the terminal device T2 transmits information indicating that the user U2 was not able to ride on the vehicle M to the co-ride support device 100.
Returning to Fig. 6, when it is confirmed that the user U2 was able to get on the vehicle M, the time estimation unit 152 calculates the estimated arrival time at which the user U1 and the user U2 will reach the break-up point DP (step S402). As described above, the time estimation unit 152 may calculate the estimated arrival time at the break-up point DP when the user U1 and the user U2 come within a predetermined range of the break-up point DP. The notification unit 154 transmits information on the estimated arrival time to at least the terminal device TR to notify the user UR (step S404). Fig. 10 shows an example of the notification screen displayed on the terminal device TR to notify the user UR in step S404. As shown in Fig. 10, the terminal device TR displays an image including information on the break-up point DP and the estimated arrival time at which the user U2 will reach the break-up point DP. As shown in Fig. 10, the image may also include information on the user U1 and the vehicle M on which the user U2 is riding. The image may be displayed by the push notification function of the application or the like. Based on this notification, the user UR can, for example, go to the break-up point DP to meet the user U2.
Returning to Fig. 6, the co-ride support device 100 confirms whether the vehicle M carrying the user U2 has reached the break-up point DP (step S500). When it is confirmed in step S500 that the vehicle M has reached the break-up point DP, the time estimation unit 152 calculates, based on the break-up point and the information on the destinations of the user U1 and the user U2, the time at which each user U will reach the destination after breaking up, that is, the estimated return-home time at which each user U returns to his or her own home (step S502). The notification unit 154 transmits information indicating that the vehicle M has reached the break-up point DP and information on the estimated return-home times to at least the terminal device TR to notify the user UR (step S504). Then, the notification unit 154 transmits instruction information for starting the voice communication function to the applications executed on the terminal device T1 and the terminal device T2, respectively, so that voice communication is performed (step S506).
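Steps S402 to S502 could be sketched as follows, again with assumed speeds and an assumed 1 km trigger range for "within a predetermined range" of the break-up point; the structures are illustrative only.

    import datetime as dt

    WALK_SPEED_MPS = 1.3
    DRIVE_SPEED_MPS = 8.0

    def eta_at_break_up_point(now, vehicle_pos, break_up_point, dist_fn, trigger_range_m=1000.0):
        # Estimated arrival time at DP, computed once the vehicle M carrying the
        # co-rider enters the assumed trigger range of the break-up point.
        d = dist_fn(vehicle_pos, break_up_point)
        if d > trigger_range_m:
            return None
        return now + dt.timedelta(seconds=d / DRIVE_SPEED_MPS)

    def return_home_estimates(arrival_at_dp, break_up_point, owner_home, rider_home, dist_fn):
        # Estimated return-home times after the users break up at DP: the owner
        # continues by vehicle, the co-rider continues on foot.
        owner_eta = arrival_at_dp + dt.timedelta(seconds=dist_fn(break_up_point, owner_home) / DRIVE_SPEED_MPS)
        rider_eta = arrival_at_dp + dt.timedelta(seconds=dist_fn(break_up_point, rider_home) / WALK_SPEED_MPS)
        return {"owner": owner_eta, "rider": rider_eta}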
Then, at the timing when the user U1 and the user U2 each arrive at the destination, the notification unit 154 transmits information on a confirmation screen for confirming whether the user U1 and the user U2 have safely arrived at the destinations (their own homes) to the terminal device T1 and the terminal device T2 (step S600). The user U1 and the user U2 each respond to (notify) the notification unit 154 that the destination (own home) has been reached by operating the confirmation screen displayed on the terminal device T (steps S602 and S604). Fig. 11 shows an example of the confirmation screen displayed on the terminal device T2 in step S600 to confirm (inquire of) the user U2 whether the destination (home) has been safely reached. As shown in Fig. 11, the terminal device T2 displays an image including information for confirming with the user U2 whether the destination has been safely reached. The image may be displayed by the push notification function of the application or the like. As shown in Fig. 11, the image includes an answer button BA for reporting that the destination has been reached. The user U2 can notify (report to) the notification unit 154 that the destination (home) has been safely reached by touching the answer button BA displayed at the lower part of the confirmation screen. Fig. 11 shows an example of the state in which the user U2 has touched the answer button BA in step S602 and reported that the destination (home) has been safely reached. When the answer button BA is touched, the terminal device T2 transmits information indicating that the user U2 has safely arrived at the destination to the co-ride support device 100 (more specifically, to the notification unit 154). The information of the confirmation screen is an example of the "confirmation information" in the claims.
Returning to Fig. 6, the notification unit 154 then confirms, for each of the user U1 and the user U2, whether a predetermined time has elapsed since the estimated return-home time (step S700). When the predetermined time has elapsed in step S700, the notification unit 154 checks whether an answer indicating that the destination has been reached has been obtained from the user U (step S702). When it is confirmed in step S702 that the answer indicating arrival at the destination has been obtained from the user U, the co-ride support device 100 ends the present processing. Thereafter, the co-ride support device 100 can carry out a settlement procedure for collecting the use fee from the user U2 and paying the use fee to the user U1.
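The check in steps S700 and S702 amounts to a timeout on the arrival report. A hedged sketch, reusing the 15-minute figure mentioned earlier as the predetermined time (everything else is assumed):

    import datetime as dt

    GRACE = dt.timedelta(minutes=15)   # the predetermined time; 15 minutes per the example above

    def arrival_check(now, estimated_home_time, reply_received, grace=GRACE):
        # Returns 'ok', 'wait', or 'escalate' for one user.
        if reply_received:
            return "ok"
        if now < estimated_home_time + grace:
            return "wait"
        return "escalate"   # report to the other user, the related person, or a security company

    def users_to_escalate(now, statuses):
        # statuses: {user_id: (estimated_home_time, reply_received)}
        return [uid for uid, (expected, replied) in statuses.items()
                if arrival_check(now, expected, replied) == "escalate"]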
Here, the example of the confirmation screen shown in Fig. 11 assumes that the user U2 goes directly from the break-up point DP to the destination (home), but it is also conceivable that the user U2 stops at some facility before going to the destination (home). In that case, the confirmation screen shown in Fig. 11 may be provided with a report button for reporting a visit to a facility and an input area for entering the expected time to be spent at the facility. When the user U2 reports the facility to be visited and the expected time to be spent there, the notification unit 154 may change the predetermined time checked in step S700 in accordance with that report.
On the other hand, if it is not confirmed in step S702 that the answer indicating arrival at the destination has been obtained from the user U, the notification unit 154 reports that there is a user U from whom no answer has been received (step S704). Fig. 12 shows an example of the report screen for reporting to the user UR, in step S702, that no answer indicating arrival at the destination (own home) has been received from the user U2 and that arrival could not be confirmed. As shown in Fig. 12, the terminal device TR displays an image including information indicating that no answer about arrival at the destination has been obtained from the user U2. As shown in Fig. 12, this image may include, for example, information on the arrival time (or estimated arrival time) at the break-up point DP and the estimated return-home time. The image may be displayed by the push notification function of the application or the like. As shown in Fig. 12, the image includes information for asking the user UR whether the user U2 has safely arrived at the destination, together with two answer buttons: the answer button BY for answering that the user U2 has returned home, and the answer button BN for answering that the user U2 has not returned home. The user UR can answer the notification unit 154 as to whether the user U2 has reached the destination (home) by, for example, scrolling the report screen upward and touching one of the answer button BY and the answer button BN displayed at the lower part of the report screen. Fig. 12 shows an example of the state in which the user UR has touched the answer button BY and answered that the user U2 has returned home. When the answer button BY is touched, the terminal device TR transmits information indicating that the user U2 has safely arrived at the destination to the co-ride support device 100 (more specifically, to the notification unit 154). When the answer button BN is touched, the terminal device TR transmits information indicating that the user U2 has not arrived at the destination to the co-ride support device 100. In this case, the notification unit 154 may report (notify) to a security company or the like that the user U2 has not arrived at the destination.
When the answer indicating arrival at the destination has not been confirmed from one user U but has been confirmed from the other user U in step S702, the notification unit 154 may send a report similar to the example of the report screen shown in Fig. 12 to the user U whose answer has been confirmed, either first or at the same time as the other recipients. For example, when the answer from the user U1 has been confirmed but the answer from the user U2 has not, the notification unit 154 can report to the user U1 that no answer has been obtained from the user U2. This is because the user U1 and the user U2 are expected to arrive at their respective destinations at around the same time, and if the report from the co-ride support device 100 reaches the user U1 at that point, the user U1 can respond immediately, for example by returning to the break-up point DP. The user U1, who has already arrived at the destination (returned home), is considered to be in a state where he or she can move more immediately than the user UR, whose current situation is unknown, and is therefore likely to be able to respond earlier than the user UR.
Here, the example of the report screen shown in Fig. 12 also assumes that the user U2 goes directly from the break-up point DP to the destination (home), but, as described above, the user U2 may also stop at some facility on the way. Therefore, the notification unit 154 may refrain from the notification based on the report screen shown in Fig. 12 while the current position of the user U2 is within the range of the set geofence, and may perform the notification when the current position of the user U2 goes outside the range of the geofence. Depending on the break-up point DP actually reached by the user U2 riding on the vehicle M, the appropriate geofence may differ from the geofence set (registered) in advance. Therefore, for example, a setting area for setting a geofence may be provided on the confirmation screen shown in Fig. 11 so that the user U2 can set a geofence after parting from the user U1 at the break-up point DP.
Through the above procedure, the co-ride support device 100 matches the user U1 who owns the vehicle M with the user U2 who wishes to ride together. In doing so, the co-ride support device 100 sets, as the merging point where the matched user U1 and user U2 meet and the break-up point where they part, safe points where many other people are present and watching. The co-ride support device 100 also provides (notifies) information not only to the users U riding together but also to the persons associated with the users U, and confirms whether the matched user U1 and user U2 have each safely reached their respective destinations.
As described above, according to the co-ride support device 100 of the embodiment, the user U2 who rides on the vehicle M with the user U1 is matched using, as the merging point and/or the break-up point, a highly safe point where many other people are present. The co-ride support device 100 of the embodiment also reports the movement status of each user U to the associated persons as it progresses. Furthermore, when an unforeseen situation is suspected for any of the users U riding on the vehicle M, the co-ride support device 100 of the embodiment reports this to the associated persons, including a security company or the like. The person who receives the report can thus respond to the suspected unforeseen situation earlier. As a result, the user U1 and the user U2 (including the persons associated with each user U) can use the service provided by the co-ride support device 100 with greater peace of mind.
In the example of the processing flow of the co-ride support device 100 described above, the current position of the user U1 is the workplace WD, the current position of the user U2 is the workplace WP, and the user U2 rides on the vehicle M of the user U1 when the user U1 and the user U2 return home from their respective workplaces. However, the timing at which the user U2 makes a co-ride request to the co-ride support device 100 may also come after the user U1, for whom no co-ride with the user U2 has yet been determined, has departed from the workplace WD. It is also conceivable that the co-ride request from the user U2 arrives after the user U1 has already gotten on the vehicle M and passed the merging point MP. In this case, the co-ride support device 100 may transmit information indicating that there is a user U2 who wishes to ride on the vehicle M to the terminal device T1 to notify the user U1, and confirm whether the user U1 is willing, for example, to return from the current position to the merging point MP to pick up the user U2. The user U1 can then answer whether to allow the user U2 to ride, taking the current situation into consideration. When the user U1 makes a co-ride promise for the user U2 based on this notification, the co-ride support device 100 transmits information indicating that the vehicle M can be ridden to the terminal device T2 and notifies the user U2 of this. The user U2 can thus travel to the break-up point DP on the vehicle M of the user U1 who made the co-ride promise and came back for the user U2.
In the above embodiment, the case where the user U2 is the single co-rider on the vehicle M of the user U1 has been described. However, the vehicle owner is not limited to an owner of a motorcycle such as the user U1; an owner of a vehicle that can carry a plurality of co-riders, such as a four-wheeled vehicle, is also conceivable. It is also conceivable that there are many co-riders heading in the same direction or to the same destination. In such cases, the co-ride support device 100 may extract a plurality of co-riders to ride with the vehicle owner, taking into account the number of people who can ride at the same time on the vehicle owned by the vehicle owner. In this case, the co-ride support device 100 may set the merging points between the vehicle owner and the co-riders at one location or at a plurality of locations, taking into account the route the vehicle owner follows to the destination (home). The return route of the vehicle owner then becomes a route that passes through each merging point at which a co-rider is picked up, as sketched after this paragraph. The configuration, operation, and processing of the co-ride support device 100 in this case may be the same as those of the co-ride support device 100 in the case where the co-rider is the single user U2 described above.
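For this multi-rider variant, the extraction and merging-point selection might generalize as in the sketch below, assuming a greedy overlap-based selection limited by the number of free seats; the patent does not prescribe any particular algorithm, so this is illustrative only.

    def extract_co_riders(owner_route, seats_free, candidates, overlap_fn):
        # Pick up to seats_free candidates, preferring those whose routes share
        # the longest stretch with the owner's return route.
        ranked = [c for c in candidates if overlap_fn(owner_route, c) > 0]
        ranked.sort(key=lambda c: overlap_fn(owner_route, c), reverse=True)
        return ranked[:seats_free]

    def merging_points_for(owner_route, riders, nearest_safe_point_fn):
        # One merging point per co-rider (possibly shared); nearest_safe_point_fn is
        # assumed to return a point on the owner's route, so the return route passes
        # through each merging point in order.
        points = {nearest_safe_point_fn(owner_route, r) for r in riders}
        return sorted(points, key=owner_route.index)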
The co-ride support device according to the embodiment described above includes: the information acquisition unit 130 that acquires the vehicle owner information, which is first information on the user U1 as a first user who owns a mobile body (the vehicle M), and the co-occupant information, which is second information on the user U2 as a second user who is a candidate for co-riding on the vehicle M; the matching processing unit 140 that extracts the user U2 who is to co-ride on the vehicle M with the user U1 based on the vehicle owner information and the co-occupant information, and determines the merging point at which the user U2 joins the user U1 and the break-up point at which they part; and the information providing unit 150 that calculates the merging period for joining at the merging point determined by the matching processing unit 140, and transmits at least the calculated merging period and the information on the merging point and the break-up point determined by the matching processing unit 140 to the terminal device T1 of the user U1 and the terminal device T2 of the user U2 using the communication unit 120 as a communication device. With this configuration, co-riding on the vehicle M owned by the user U1 can be performed appropriately.
The embodiment described above can be expressed as follows.
A co-ride support device is provided with:
a hardware processor; and
a storage device in which a program is stored,
the program stored in the storage device is read out and executed by the hardware processor to perform the following processing:
acquiring first information related to a first user who owns a mobile body and second information related to a second user who is a candidate for co-riding with the mobile body;
extracting a second user who is to co-ride with the first user on the mobile body based on the first information and the second information, and determining a merging point at which the second user joins the first user, a merging period, and a break-up point; and
transmitting at least the determined information of the merging point, the merging period, and the break-up point to the terminal device of the first user and the terminal device of the second user using a communication device.
While the embodiments for carrying out the present invention have been described above, the present invention is not limited to the embodiments, and various modifications and substitutions may be made without departing from the spirit of the present invention.

Claims (15)

1. A co-ride support device, wherein,
The co-ride support device includes:
an acquisition unit that acquires first information on a first user who owns a mobile body and second information on a second user who is a candidate for co-riding with the mobile body;
a determination unit that extracts a second user who is to co-ride on the mobile body with the first user based on the first information and the second information, and determines a merging point at which the second user joins the first user and a break-up point at which the second user and the first user part; and
a providing unit that calculates a merging period for joining at the merging point determined by the determination unit, and transmits at least the calculated merging period and information of the merging point and the break-up point determined by the determination unit to the terminal device of the first user and the terminal device of the second user using a communication device,
the acquisition unit acquires a movement status of the first user as the first information, and acquires a movement status of the second user as the second information,
the providing unit provides information to the terminal device of the first user, the second user, or another user using the communication device according to the movement status of the first user and/or the movement status of the second user,
the providing unit calculates an estimated arrival time at which the mobile body on which the second user co-rides arrives at the break-up point, based on the movement status of the first user and/or the movement status of the second user, and provides information of the calculated estimated arrival time to at least a terminal device of another user associated with the second user using the communication device.
2. The co-ride support device according to claim 1, wherein,
the determination unit determines a point at which an index value obtained based on the environmental information of each point satisfies a criterion as the merging point.
3. The co-ride support device according to claim 1 or 2, wherein,
the determination unit determines a point at which an index value obtained based on the environmental information of each point satisfies a criterion as the break-up point.
4. The co-ride support device according to claim 3, wherein,
the index value is the traffic volume of traffic participants, and
the determination unit sets a point at which the traffic volume is equal to or greater than a predetermined amount as a point at which the index value satisfies the criterion.
5. The co-ride support device according to claim 1, wherein,
the providing unit determines, based on the movement status of the first user and/or the movement status of the second user, whether or not the mobile body on which the second user rides is within a predetermined range from the break-up point, calculates an estimated arrival time at which the mobile body arrives at the break-up point when it is determined that the mobile body is within the predetermined range from the break-up point, and provides information of the calculated estimated arrival time to at least a terminal device of another user associated with the second user using the communication device.
6. The co-ride support device according to claim 1, wherein,
the providing unit calculates an estimated return-home time at which the first user and the second user arrive at their respective destinations after the mobile body on which the second user rides reaches the break-up point, and provides information of the calculated estimated return-home time to the terminal device of the first user, the second user, or another user using the communication device.
7. The co-ride support device according to claim 6, wherein,
the providing unit calculates the estimated return-home time of each user based on information of a geofence set between the break-up point and the destination.
8. The co-ride support device according to claim 1, wherein,
the providing unit calculates a first estimated return-home time at which the first user arrives at a first destination after the mobile body on which the second user rides reaches the break-up point, and outputs confirmation information to the terminal device of the first user using the communication device at a timing obtained based on position information of the first user and the first estimated return-home time.
9. The co-ride support device according to claim 1, wherein,
the providing unit calculates a second estimated return-home time at which the second user arrives at a second destination after the mobile body on which the second user rides reaches the break-up point, and outputs confirmation information to the terminal device of the second user using the communication device at a timing obtained based on position information of the second user and the second estimated return-home time.
10. The co-ride support device according to claim 9, wherein,
the providing unit provides the second estimated return-home time to a terminal device of another user associated with the second user using the communication device.
11. The co-ride support device according to any one of claims 8 to 10, wherein,
when a response to the confirmation information is not obtained after a predetermined time has elapsed since the confirmation information was output, the providing unit provides information on the user from whom the response has not been obtained to the terminal devices of the first user, the second user, and the other users using the communication device.
12. The co-ride support device according to claim 1, wherein,
the acquisition unit also acquires operation information related to operation of the mobile body,
the determination unit extracts the second user and determines the merging point and the break-up point when the acquisition unit acquires the operation information indicating that operation of the mobile body has started, and
the providing unit transmits at least the calculated merging period and information of the merging point and the break-up point determined by the determination unit to the terminal device of the first user and the terminal device of the second user using the communication device.
13. The co-ride support device according to claim 12, wherein,
the operation information is information indicating that operation of a driving unit that drives the mobile body has started.
14. A co-ride support method, wherein,
the following processing is carried out by a computer:
acquiring first information related to a first user who owns a mobile body and second information related to a second user who is a candidate for co-riding with the mobile body;
extracting a second user who is to co-ride with the first user on the mobile body based on the first information and the second information, and determining a merging point at which the second user joins the first user and a break-up point at which the second user and the first user part;
calculating a merging period for joining at the merging point, and transmitting at least the calculated merging period and the determined information of the merging point and the break-up point to the terminal device of the first user and the terminal device of the second user using a communication device;
acquiring a movement status of the first user as the first information, and acquiring a movement status of the second user as the second information;
providing information to a terminal device of the first user, the second user, or another user using the communication device according to the movement status of the first user and/or the movement status of the second user; and
calculating, based on the movement status of the first user and/or the movement status of the second user, an estimated arrival time at which the mobile body on which the second user co-rides arrives at the break-up point, and providing information of the calculated estimated arrival time to at least a terminal device of another user associated with the second user using the communication device.
15. A storage medium storing a program, wherein,
The program causes a computer to perform the following processing:
acquiring first information related to a first user who owns a mobile body and second information related to a second user who is a candidate for co-riding with the mobile body;
extracting a second user who is to co-ride with the first user on the mobile body based on the first information and the second information, and determining a merging point at which the second user joins the first user and a break-up point at which the second user and the first user part;
calculating a merging period for joining at the merging point, and transmitting at least the calculated merging period and the determined information of the merging point and the break-up point to the terminal device of the first user and the terminal device of the second user using a communication device;
acquiring a movement status of the first user as the first information, and acquiring a movement status of the second user as the second information;
providing information to a terminal device of the first user, the second user, or another user using the communication device according to the movement status of the first user and/or the movement status of the second user; and
calculating, based on the movement status of the first user and/or the movement status of the second user, an estimated arrival time at which the mobile body on which the second user co-rides arrives at the break-up point, and providing information of the calculated estimated arrival time to at least a terminal device of another user associated with the second user using the communication device.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020143606A JP7449823B2 (en) 2020-08-27 2020-08-27 Rider support device, passenger support method, and program
JP2020-143606 2020-08-27

Publications (2)

Publication Number Publication Date
CN114205408A CN114205408A (en) 2022-03-18
CN114205408B true CN114205408B (en) 2024-02-13

Family

ID=80498168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110972783.XA Active CN114205408B (en) 2020-08-27 2021-08-24 Co-ride support device, co-ride support method, and storage medium

Country Status (2)

Country Link
JP (1) JP7449823B2 (en)
CN (1) CN114205408B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11109187B2 (en) * 2019-09-24 2021-08-31 T-Mobile Usa, Inc. Device to device communication and wayfinding
CN115273522A (en) * 2022-06-23 2022-11-01 歌尔股份有限公司 Traffic information prompting method, device and medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110580272A (en) * 2018-06-08 2019-12-17 丰田自动车株式会社 Information processing apparatus, system, method, and non-transitory storage medium
CN110753948A (en) * 2017-06-16 2020-02-04 本田技研工业株式会社 Service management device, service providing system, service management method, and program
CN110782305A (en) * 2018-07-25 2020-02-11 丰田自动车株式会社 Information processing apparatus, information processing method, and non-transitory storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002267487A (en) * 2001-03-13 2002-09-18 Sharp Corp Arrival home time forecasting system and remote control system
JP2017211889A (en) * 2016-05-27 2017-11-30 大和ハウス工業株式会社 Return home scheduled time confirmation system
JP6948935B2 (en) * 2017-12-20 2021-10-13 日産自動車株式会社 Information management method and information management device
JP6965147B2 (en) * 2017-12-21 2021-11-10 日産自動車株式会社 Information processing method and information processing equipment
JP7056463B2 (en) * 2018-08-21 2022-04-19 トヨタ自動車株式会社 Information processing equipment, information processing system, and information processing method
JP7122239B2 (en) * 2018-12-12 2022-08-19 本田技研工業株式会社 Matching method, matching server, matching system, and program

Also Published As

Publication number Publication date
JP7449823B2 (en) 2024-03-14
CN114205408A (en) 2022-03-18
JP2022038897A (en) 2022-03-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant