CN113034727A - Moving-body interior image providing system, moving-body interior image providing method, server and storage medium

Moving-body interior image providing system, moving-body interior image providing method, server and storage medium

Info

Publication number
CN113034727A
CN113034727A (application CN202011351579.8A)
Authority
CN
China
Prior art keywords
image
unit
vehicle
information
terminal device
Prior art date
Legal status
Pending
Application number
CN202011351579.8A
Other languages
Chinese (zh)
Inventor
今井直子
斋木亮
铃木敦行
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Publication of CN113034727A

Classifications

    • G07C5/0866: Registering performance data using electronic data carriers, the electronic data carrier being a digital video recorder in combination with a video camera
    • G07C5/008: Registering or indicating the working of vehicles, communicating information to a remotely located station
    • H04L67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04N5/76: Television signal recording
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188: Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Information Transfer Between Computers (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)
  • Alarm Systems (AREA)

Abstract

Provided are a moving-body interior image providing system, a moving-body interior image providing method, a storage medium, and a moving-body interior image providing server that, when an image captured inside a moving body is provided to a terminal device outside the moving body, limit the provision to the period during which a specific passenger is riding. The moving-body interior image providing system includes: an image acquisition unit that acquires an image captured inside the moving body by an imaging unit; a boarding/alighting information acquisition unit that acquires boarding/alighting information detected for each passenger boarding and alighting from the moving body; and an image providing unit that, based on the boarding/alighting information acquired by the boarding/alighting information acquisition unit, transmits to an external terminal device associated with a certain passenger riding in the moving body, in a viewable form, the portion of the images acquired by the image acquisition unit corresponding to the period during which that passenger is riding, and does not transmit to the external terminal device, in a viewable form, images corresponding to the period after that passenger has alighted.

Description

Moving-body interior image providing system, moving-body interior image providing method, server and storage medium
Technical Field
The present invention relates to a moving-body interior image providing system, a moving-body interior image providing method, a storage medium, and a moving-body interior image providing server.
Background
Conventionally, there is known a technique of controlling the start and end of recording, on a memory card, of image data captured by a camera, based on the open/closed state of a taxi door and a fare-charging state signal (see, for example, Japanese Patent Application Laid-Open No. 2010-258762).
Disclosure of Invention
Problems to be solved by the invention
For example, a driver may use a vehicle to pick up and drop off children or the like living in the neighborhood. In such a case, it is preferable that an image captured of a child in the vehicle be provided to the guardian, such as a parent, who has entrusted the child to the driver, so that the guardian can grasp the situation of the child or other person being watched over in the vehicle. In addition, from the viewpoint of the driver's privacy, communication traffic, and the like, there is a need to provide each watching guardian only with images corresponding to the period during which that guardian's child is riding in the vehicle as a passenger.
An object of the present invention is to provide, to a terminal device outside a moving body, an image captured inside the moving body, limited to the period during which a specific passenger is riding.
Means for solving the problems
The moving-body interior image providing system, moving-body interior image providing method, storage medium, and moving-body interior image providing server according to the present invention adopt the following configurations.
(1): A moving-body interior image providing system according to an aspect of the present invention includes: an image acquisition unit that acquires an image obtained by imaging the interior of a moving body with an imaging unit; a boarding/alighting information acquisition unit that acquires boarding/alighting information detected for each passenger boarding and alighting from the moving body; and an image providing unit that, based on the boarding/alighting information acquired by the boarding/alighting information acquisition unit, transmits to an external terminal device associated with a certain passenger riding in the moving body, in a viewable form, the portion of the images acquired by the image acquisition unit corresponding to the period during which that passenger is riding in the moving body, and does not transmit to the external terminal device, in a viewable form, images corresponding to the period after that passenger has alighted.
(2): In the moving-body interior image providing system according to the above aspect (1), the image acquisition unit may cause the imaging unit to start imaging in response to at least one passenger boarding the moving body, and the image providing unit may start transmission of the image in response to the passenger associated with the external terminal device boarding the moving body.
(3): In the moving-body interior image providing system according to the above aspect (1) or (2), the image providing unit may end transmission of the image in response to the passenger associated with the external terminal device alighting from the vehicle, and the image acquisition unit may end imaging in response to all passengers having alighted from the moving body.
(4): In the moving-body interior image providing system according to any one of the above aspects (1) to (3), the system includes a storage unit that stores the images, and the image providing unit deletes the images stored in the storage unit in response to receiving a notification indicating approval of deletion from each of the external terminal devices to which the images stored in the storage unit were transmitted.
(5): In the moving-body interior image providing system according to any one of the above aspects (1) to (4), the image providing unit is provided in the moving body, and transfers the image transmitted for viewing on the external terminal device to the external terminal device via a passenger terminal device carried by a passenger present in the moving body.
(6): In the moving-body interior image providing system according to the above aspect (5), the image providing unit may make the quality of the image transmitted for viewing on the external terminal device different from the quality of an image transmitted to a predetermined device different from the passenger terminal device.
(7): In the moving-body interior image providing system according to any one of the above aspects (1) to (6), the image providing unit transmits, from among a plurality of images obtained by a plurality of imaging units provided so as to respectively image different ranges within the moving body, an image in which the passenger associated with the external terminal device is captured.
(8): An image providing method according to an aspect of the present invention causes a computer to: acquire an image obtained by imaging the interior of a moving body with an imaging unit; acquire boarding/alighting information detected for each occupant boarding and alighting from the moving body; and, based on the acquired boarding/alighting information, transmit to an external terminal device associated with a certain passenger riding in the moving body, in a viewable form, the portion of the acquired images corresponding to the period during which that passenger is riding, and not transmit to the external terminal device, in a viewable form, images corresponding to the period after that passenger has alighted.
(9): A storage medium according to an aspect of the present invention stores a program that causes a computer to: acquire an image obtained by imaging the interior of a moving body with an imaging unit; acquire boarding/alighting information detected for each occupant boarding and alighting from the moving body; and, based on the acquired boarding/alighting information, transmit to an external terminal device associated with a certain passenger riding in the moving body, in a viewable form, the portion of the acquired images corresponding to the period during which that passenger is riding, and not transmit to the external terminal device, in a viewable form, images corresponding to the period after that passenger has alighted.
(10): A moving-body interior image providing server according to an aspect of the present invention includes: an image acquisition unit that acquires an image obtained by imaging the interior of a moving body with an imaging unit; a boarding/alighting information acquisition unit that acquires boarding/alighting information of an occupant of the moving body; and an image providing unit that distributes the image to an external terminal device based on the boarding/alighting information and user registration information, the user registration information including information on the occupant and on the external terminal device, the image providing unit distributing to the external terminal device the portion of the image corresponding to the period during which the occupant is riding in the moving body.
Effects of the invention
According to (1), (8), (9), and (10), based on the boarding/alighting information detected for each occupant of the moving body, the external terminal device associated with an occupant can view the images captured while that occupant was riding. Thus, when an image captured of the interior of the moving body is provided to a terminal device outside the moving body, the image can be provided limited to the period when the specific passenger is riding.
According to (2), even when some passenger is riding in the moving body, if the passenger associated with the external terminal device is not riding, no image is transmitted for viewing by that external terminal device. This makes it possible to prevent unnecessary transmission of images.
According to (3), even after the passenger associated with the external terminal device has alighted from the moving body and transmission of the image has ended, imaging by the imaging unit continues as long as any occupant remains. This makes it possible to record the situation in the moving body until no occupant remains in it.
According to (4), after the images captured by the imaging unit are stored, the stored images can be deleted once deletion-approval notifications have been received from every external terminal device to which those images were transmitted. This makes it possible to appropriately delete, from among the stored images, images that are no longer needed by the user of any external terminal device.
According to (5), even without transmitting the image from the moving body over a network to a device such as an image providing server, the image can be transferred from the passenger terminal device to the external terminal device for viewing, by transmitting the image to the passenger terminal device carried by the passenger. This can suppress, for example, the amount of data transferred over the moving body's network communication.
According to (6), the quality of the image transmitted to the external terminal device via the passenger terminal device can be made different from the quality of an image transmitted to a device other than the external terminal device. For example, while the external terminal device views a high-quality image, an image that must be transmitted from the moving body to a predetermined device via a network can be reduced in quality so as to suppress the amount of data transferred over network communication.
According to (7), of the images obtained by the plurality of imaging units in the moving body, the image in which the occupant associated with the external terminal device is captured is transmitted to the external terminal device. In this case, the occupant occupies a larger portion of the image than when, for example, a single imaging unit images the space corresponding to all seats in the moving body, so the situation of the occupant of interest can be grasped accurately when viewing the image.
Drawings
Fig. 1 is a diagram showing an example of the overall configuration of an in-vehicle image providing system according to a first embodiment.
Fig. 2 is a diagram showing an example of a functional configuration of the in-vehicle control device according to the first embodiment.
Fig. 3 is a diagram showing an example of the driver registration information according to the first embodiment.
Fig. 4 is a diagram showing an example of a functional configuration of the in-vehicle image providing server according to the first embodiment.
Fig. 5 is a diagram showing an example of user registration information according to the first embodiment.
Fig. 6 is a diagram showing an example of the boarding/alighting information according to the first embodiment.
Fig. 7 is a diagram showing a configuration example of the protector terminal device of the first embodiment.
Fig. 8 is a flowchart showing an example of processing steps executed by the in-vehicle control device, the in-vehicle image providing server, and the protector terminal device according to the first embodiment in response to the occupant riding in the vehicle.
Fig. 9 is a flowchart showing an example of processing steps to be executed by the in-vehicle control device, the in-vehicle image providing server, and the protector terminal device according to the first embodiment in response to the passenger getting off the vehicle.
Fig. 10 is a flowchart showing an example of processing procedures to be executed by the protector terminal device and the in-vehicle image providing server according to the second embodiment in order to delete a captured image.
Fig. 11 is a diagram showing an example of the vehicle route information according to the third embodiment.
Fig. 12 is a diagram showing an example of seat specification information according to the third embodiment.
Fig. 13 is a flowchart showing an example of processing procedures to be executed by the occupant terminal device, the in-vehicle image providing server, and the in-vehicle control device according to the third embodiment in response to seat designation.
Fig. 14 is a diagram showing an example of a functional configuration of the watched person terminal device according to the fourth embodiment.
Fig. 15 is a flowchart showing an example of processing steps executed by the in-vehicle control device, the in-vehicle image providing server, and the protector terminal device according to the fourth embodiment in response to the occupant riding in the vehicle.
Fig. 16 is a flowchart showing an example of processing steps to be executed by the in-vehicle control device, the in-vehicle image providing server, and the protector terminal device according to the fourth embodiment in response to the passenger getting off the vehicle.
Fig. 17 is a diagram showing an example of an installation form of a plurality of imaging units according to the fifth embodiment.
Fig. 18 is a flowchart showing an example of processing steps executed by the in-vehicle control device, the in-vehicle image providing server, and the protector terminal device according to the fifth embodiment in response to the occupant riding in the vehicle.
Fig. 19 is a flowchart showing an example of processing steps to be executed by the in-vehicle control device, the in-vehicle image providing server, and the protector terminal device according to the fifth embodiment in response to the passenger getting off the vehicle.
Description of reference numerals:
100 … vehicle, 110 … imaging unit, 120 … in-vehicle control device, 200 … in-vehicle image providing server, 201 … communication unit, 202 … control unit, 203 … storage unit, 300 … protector terminal device, 301 … communication unit, 302 … control unit, 303 … storage unit, 304 … display unit, 305 … operation unit, 400 … watched person terminal device, 401 … communication unit, 402 … control unit, 403 … storage unit, 404 … display unit, 405 … operation unit, 121 … communication unit, 122 … control unit, 123 … storage unit.
Detailed Description
Embodiments of an in-vehicle image providing system, an in-vehicle image providing method, a storage medium, and an in-vehicle image providing server according to the present invention will be described below with reference to the drawings.
< first embodiment >
[ example of configuration of in-vehicle image providing System ]
Fig. 1 shows an example of the overall configuration of the in-vehicle image providing system according to the present embodiment. The vehicle 100 (an example of a moving body) in the in-vehicle image providing system according to the present embodiment is driven by the driver DR. The vehicle 100 may be a vehicle owned by the driver DR, or may be a vehicle temporarily used by the driver DR through, for example, an arrangement such as car sharing or rental.
In the present embodiment, the vehicle 100 is used by the driver DR to pick up and drop off, for example, a normal fellow passenger PS-1 such as a neighborhood child, together with a close fellow passenger PS-2 such as the driver DR's own child. In the following description, the normal fellow passenger PS-1 and the close fellow passenger PS-2 are referred to simply as fellow passengers PS when they need not be distinguished. The driver DR and the fellow passengers PS are referred to as occupants when they need not be distinguished as persons riding in the vehicle 100.
In addition, the in-vehicle image providing system according to the present embodiment enables a protector GD (a watching person), such as a parent or caregiver who is not riding in the vehicle 100, to watch over a watched person, such as a child, an elderly person, or a person requiring care who is riding in the vehicle 100, by viewing a captured image taken inside the vehicle 100 on the protector terminal device 300 (an example of an external terminal device). The protector terminal device 300 may be, for example, a smartphone, a tablet terminal, a personal computer, or a game console. Hereinafter, in the present application, the case where the watched person is a child and the watching person is a parent or other protector will be described.
The figure shows two normal fellow passengers PS-1, two protectors GD corresponding to the respective normal fellow passengers PS-1, and two protector terminal devices 300 corresponding to the respective protectors GD. The number of normal fellow passengers PS-1, and the corresponding numbers of protectors GD and protector terminal devices 300, may vary within the passenger capacity of the vehicle 100.
The figure shows one close fellow passenger PS-2, but the number of close fellow passengers PS-2 is not particularly limited. Further, a plurality of normal fellow passengers PS-1 may be associated with one protector GD and one protector terminal device 300.
The in-vehicle image providing system of the present embodiment includes an imaging unit 110, an in-vehicle control device 120, an in-vehicle image providing server 200, and a protector terminal device 300.
The watched person terminal device 400 (an example of an occupant terminal device) provided in the in-vehicle image providing system corresponds to the fourth embodiment, the fifth embodiment, and the like described later, and its description is therefore omitted here. In the present embodiment, the watched person terminal device 400 may be omitted.
The imaging unit 110 is provided so as to image the interior of the vehicle 100. In the present embodiment, one imaging unit 110 is provided for the in-vehicle control device 120, and is arranged so as to be able to image the occupants of all seats in the vehicle 100.
The in-vehicle control device 120 is provided to the vehicle 100, and executes control related to provision of an in-vehicle image. The in-vehicle control device 120 transmits the captured image captured by the imaging unit 110 to the in-vehicle image providing server 200.
The in-vehicle image providing server 200 executes processing for viewing the captured image transmitted from the in-vehicle control apparatus 120 at the protector terminal apparatus 300. The in-vehicle image providing server 200 of the present embodiment transmits the captured image transmitted from the in-vehicle control device 120 to the protector terminal device 300.
For example, the time at which each fellow passenger PS boards or alights from the vehicle 100 differs depending on where that fellow passenger PS is picked up, the location of his or her home, and the like. In view of this, the in-vehicle image providing server 200 of the present embodiment transmits to each protector terminal device 300 only the portion of the captured images transmitted from the in-vehicle control device 120 that corresponds to the period during which the fellow passenger PS who is the child of that protector GD is riding in the vehicle 100.
Thus, by viewing the captured image displayed on the protector terminal device 300, the protector GD can accurately grasp the situation of his or her own protected person while that person is riding in the vehicle, and cannot needlessly view captured images from periods when that person is not riding. This also protects the privacy of, for example, the driver DR and the driver's family when the driver DR is driving alone, or when only persons who need no watching over, such as the driver DR and the driver DR's own child (family), are riding. Furthermore, the amount of communication between the in-vehicle image providing server 200 and each protector terminal device 300 can be reduced, saving, for example, the communication cost incurred by the protector terminal devices 300.
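As a concrete illustration of this period-limited provision, the following is a minimal Python sketch; the record layout and the names BoardingRecord and select_viewable_segments are assumptions made for illustration and are not part of the disclosed embodiment.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional


@dataclass
class BoardingRecord:
    """One occupant's boarding/alighting history entry (illustrative)."""
    user_id: str                      # user ID linking the protected person to a protector
    boarded_at: datetime
    alighted_at: Optional[datetime]   # None while the person is still riding


@dataclass
class CapturedSegment:
    """A stored segment of the in-vehicle captured image (illustrative)."""
    start: datetime
    end: datetime
    uri: str


def select_viewable_segments(record: BoardingRecord,
                             segments: List[CapturedSegment],
                             now: datetime) -> List[CapturedSegment]:
    """Return only the segments overlapping the riding period.

    Segments recorded after the person alighted are never returned, which is
    the limitation applied before transmitting to the protector terminal device 300.
    """
    period_end = record.alighted_at or now
    return [s for s in segments
            if s.end > record.boarded_at and s.start < period_end]


if __name__ == "__main__":
    t = lambda hh, mm: datetime(2020, 11, 27, hh, mm)
    record = BoardingRecord("user-001", boarded_at=t(7, 50), alighted_at=t(8, 20))
    segments = [CapturedSegment(t(7, 45), t(8, 0), "seg-1.mp4"),
                CapturedSegment(t(8, 0), t(8, 15), "seg-2.mp4"),
                CapturedSegment(t(8, 30), t(8, 45), "seg-3.mp4")]
    for s in select_viewable_segments(record, segments, now=t(9, 0)):
        print(s.uri)   # seg-1.mp4 and seg-2.mp4 only; seg-3.mp4 falls after alighting
```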
[ example of functional Structure of on-vehicle control device ]
Fig. 2 shows a functional configuration example of the in-vehicle control device 120. The functions of the in-vehicle control device 120 are realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation of software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD or a flash memory of the in-vehicle control device 120, or may be stored in a removable storage medium (a non-transitory storage medium) such as a DVD or a CD-ROM and installed in the HDD or flash memory of the in-vehicle control device 120 by mounting the storage medium in a drive device. Further, all or part of the functions executed by the in-vehicle control device 120 may be realized by connecting to an external network via the communication unit 121 or an external communication terminal.
The in-vehicle control device 120 of the present embodiment includes a communication unit 121, a control unit 122, and a storage unit 123.
The communication unit 121 communicates with the in-vehicle image providing server 200 via a network including wireless communication, for example.
The control unit 122 executes various controls in the in-vehicle control device 120. The control unit 122 includes a getting-on/off detection unit 1221, an image transmission control unit 1222, a getting-on/off information control unit 1223, and a position information acquisition unit 1224.
The boarding/alighting detection unit 1221 detects boarding and alighting of occupants of the vehicle 100. That is, the boarding/alighting detection unit 1221 detects that an occupant has boarded and detects that an occupant has alighted. The boarding/alighting detection unit 1221 can detect an occupant's boarding and alighting based on, for example, the open/closed state of the doors of the vehicle 100, the output of a seating sensor provided in each seat, the result of recognizing occupants in the image captured by the imaging unit 110, and the like.
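A minimal sketch of how such detection signals might be combined is shown below in Python; the snapshot structure and the function detect_boarding_events are illustrative assumptions, since the embodiment leaves the concrete detection logic open.

```python
from dataclasses import dataclass
from typing import Dict, Set


@dataclass
class CabinSnapshot:
    """Sensor readings used to infer boarding/alighting (illustrative)."""
    door_open: bool
    occupied_seats: Set[str]          # seats whose seating sensor is ON


def detect_boarding_events(prev: CabinSnapshot, cur: CabinSnapshot) -> Dict[str, str]:
    """Compare two snapshots and report per-seat boarding or alighting.

    Mirrors the idea that the detection unit 1221 combines the door state with
    seating-sensor outputs; image-recognition results could be merged the same way.
    """
    events: Dict[str, str] = {}
    if not prev.door_open and not cur.door_open:
        return events                 # nobody can board or alight with the doors closed
    for seat in cur.occupied_seats - prev.occupied_seats:
        events[seat] = "boarded"
    for seat in prev.occupied_seats - cur.occupied_seats:
        events[seat] = "alighted"
    return events


if __name__ == "__main__":
    before = CabinSnapshot(door_open=True, occupied_seats={"driver"})
    after = CabinSnapshot(door_open=True, occupied_seats={"driver", "rear-left"})
    print(detect_boarding_events(before, after))   # {'rear-left': 'boarded'}
```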
The image transmission control unit 1222 performs control for transmitting the captured image captured by the imaging unit 110 to the in-vehicle image providing server 200.
The boarding/alighting information control unit 1223 performs control so that the results of occupants boarding and alighting from the vehicle 100 are reflected in the boarding/alighting information stored in the in-vehicle image providing server 200. The boarding/alighting information is information related to the boarding and alighting of occupants of the vehicle 100, and includes a history of each occupant's boarding and alighting with respect to the vehicle 100.
The position information acquisition unit 1224 acquires position information indicating the position of the vehicle 100. The position information acquisition unit 1224 can acquire, as the position information, a position measured using, for example, GPS (Global Positioning System). The position information acquisition unit 1224 may instead acquire position information based on the location of a wireless LAN access point obtained through communication with the access point, or may acquire the position information of the vehicle 100 from another in-vehicle device capable of acquiring position information.
The storage unit 123 stores various information corresponding to the in-vehicle control device 120. The storage unit 123 includes a driver registration information storage unit 1231 and a captured image storage unit 1232. The seat designation information storage unit 1233 corresponds to the third embodiment described later, and its description is therefore omitted; in the present embodiment, the seat designation information storage unit 1233 may be omitted.
The driver registration information storage 1231 stores driver registration information. The driver registration information is information of a person who has registered the driver DR and a close person of the driver DR and who is likely to ride in the vehicle 100 (close fellow passenger).
The captured image storage unit 1232 stores the captured images taken by the imaging unit 110. That is, within the vehicle 100, the imaging unit 110 and the captured image storage unit 1232 realize the function of a drive recorder that records the situation inside the vehicle 100 as images.
Fig. 3 shows an example of the driver registration information. The driver registration information includes driver information and close fellow passenger information. The driver information is information related to the driver. The close fellow passenger information is information related to a close fellow passenger PS-2. Since there may be two or more close fellow passengers PS-2, two or more close fellow passengers PS-2 may be registered in the driver registration information.
The driver information in the figure includes a name, address, emergency contact address, and face image information. The close fellow passenger information in the figure corresponds to the case where the close fellow passenger PS-2 is a child of the driver, and includes information such as a name, commuting information, certificate information, an emergency contact address, and a face image. The commuting information is information on the name, address, contact details, and the like of the childcare facility, school, or the like that the child who is the close fellow passenger PS-2 attends. The certificate information is copies of certificates such as an insurance card and a child medical expense beneficiary certificate.
Such driver registration information is registered in the in-vehicle control device 120 by the driver DR in advance. The registration may be performed by the driver DR operating the in-vehicle control device 120, or by the driver DR operating a terminal device communicably connected to the in-vehicle control device 120.
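The driver registration information of Fig. 3 can be pictured as a simple record structure; the following Python dataclasses are an illustrative assumption of one possible layout, not the actual stored format.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ClosePassengerInfo:
    """Registered information on a close fellow passenger PS-2 (illustrative)."""
    name: str
    commuting_info: str       # facility or school the child attends, with contact details
    certificate_info: str     # e.g. a copy of an insurance card
    emergency_contact: str
    face_image_ref: str       # reference to a stored face image


@dataclass
class DriverRegistrationInfo:
    """Structure mirroring the driver registration information of Fig. 3."""
    driver_name: str
    address: str
    emergency_contact: str
    face_image_ref: str
    close_passengers: List[ClosePassengerInfo] = field(default_factory=list)


if __name__ == "__main__":
    reg = DriverRegistrationInfo(
        driver_name="driver DR", address="...", emergency_contact="...",
        face_image_ref="faces/driver.png",
        close_passengers=[ClosePassengerInfo(
            name="close fellow passenger PS-2", commuting_info="nursery A",
            certificate_info="insurance copy", emergency_contact="...",
            face_image_ref="faces/ps2.png")])
    print(len(reg.close_passengers))   # two or more entries may also be registered
```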
[ example of functional Structure of in-vehicle image providing Server ]
Fig. 4 shows an example of the functional configuration of the in-vehicle image providing server 200. The functions of the in-vehicle image providing server 200 are realized by a hardware processor such as a CPU executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI, an ASIC, an FPGA, or a GPU, or may be realized by cooperation of software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD or a flash memory of the in-vehicle image providing server 200, or may be stored in a removable storage medium (a non-transitory storage medium) such as a DVD or a CD-ROM and installed in the HDD or flash memory of the in-vehicle image providing server 200 by mounting the storage medium in a drive device. A plurality of devices may also cooperate, for example on a per-function basis, to function as the in-vehicle image providing server 200; in this case, the plurality of devices may cooperate via communication.
The in-vehicle image providing server 200 in the figure includes a communication unit 201, a control unit 202, and a storage unit 203.
The communication unit 201 communicates with the in-vehicle control device 120, the guardian terminal device 300, and the like via a network including wireless communication.
Control unit 202 performs control in-vehicle image providing server 200. The control unit 202 in this figure includes an image acquisition unit 221, an entering/leaving information acquisition unit 222, and an image providing unit 223.
The image acquisition unit 221 acquires the captured image transmitted from the in-vehicle control device 120.
The boarding/alighting information acquisition unit 222 acquires boarding/alighting information. The boarding/alighting information acquisition unit 222 acquires the occupant boarding information and the occupant alighting information for each occupant transmitted from the in-vehicle control device 120 as boarding/alighting information.
The boarding/alighting information acquisition unit 222 stores the acquired boarding/alighting information in the boarding/alighting information storage unit 232. Further, each time occupant boarding information or occupant alighting information is transmitted from the in-vehicle control device 120, the boarding/alighting information acquisition unit 222 updates the boarding/alighting information stored in the boarding/alighting information storage unit 232 based on the transmitted information.
The image providing unit 223 outputs, from among the captured images acquired by the image acquisition unit 221, the captured images corresponding to the period during which the protector GD's child is riding in the vehicle, so that the protector GD can view them on the protector terminal device 300.
The storage unit 203 stores information associated with the in-vehicle image providing server 200. The storage unit 203 of the present embodiment includes a user registration information storage unit 231, an entering/leaving information storage unit 232, and a captured image storage unit 233.
The vehicle route information storage unit 234 and the seat designation information storage unit 235 correspond to a third embodiment described later, and therefore, the description thereof is omitted.
The user registration information storage unit 231 stores user registration information. The user registration information is information on a protector GD registered as a user who receives provision of images from the vehicle 100 and on the protector's child (the protected person) as a fellow passenger PS.
Fig. 5 shows an example of user registration information. The user registration information is a structure in which user information is associated with each user ID uniquely indicating a user unit.
The user information corresponding to one user unit includes protector information and protected person information. The protector information is information related to the protector, and includes a name, address, emergency contact address, and application ID.
A watching application having a function of displaying the captured images transmitted from the in-vehicle image providing server 200 is installed on the protector terminal device 300. Each watching application installed on a protector terminal device 300 is assigned an application ID that uniquely identifies it. The application ID in the protector information indicates the watching application installed on the protector terminal device 300 owned by the corresponding protector GD.
The protected person information includes a name, commuting information, certificate information, emergency contact information, and face image information.
For example, the protector GD may perform an operation of connecting the protector terminal device 300 running the watching application to the in-vehicle image providing server 200 and registering as a user. In accordance with such an operation, the in-vehicle image providing server 200 stores user registration information for one user unit in the user registration information storage unit 231.
Note that one protector GD may register two or more protected persons. Therefore, information on two or more protected persons can be stored in the user registration information of one user unit. Further, one protected person may be registered in association with a plurality of protectors GD.
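The user registration information of Fig. 5 can likewise be pictured as nested records keyed by user ID; the Python layout below is an illustrative assumption of one possible representation.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ProtectorInfo:
    """Protector information within one user unit (illustrative)."""
    name: str
    address: str
    emergency_contact: str
    app_id: str               # ID of the watching application on the protector terminal device


@dataclass
class ProtectedPersonInfo:
    """Protected person information within one user unit (illustrative)."""
    name: str
    commuting_info: str
    certificate_info: str
    emergency_contact: str
    face_image_ref: str


@dataclass
class UserRegistration:
    """One user unit of the user registration information of Fig. 5."""
    protector: ProtectorInfo
    protected_persons: List[ProtectedPersonInfo] = field(default_factory=list)


# The server keys each user unit by a user ID that uniquely identifies the user unit.
user_registration_store: Dict[str, UserRegistration] = {}
```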
Returning to Fig. 4, the boarding/alighting information storage unit 232 stores the boarding/alighting information.
Fig. 6 shows an example of the boarding/alighting information. The boarding/alighting information in the figure indicates history regarding boarding/alighting of each passenger including the driver.
The boarding/alighting information in the figure includes a vehicle identifier and a plurality of pieces of occupant-unit boarding/alighting information.
The vehicle identifier is an identifier that uniquely represents the vehicle 100.
The occupant-unit boarding/alighting information indicates the boarding/alighting history of one occupant, and includes the occupant category, occupant information, boarding time, boarding position, alighting time, and alighting position.
The occupant category indicates which of the driver, the close fellow passenger PS-2, and the normal fellow passenger PS-1 the corresponding occupant is. In the figure, the boarding/alighting information is stored in units of occupants, and the occupant categories indicate a driver, a close fellow passenger PS-2, and a normal fellow passenger PS-1, respectively.
The occupant information stores predetermined information related to the corresponding occupant.
When the occupant category is the driver, the occupant information includes a name, address, emergency contact address, face image, and the like. The name, address, emergency contact address, and face image information are obtained from the driver information stored in the in-vehicle control device 120. Alternatively, the face image may be extracted from the image captured by the imaging unit 110 while the driver DR is riding.
When the occupant category is the close fellow passenger PS-2, the occupant information holds a face image. The face image information in this case may be obtained from the close fellow passenger information stored in the in-vehicle control device 120, or may be extracted from the image captured by the imaging unit 110 while the close fellow passenger PS-2 is riding.
When the occupant category is the normal fellow passenger PS-1, the occupant information includes a face image and a user ID. The user ID stored in the occupant information is the user ID, among those in the user registration information stored in the in-vehicle image providing server 200, under which the corresponding normal fellow passenger PS-1 is associated with the protector information.
The riding time, riding position, getting-off time, and getting-off position are information recorded in accordance with riding and getting-off of the corresponding occupant. The riding time indicates the time when the occupant rides the vehicle. The riding position indicates a position of the vehicle 100 when the occupant rides. The getting-off time indicates a time at which the occupant gets off the vehicle. The alighting position indicates a position of the vehicle 100 when the occupant gets off the vehicle.
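One possible in-memory representation of this occupant-unit boarding/alighting information is sketched below in Python; the field names are illustrative assumptions based on Fig. 6.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple


@dataclass
class OccupantBoardingRecord:
    """One occupant-unit entry of the boarding/alighting information (Fig. 6)."""
    occupant_category: str                                     # "driver" | "close" | "general"
    face_image_ref: str
    user_id: Optional[str] = None                              # set only for a normal fellow passenger PS-1
    boarded_at: Optional[datetime] = None
    boarding_position: Optional[Tuple[float, float]] = None    # (latitude, longitude)
    alighted_at: Optional[datetime] = None
    alighting_position: Optional[Tuple[float, float]] = None


@dataclass
class VehicleBoardingInfo:
    """Boarding/alighting information for one vehicle, keyed by its identifier."""
    vehicle_id: str
    records: List[OccupantBoardingRecord] = field(default_factory=list)


if __name__ == "__main__":
    info = VehicleBoardingInfo(vehicle_id="vehicle-42")
    info.records.append(OccupantBoardingRecord(
        occupant_category="general",
        face_image_ref="faces/ps1.png",
        user_id="user-001",
        boarded_at=datetime(2020, 11, 27, 7, 50),
        boarding_position=(35.68, 139.76)))
    print(len(info.records))
```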
The captured image storage unit 233 stores the captured image acquired by the image acquisition unit 221.
[ example of Structure of protector terminal device ]
An example of the configuration of the protector terminal device 300 will be described with reference to Fig. 7. The functions of the protector terminal device 300 are realized by a hardware processor such as a CPU executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI, an ASIC, an FPGA, or a GPU, or may be realized by cooperation of software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD or a flash memory of the protector terminal device 300, or may be stored in a removable storage medium (a non-transitory storage medium) such as a DVD or a CD-ROM and installed in the HDD or flash memory of the protector terminal device 300 by mounting the storage medium in a drive device.
The protector terminal device 300 in the figure includes a communication unit 301, a control unit 302, a storage unit 303, a display unit 304, and an operation unit 305.
The communication unit 301 communicates with the in-vehicle image providing server 200 via a network including a wireless path.
The control unit 302 executes various controls in the protector terminal device 300. The control unit 302 includes a notification output unit 321 and a display control unit 322.
The notification output unit 321 outputs various notifications transmitted from the in-vehicle image providing server 200.
The display control unit 322 performs control related to display of the captured image transmitted from the in-vehicle image providing server 200.
The storage unit 303 stores various information corresponding to the protector terminal device 300.
The display unit 304 performs display according to the control of the control unit 302.
The operation unit 305 collectively refers to operation elements and input devices provided in the protector terminal device 300, as well as input devices connected to the protector terminal device 300. The protector terminal device 300 may include, as the operation unit 305, a touch panel combined with a display device corresponding to the display unit 304.
[ example of processing procedure corresponding to occupant riding detection ]
An example of processing steps executed by the in-vehicle control device 120, the in-vehicle image providing server 200, and the protector terminal device 300 in association with the occupant riding in the vehicle 100 will be described with reference to the flowchart of fig. 8.
In the in-vehicle control device 120, the boarding/alighting detection unit 1221 waits until it detects that an occupant has boarded (step S100: NO). The boarding/alighting detection unit 1221 may detect an occupant's boarding based on, for example, a detection output indicating the opening/closing of a door, or the output of a seating sensor provided for each seat that can detect whether an occupant is seated. After imaging has started, the boarding/alighting detection unit 1221 may also detect an occupant's boarding based on a change in the captured image caused by the occupant boarding.
When boarding of an occupant is detected (step S100: YES), the image transmission control unit 1222 determines whether imaging by the imaging unit 110 is currently being executed (step S102). When imaging by the imaging unit 110 has not yet been executed (step S102: NO), the image transmission control unit 1222 causes the imaging unit 110 to start imaging (step S104). In response to the start of imaging, the image transmission control unit 1222 may cause the captured image storage unit 1232 to store the captured images. The captured images stored in the captured image storage unit 1232 can be output (displayed, transmitted, etc.) to the outside as needed.
After the process of step S104 or when the image capturing by the image capturing unit 110 is being executed (yes in step S102), the boarding/alighting detection unit 1221 recognizes the occupant who has detected the boarding (step S106).
In the identification of the occupant in step S106, the boarding/alighting detection unit 1221 identifies which occupant category, among the driver DR, the close fellow passenger PS-2, and the normal fellow passenger PS-1, the occupant corresponds to.
To identify the occupant category, the boarding/alighting detection unit 1221 may extract a face image of the occupant whose boarding was detected from the captured image, and compare it with the face images of the driver DR and the close fellow passenger PS-2 stored in the driver registration information of the driver registration information storage unit 1231, thereby recognizing which of the driver DR, the close fellow passenger PS-2, and the normal fellow passenger PS-1 the occupant who boarded this time is.
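A simplified sketch of this category recognition is shown below; a real system would compute face feature vectors with a face-recognition model, so the plain embeddings and the threshold used here are illustrative assumptions.

```python
import math
from typing import Dict, List

Embedding = List[float]      # stand-in for a face feature vector from a recognition model


def cosine_similarity(a: Embedding, b: Embedding) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


def classify_occupant(candidate: Embedding,
                      registered: Dict[str, Embedding],
                      threshold: float = 0.9) -> str:
    """Return "driver"/"close" if the face matches a registered one, else "general".

    `registered` maps occupant categories from the driver registration information
    to stored face features; anyone who matches neither is treated as a normal
    fellow passenger PS-1 ("general").
    """
    best_label, best_score = "general", threshold
    for label, ref in registered.items():
        score = cosine_similarity(candidate, ref)
        if score >= best_score:
            best_label, best_score = label, score
    return best_label


if __name__ == "__main__":
    registered = {"driver": [1.0, 0.0, 0.1], "close": [0.0, 1.0, 0.1]}
    print(classify_occupant([0.98, 0.05, 0.1], registered))   # driver
    print(classify_occupant([0.5, 0.5, 0.5], registered))     # general
```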
Next, the image transmission control unit 1222 determines whether, as a result of the occupant boarding this time, a normal fellow passenger PS-1 is riding in the vehicle 100 for the first time (step S108).
When it is determined that a normal fellow passenger PS-1 is riding for the first time (step S108: YES), the image transmission control unit 1222 starts transmission of the captured image to the in-vehicle image providing server 200 (step S110).
After the processing of step S110, or when the determination of step S108 is negative (step S108: NO), the boarding/alighting information control unit 1223 transmits, to the in-vehicle image providing server 200, occupant boarding information about the occupant whose boarding was detected this time in step S100 (step S112).
The occupant boarding information includes the occupant category identified in step S106. When the occupant category is the driver DR, the occupant boarding information may also include the driver information (name, address, emergency contact address, face image) in the driver registration information. When the occupant category is the close fellow passenger PS-2, the occupant boarding information may include the close fellow passenger information in the driver registration information.
Regardless of the occupant category, the occupant boarding information includes the time at which the occupant boarded (boarding time) and the position of the vehicle 100 when the boarding was detected (boarding position).
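The vehicle-side handling of a detected boarding (steps S100 to S112) can be summarized as follows; the VehicleState structure and the handle_boarding function are illustrative assumptions that compress the flow of Fig. 8 into code.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, Tuple


@dataclass
class VehicleState:
    capturing: bool = False
    transmitting: bool = False
    general_passengers_on_board: int = 0


def handle_boarding(state: VehicleState,
                    occupant_category: str,
                    position: Tuple[float, float],
                    send_boarding_info: Callable[[dict], None]) -> None:
    """Vehicle-side handling of one detected boarding (steps S100 to S112, simplified)."""
    if not state.capturing:                       # S102/S104: start capture with the first occupant
        state.capturing = True
    if occupant_category == "general":
        state.general_passengers_on_board += 1
        if not state.transmitting:                # S108/S110: first normal fellow passenger starts the upload
            state.transmitting = True
    send_boarding_info({                          # S112: report to the in-vehicle image providing server
        "category": occupant_category,
        "boarded_at": datetime.now().isoformat(),
        "position": position,
    })


if __name__ == "__main__":
    state = VehicleState()
    handle_boarding(state, "driver", (35.68, 139.76), send_boarding_info=print)
    handle_boarding(state, "general", (35.68, 139.76), send_boarding_info=print)
    print(state.transmitting)   # True once a normal fellow passenger has boarded
```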
Next, processing by the in-vehicle image providing server 200 will be described. In the in-vehicle image providing server 200, the image acquisition unit 221 determines whether reception of a captured image has newly started, in response to the start of transmission of the captured image from the in-vehicle control device 120 in step S110 (step S200). In step S200, the image acquisition unit 221 determines that reception has not newly started not only when no captured image is being received but also when reception of the captured image is already in progress.
When the reception of the captured image is started (yes in step S200), the image acquisition unit 221 starts the storage of the received captured image in the captured image storage unit 233 (step S202).
When the captured image is not received (no in step S200) or after the process in step S202, the boarding/alighting information acquisition unit 222 determines whether or not the passenger boarding information transmitted from the in-vehicle control device 120 in step S112 is received (step S204).
When the occupant boarding information is received (step S204: YES), the boarding/alighting information acquisition unit 222 generates occupant-unit boarding/alighting information using the received occupant boarding information (step S206). When the occupant category in the received occupant boarding information is the normal fellow passenger PS-1, the boarding/alighting information acquisition unit 222 compares the face image of the occupant who most recently boarded, recognized in the received captured image, with the face images in the protected person information included in the user registration information, and specifies the user ID associated with the occupant (protected person) corresponding to the received occupant boarding information. The boarding/alighting information acquisition unit 222 includes, in the occupant information of the occupant-unit boarding/alighting information, the face image of that occupant extracted from the captured image and the specified user ID.
The boarding/alighting information acquisition unit 222 causes the boarding/alighting information storage unit 232 to store the generated occupant-unit boarding/alighting information (step S208).
The image providing unit 223 determines whether or not the occupant type corresponding to the occupant-based boarding/alighting information stored in step S208 is a normal passenger PS-1 (step S210).
When the occupant category is the normal fellow passenger PS-1 (step S210: YES), the image providing unit 223 transmits a viewable notification to the protector terminal device 300 of the protector of the normal fellow passenger PS-1 (the protected person) corresponding to the occupant-unit boarding/alighting information stored in step S208 (step S212).
The image providing unit 223 may specify, from the user registration information storage unit 231, the application ID associated with the user ID included in the occupant-unit boarding/alighting information stored in step S208, and may set the protector terminal device 300 on which the watching application with the specified application ID is installed as the transmission destination of the viewable notification.
The viewable notification notifies the protector, in response to the protected person boarding the vehicle 100, that the captured image can now be viewed on the protector terminal device 300. Such a viewable notification may be sent as a push notification to the watching application installed on the protector terminal device 300, or as an e-mail addressed to a pre-registered e-mail address of the protector.
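A minimal sketch of building and dispatching such a viewable notification is given below; the payload fields and the push_sender callback are illustrative assumptions, since the embodiment does not fix a particular push or e-mail mechanism.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class ViewableNotification:
    app_id: str          # watching application the notification is addressed to
    vehicle_id: str
    message: str


def notify_protector(app_id: str,
                     vehicle_id: str,
                     push_sender: Callable[[ViewableNotification], None]) -> None:
    """Build and dispatch a viewable notification (step S212, simplified).

    `push_sender` stands in for whatever push-notification or e-mail channel a
    deployment actually uses; the channel is not specified by the embodiment.
    """
    note = ViewableNotification(
        app_id=app_id,
        vehicle_id=vehicle_id,
        message="Your child has boarded; the in-vehicle image is now viewable.",
    )
    push_sender(note)


if __name__ == "__main__":
    notify_protector("app-0001", "vehicle-42", push_sender=print)
```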
In the case where the occupant riding information is not received (no at step S204), or in the case where the occupant type is not the normal passenger PS-1 (no at step S210), or after the processing at step S212, the image providing section 223 performs the following determination. That is, the image providing unit 223 determines whether or not the viewing request transmitted from the protector terminal device 300 is received (step S214). If no viewing request is received (no in step S214), the process returns to step S200.
When the viewing request is received (yes in step S214), the image providing unit 223 starts transmission of the captured image stored in the captured image storage unit 233 to the protector terminal device 300 of the transmission source of the viewing request (step S216). The image providing unit 223 may transmit the photographed image as a moving image so as to be displayed by the protector terminal device 300, so that the current situation in the vehicle 100 can be grasped by the protector terminal device 300. Alternatively, the captured image may be transmitted so that the still picture is switched and displayed at regular intervals.
After the process of step S216, the process returns to step S200.
Next, an example of processing procedures of the protector terminal device 300 will be described. In the protector terminal device 300, the notification output unit 321 waits for the reception of the viewable notification transmitted from the in-vehicle image providing server 200 in step S212 (step S300).
When the viewable notification is received (step S300: YES), the notification output section 321 outputs the received viewable notification (step S302).
The protector, having learned from the output of the viewable notification that the captured image can be viewed, can perform an operation (display instruction operation) instructing the protector terminal device 300 to display the captured image. When the display instruction operation is performed, the display control unit 322 transmits a viewing request to the in-vehicle image providing server 200 (step S304).
In response to the transmission of the viewing request, the in-vehicle image providing server 200 starts transmission of the captured image in step S216. The display control unit 322 displays the received captured image in response to the start of its reception. Thus, by viewing the captured image on the protector terminal device 300, the protector can watch over his or her own protected person riding in the vehicle 100.
When displaying the captured image on the protector terminal device 300, the captured image may instead be distributed by streaming, as soon as the in-vehicle image providing server 200 receives it, through a web page for viewing captured images associated with each user, without transmitting a viewable notification.
In this case, the protector GD logs in to a predetermined web page for viewing captured images with his or her own user account (user information that may be included in the user registration information) and causes the protector terminal device 300 to access the web page. On the captured-image viewing web page accessed by the protector terminal device 300, the protector GD is guided, for example, to the captured image currently being streamed, and the streamed captured image is displayed.
[ example of processing procedure corresponding to detection of getting-off of a passenger ]
An example of processing steps executed by the in-vehicle control device 120, the in-vehicle image providing server 200, and the protector terminal device 300 in response to the passenger getting off the vehicle 100 will be described with reference to the flowchart of fig. 9.
In the in-vehicle control device 120, the boarding/alighting detection unit 1221 waits until it detects that an occupant has alighted (step S130: NO). When an occupant's alighting is detected (step S130: YES), the boarding/alighting detection unit 1221 identifies the occupant whose alighting was detected this time (step S132).
In the identification of the occupant in step S132, the boarding/alighting detection unit 1221 identifies which occupant category, among the driver DR, the close fellow passenger PS-2, and the normal fellow passenger PS-1, the alighting occupant corresponds to.
To identify the occupant category, the boarding/alighting detection unit 1221 may extract a face image of the occupant whose alighting was detected from the captured images up to the present time, and compare it with the face images of the driver DR and the close fellow passenger PS-2 stored in the driver registration information of the driver registration information storage unit 1231, thereby recognizing which of the driver DR, the close fellow passenger PS-2, and the normal fellow passenger PS-1 the occupant who alighted this time is.
The getting-on/off information control unit 1223 transmits, to the in-vehicle image providing server 200, the occupant getting-off information about the occupant whose getting-off was detected this time in step S130 (step S134). The occupant getting-off information includes information on the occupant category identified in step S132, as well as the face image of that occupant extracted from the captured image before the getting-off. Regardless of the occupant category, it also includes the time at which the occupant got off (getting-off time) and the position of the vehicle 100 when the getting-off was detected (getting-off position).
Next, the image transmission control unit 1222 determines whether or not, as a result of the getting-off detected this time, the vehicle 100 has transitioned from a state in which a normal fellow passenger PS-1 is present to a state in which none is present (step S136). For this purpose, the image transmission control unit 1222 can determine whether or not an occupant other than the driver DR and the close fellow passenger PS-2 is present in the vehicle 100 by comparing the face images extracted from the captured image after the getting-off detected this time with the face images of the driver DR and the close fellow passenger PS-2.
When it is determined that the vehicle has transitioned to a state in which no normal fellow passenger PS-1 is present (yes in step S136), the image transmission control unit 1222 ends the operation of transmitting the captured image to the in-vehicle image providing server 200 that was started in step S110 of fig. 8 (step S138).
After the process of step S138, or when it is determined that a normal fellow passenger PS-1 is still present (no in step S136), the image transmission control unit 1222 determines whether the getting-off detected this time corresponds to the last occupant getting off (step S140). For this purpose, the image transmission control unit 1222 determines whether or not any occupant remains in the vehicle 100. Specifically, the image transmission control unit 1222 may determine whether the number of detected getting-off events has reached the number of detected riding events. Alternatively, it may determine whether or not an image of a person (occupant) can still be extracted from the captured image after the getting-off detected this time.
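The determination of the last getting-off can be sketched as follows; the function `is_final_alighting`, its inputs, and the policy of combining the two checks are illustrative assumptions, since the text only states that either a count comparison or a person-detection result may be used.

```python
def is_final_alighting(boarding_count: int,
                       alighting_count: int,
                       persons_in_latest_frame: int) -> bool:
    """Decide whether the getting-off detected this time was the last one.

    The text allows either check on its own; combining them with `or` is an
    illustrative policy, not something the patent prescribes.
    """
    count_based = alighting_count >= boarding_count
    image_based = persons_in_latest_frame == 0
    return count_based or image_based


# Three occupants boarded; after the third getting-off the cabin image is empty.
assert is_final_alighting(boarding_count=3, alighting_count=3, persons_in_latest_frame=0)
assert not is_final_alighting(boarding_count=3, alighting_count=2, persons_in_latest_frame=1)
```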
If the detected alighting does not correspond to the last passenger alighting (no in step S140), the process returns to step S130.
When the getting-off detected this time corresponds to the last occupant getting off (yes in step S140), the image transmission control unit 1222 ends imaging by the imaging unit 110 (step S142).
Next, an example of processing steps executed by in-vehicle image providing server 200 will be described. The getting-on/off information acquiring unit 222 determines whether or not the occupant getting-off information transmitted from the in-vehicle control device 120 in step S134 is received (step S230).
When the occupant getting-off information is received (yes in step S230), the getting-on/off information acquiring unit 222 updates, among the occupant-unit getting-on/off information stored in the getting-on/off information storage unit 232, the piece corresponding to the occupant getting-off information received this time (step S232). Through this update, the getting-off time and getting-off position included in the received occupant getting-off information are added to that occupant-unit getting-on/off information. From among the pieces of occupant-unit getting-on/off information, the getting-on/off information acquiring unit 222 may specify, as the target of the update, the piece that contains a face image of the same person as the face image included in the occupant getting-off information received this time.
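The update of the occupant-unit getting-on/off information can be sketched as follows. The record fields and the use of a `face_image_id` key as a stand-in for face-image matching are assumptions for illustration; the patent only states that the record of the same person is specified by comparing face images.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class OccupantRecord:
    """One piece of occupant-unit getting-on/off information (illustrative fields)."""
    face_image_id: str                              # stand-in key for face-image matching
    category: str                                   # 'DR', 'PS-2', or 'PS-1'
    riding_time: Optional[str] = None
    riding_position: Optional[Tuple[float, float]] = None
    getting_off_time: Optional[str] = None
    getting_off_position: Optional[Tuple[float, float]] = None


def apply_getting_off(records: list, getting_off_info: dict) -> None:
    """Add getting-off time and position to the record of the matching occupant."""
    for record in records:
        if record.face_image_id == getting_off_info["face_image_id"]:
            record.getting_off_time = getting_off_info["time"]
            record.getting_off_position = getting_off_info["position"]
            return


records = [OccupantRecord("face-001", "PS-1", riding_time="07:35")]
apply_getting_off(records, {"face_image_id": "face-001",
                            "time": "08:05", "position": (35.68, 139.76)})
print(records[0].getting_off_time)  # 08:05
```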
Next, the image providing unit 223 determines whether or not the occupant who got off is a normal fellow passenger PS-1, based on the occupant getting-off information received this time (step S234). For this determination, the image providing unit 223 may check whether the occupant category indicated in the received occupant getting-off information is the normal fellow passenger PS-1.
When the occupant is a normal fellow passenger PS-1 (yes in step S234), the image providing unit 223 specifies the protector terminal device 300 to which the getting-off notification is to be transmitted (step S236). For this purpose, the image providing unit 223 specifies, from the user registration information storage unit 231, the application ID associated with the same user ID as the user ID indicated in the occupant information of the occupant-unit getting-on/off information updated in step S232. The image providing unit 223 then specifies the protector terminal device 300 on which the watching application with the specified application ID is installed as the transmission destination of the getting-off notification.
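The specification of the notification destination can be sketched as a simple lookup. The in-memory table and field names are hypothetical; the actual schema of the user registration information storage unit 231 is not given in the text.

```python
# Hypothetical in-memory form of the user registration information.
USER_REGISTRATION = [
    {"user_id": "u001", "application_id": "watch-app-aaa"},
    {"user_id": "u002", "application_id": "watch-app-bbb"},
]


def notification_destinations(user_id: str) -> list:
    """Return the watching-application IDs registered for the given user ID."""
    return [row["application_id"]
            for row in USER_REGISTRATION if row["user_id"] == user_id]


print(notification_destinations("u001"))  # ['watch-app-aaa']
```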
The image providing unit 223 transmits the getting-off notification to the protector terminal device 300 specified as the transmission destination in step S236 (step S238). The getting-off notification notifies the protector GD that his or her protected person has gotten off the vehicle 100.
Further, the image providing unit 223 ends the operation of transmitting the captured image to the protector terminal device 300 specified in step S236 (step S240).
After the process of step S240 or when it is determined that the occupant getting-off information has not been received (no in step S230), the image acquisition unit 221 determines whether or not transmission of the captured image from the in-vehicle control device 120 is finished (step S242). If the transmission of the captured image is not completed (no in step S242), the process returns to step S230.
When the transmission of the captured image is completed (yes in step S242), the image acquisition unit 221 ends the storage of the captured image started in step S202 of fig. 8 (step S244).
Next, an example of processing procedures executed by the protector terminal device 300 will be described. In the protector terminal device 300, the notification output unit 321 waits for reception of the getting-off notification transmitted from the in-vehicle image providing server 200 in step S238 (no in step S330).
When the getting-off notification is received (yes in step S330), the notification output unit 321 outputs the getting-off notification (step S332). As a form of output, for example, a push notification of the watching application may be used as the getting-off notification. Alternatively, an electronic mail received as the getting-off notification may be opened in response to an operation.
Further, in response to the end of transmission of the captured image from the in-vehicle image providing server 200 in step S240, the display control unit 322 ends the display of the captured image (step S334).
< second embodiment >
In the first embodiment described above, the in-vehicle image providing server 200 stores the captured image transmitted from the in-vehicle control device 120 in the captured image storage unit 233. The captured image stored in the in-vehicle image providing server 200 in this manner may simply be kept stored (saved). However, from the viewpoint of privacy protection, for example, the captured image stored in the in-vehicle image providing server 200 may instead be deleted. In the present embodiment, taking as an example a case where the driver DR is the owner of the vehicle 100, the captured image stored in the in-vehicle image providing server 200 can be deleted based on deletion consent from the protectors of the general fellow passengers PS-1, that is, the occupants other than the close fellow passenger PS-2 and the driver DR.
An example of processing procedures executed by the in-vehicle image providing server 200 and the protector terminal device 300 according to the present embodiment in association with deletion of a captured image will be described with reference to the flowchart of fig. 10. The configurations of the in-vehicle image providing system, the protector terminal device 300, and the in-vehicle image providing server 200 according to the present embodiment may be the same as those of the first embodiment.
First, an example of processing procedures executed by the protector terminal device 300 will be described. Steps S350 to S354, as processing executed by the protector terminal device 300, may be the same as steps S330 to S334 of fig. 9.
After the process of step S354, the control unit 302 of the protector terminal device 300 causes the display unit 304 to display a deletion consent confirmation screen (step S356). The deletion consent confirmation screen asks the protector GD to confirm whether or not he or she consents to deletion of the captured image of the protected person. Through this screen, the protector can perform an operation declaring whether or not to consent to the deletion.
While the deletion consent confirmation screen is displayed, the control unit 302 determines whether or not the protector GD performs an operation declaring consent to deletion (deletion consent operation) (step S358). When an operation declaring that deletion is not consented to is performed instead of the deletion consent operation (no in step S358), the processing in the figure ends. Even after declaring that deletion is not consented to, the protector GD can operate the protector terminal device 300 to display the deletion consent confirmation screen again and perform the deletion consent operation.
When the deletion consent operation is performed (yes in step S358), the control unit 302 transmits a deletion consent notification to the in-vehicle image providing server 200 (step S360).
Next, an example of processing steps executed by the in-vehicle image providing server 200 will be described. In the in-vehicle image providing server 200, the image providing unit 223 waits for reception of the deletion consent notification transmitted from the protector terminal device 300 in step S360 (no in step S250).
When the deletion consent notification is received (yes in step S250), the image providing unit 223 retrieves, from the user registration information storage unit 231, the user registration information that holds the same application ID as the application ID included in the received deletion consent notification (step S252). The user registration information includes the protector information, so retrieving the user registration information amounts to specifying the protector of the protector terminal device 300 that is the transmission source of the deletion consent notification.
The image providing unit 223 sets a consent flag, indicating that the protector GD has consented to deletion of the captured image, in association with the user registration information retrieved in step S252 (step S254). Even while the captured image has not yet been deleted from the captured image storage unit 233, the protector terminal device 300 of a protector GD for whom the consent flag is set is treated as no longer having the right to view the captured image, and is prohibited from downloading the captured image from the in-vehicle image providing server 200 and displaying it.
When the consent flag is set in step S254, the image providing unit 223 determines whether or not consent to deletion of the captured image has been obtained from the protectors GD of all the normal fellow passengers PS-1 who rode in the vehicle 100 during capture of the image (step S256).
If the consent of all the protectors GD has not been obtained (no in step S256), the process returns to step S250.
On the other hand, when all the protectors GD agree (yes in step S256), the image providing unit 223 deletes the captured image stored in the captured image storage unit 233 (step S258).
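The condition that deletion occurs only after every protector GD concerned has consented can be sketched as follows; the flag container, the function names, and the callback used here are illustrative assumptions.

```python
def all_consented(consent_flags: dict) -> bool:
    """True only when every protector GD concerned has consented to deletion."""
    return bool(consent_flags) and all(consent_flags.values())


def record_consent(consent_flags: dict, guardian_id: str, delete_image) -> None:
    """Set one protector's consent flag and delete the image once all agree."""
    consent_flags[guardian_id] = True
    if all_consented(consent_flags):
        delete_image()


flags = {"GD-1": False, "GD-2": False}
record_consent(flags, "GD-1", lambda: print("captured image deleted"))  # not yet
record_consent(flags, "GD-2", lambda: print("captured image deleted"))  # deleted
```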
< third embodiment >
Next, a third embodiment will be described. In the present embodiment, the driver DR provides, for example, a service of recruiting fellow passengers PS and transporting them (a delivery service). When providing the delivery service, the driver DR registers the route along which the driver DR drives the vehicle 100 to transport the fellow passengers PS.
The protector GD can, for example, request transport of his or her own protected person by a vehicle 100 for which a route suitable for that protected person is registered. In addition, when requesting the transport, the protector GD can specify (reserve) the seat in the vehicle 100 on which the protected person will sit. A configuration for such seat specification for the protected person in the present embodiment is described below.
The in-vehicle image providing server 200 according to the present embodiment supports route registration and seat specification, and includes a vehicle route information storage unit 234 and a seat specification information storage unit 235 (fig. 4) in the storage unit 203.
The vehicle route information storage unit 234 stores vehicle route information. The vehicle route information indicates the route traveled by the vehicle 100 for one instance of the delivery service. The vehicle route information may be registered, for example, by the driver DR connecting his or her own terminal device to the in-vehicle image providing server 200 and performing a predetermined operation.
Fig. 11 shows an example of the vehicle route information. One row (one record) in the figure is a piece of unit vehicle route information in which a route is registered for one instance of the delivery service. One piece of unit vehicle route information stores a delivery service identifier, a vehicle identifier, vehicle information, route information, and an implementation date and time.
The delivery service identifier uniquely corresponds to one delivery service. The vehicle identifier uniquely indicates the vehicle 100 used in the corresponding delivery service. The vehicle information is information on the specifications and the like of the corresponding vehicle 100, and includes, for example, the model, the maximum number of passengers, and the seat layout. The implementation date and time indicates when the corresponding delivery service is carried out. For example, it may indicate the start date and time of the delivery service. Alternatively, it may indicate both the start date and time and the end date and time of the delivery service (the end date and time may be an estimated completion date and time).
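A record of unit vehicle route information might be modeled as follows; the field names, types, and example values are assumptions for illustration, since fig. 11 itself is not reproduced here.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class UnitVehicleRouteInfo:
    """One record of unit vehicle route information (field names are illustrative)."""
    delivery_service_id: str            # delivery service identifier
    vehicle_id: str                     # vehicle identifier
    vehicle_info: dict                  # model, maximum number of passengers, seat layout, ...
    route: list                         # ordered list of via points
    start_datetime: str                 # implementation (start) date and time
    end_datetime: Optional[str] = None  # optional estimated completion date and time


record = UnitVehicleRouteInfo(
    delivery_service_id="SVC-20201126-01",
    vehicle_id="VH-100",
    vehicle_info={"model": "minivan", "capacity": 6, "layout": "2-2-2"},
    route=["stop A", "stop B", "school"],
    start_datetime="2020-11-26T07:30",
)
print(record.delivery_service_id)
```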
The driver DR can use the same vehicle 100 and register a plurality of pieces of unit vehicle route information corresponding to a plurality of delivery services having different dates and times of implementation and routes.
The seat specification information storage unit 235 stores seat specification information. The seat specification information indicates the specification (reservation) status of the seats on which fellow passengers PS sit in the vehicle 100 under the delivery service of the present embodiment.
Fig. 12 shows an example of the seat specification information. A piece of unit seat specification information is associated with each implementation identifier; that is, one piece of unit seat specification information corresponds to one delivery implementation. The unit seat specification information associates a specification status with each seat number. The seat number is a number assigned, according to a predetermined rule, to each seat of the vehicle 100 used for the corresponding delivery. The specification status indicates whether or not specification of the corresponding seat has been completed.
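The unit seat specification information can likewise be sketched as a mapping from seat number to specification status; the dictionary layout and the helper functions are illustrative assumptions.

```python
# One piece of unit seat specification information: a delivery implementation
# mapped to a per-seat specification status (True = already specified/reserved).
seat_specification = {
    "SVC-20201126-01": {1: True, 2: False, 3: False, 4: True, 5: False, 6: False},
}


def vacant_seats(implementation_id: str) -> list:
    """Seat numbers still available for the given delivery implementation."""
    seats = seat_specification[implementation_id]
    return [number for number, taken in seats.items() if not taken]


def specify_seat(implementation_id: str, seat_number: int) -> bool:
    """Mark a seat as specified; return False if it was already taken."""
    seats = seat_specification[implementation_id]
    if seats.get(seat_number):
        return False
    seats[seat_number] = True
    return True


print(vacant_seats("SVC-20201126-01"))     # [2, 3, 5, 6]
print(specify_seat("SVC-20201126-01", 3))  # True
```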
An example of processing procedures executed by the in-vehicle image providing server 200, the in-vehicle control device 120, and the protector terminal device 300 according to the present embodiment in association with seat designation will be described with reference to the flowchart of fig. 13.
First, an example of processing procedures executed by the protector terminal device 300 will be described. The protector GD operates the protector terminal device 300 on which the watching application is running and causes it to execute an inquiry for a delivery service suitable for transporting his or her protected person. At this time, the protector GD inputs conditions such as the date and time of using the delivery service and the boarding and alighting locations of the protected person into the protector terminal device 300, and then instructs execution of the inquiry. In the protector terminal device 300, the control unit 302 transmits the inquiry for a delivery service matching the input conditions to the in-vehicle image providing server 200 (step S370).
In response to the transmission of the delivery service inquiry in step S370, the in-vehicle image providing server 200 transmits the vehicle route information of the delivery service matching the specified conditions together with the unit seat specification information. Using the received vehicle route information and unit seat specification information, the control unit 302 displays on the display unit 304 the route indicated by the vehicle route information and the vacant seat status (seat reservation status) indicated by the unit seat specification information (step S372). This display may be provided as one application screen of the watching application. The route display and the vacant seat status display may be shown on the same screen, or on different screens switched between.
After confirming that the displayed route is suitable, the protector GD can operate the screen showing the vacant seat status to specify, from among the vacant seats, the seat to be used by the protected person. The control unit 302 receives the protector GD's seat specification operation (step S374). Upon receiving the seat specification operation, the control unit 302 transmits a seat reservation request for securing the specified seat (step S376).
In response to the transmission of the seat reservation request in step S376, the in-vehicle image providing server 200 sets the specified seat to specification completed (reservation completed) and transmits a reservation completion notification. The protector terminal device 300 receives the transmitted reservation completion notification (step S378).
Next, an example of processing steps executed by in-vehicle image providing server 200 will be described. In the in-vehicle image providing server 200 according to the present embodiment, the control unit 202 retrieves the vehicle route information of the delivery service suitable for the conditions such as the route specified by the query and the date and time of execution from the vehicle route information storage unit 234 in response to the query of the delivery service transmitted from the protector terminal device 300 in step S370 (step S270).
Next, the control unit 202 searches the seat specification information storage unit 235 for unit seat specification information corresponding to the same implementation identifier as the vehicle route information searched in step S270 (step S272).
The control unit 202 transmits the vehicle route information retrieved in step S270 and the unit seat designation information retrieved in step S272 to the protector terminal device 300 (step S274).
Upon receiving the seat reservation request transmitted from the protector terminal device 300 in step S376, the control unit 202 updates the unit seat specification information retrieved in step S272 so that the seat specified in the received seat reservation request is set to specification completed (reservation completed) (step S276).
The control unit 202 transmits the unit seat specification information updated in step S276 to the in-vehicle control device 120 of the corresponding vehicle 100 (step S278). The control unit 202 further transmits a reservation completion notification to the protector terminal device 300 (step S280). The reservation completion notification indicates that the seat reservation has been completed in response to the seat reservation request received this time.
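The server-side handling of a seat reservation request (steps S276 to S280) can be sketched as follows. The callables standing in for transmission to the in-vehicle control device 120 and to the protector terminal device 300, and the rejection branch for an already-taken seat, are assumptions added for illustration.

```python
def handle_seat_reservation(seat_table: dict, seat_number: int,
                            send_to_vehicle, send_to_guardian) -> None:
    """Sketch of the server-side steps S276 to S280 for one seat reservation request.

    `send_to_vehicle` and `send_to_guardian` are placeholders for the transmissions
    to the in-vehicle control device 120 and the protector terminal device 300.
    """
    if seat_table.get(seat_number):
        # Added assumption: reject a request for a seat that is already specified.
        send_to_guardian({"type": "reservation_failed", "seat": seat_number})
        return
    seat_table[seat_number] = True                                            # step S276
    send_to_vehicle({"type": "seat_update", "seats": dict(seat_table)})       # step S278
    send_to_guardian({"type": "reservation_completed", "seat": seat_number})  # step S280


seats = {1: False, 2: True}
handle_seat_reservation(seats, 1, send_to_vehicle=print, send_to_guardian=print)
```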
Next, an example of the processing procedure of the in-vehicle control device 120 will be described. The in-vehicle control device 120 of the present embodiment includes a seat specification information storage unit 1233 in the storage unit 123. The seat specification information storage unit 1233 stores the unit seat specification information transmitted from the in-vehicle image providing server 200 (step S170). The seat specification information storage unit 1233 may store a plurality of pieces of unit seat specification information, one for each of the plurality of pieces of vehicle route information registered for the vehicle 100 (that is, for each implementation of the delivery service).
In the in-vehicle control device 120, upon receiving the unit seat specification information transmitted from the in-vehicle image providing server 200 in step S278, the control unit 122 causes the seat specification information storage unit 1233 to store the received unit seat specification information. In this case, when the unit seat specification information corresponding to the same delivery service has already been stored, the control unit 122 updates the unit seat specification information corresponding to the same delivery service with the received unit seat specification information.
In this way, the vehicle 100 stores unit seat specification information indicating the seat reservation status for each delivery service. For example, by having the stored unit seat specification information displayed, the driver DR can grasp the seat reservation status of each currently scheduled delivery service.
The unit seat specification information transmitted from the in-vehicle image providing server 200 in step S278 may include a face photograph of the fellow passenger PS who is to sit in each specified seat. In that case, for example, during the delivery service, the driver DR can confirm whether each fellow passenger PS is correctly seated in the reserved seat by having the unit seat specification information, which shows the face photograph for each seat, displayed.
< fourth embodiment >
Next, a fourth embodiment will be described. In the foregoing first embodiment, the in-vehicle control device 120 transmits the captured image to the in-vehicle image providing server 200, and the in-vehicle image providing server 200 transmits the captured image received from the in-vehicle control device 120 to the protector terminal device 300.
The captured image viewed on the protector terminal device 300 is required to have at least a certain level of image quality. Therefore, in the first embodiment, the in-vehicle control device 120 needs to transmit a high-bit-rate captured image to the in-vehicle image providing server 200 via the network. When high-bit-rate data is transmitted via the network, the amount of communicated data increases, which may cause transmission congestion in the network. In addition, the communication cost incurred by the in-vehicle control device 120 also increases.
As described below, the in-vehicle image providing system according to the present embodiment is configured to reduce the amount of communication between the in-vehicle control device 120 and the in-vehicle image providing server 200 while maintaining the quality of the captured image displayed on the protector terminal device 300.
In the in-vehicle image providing system according to the present embodiment, a general fellow passenger PS-1, who is a protected person, carries a watched-person terminal device 400 when riding in the vehicle 100. The watched-person terminal device 400 may be a smartphone, a tablet terminal, or the like.
An image transfer application is installed in the watched-person terminal device 400. The watched-person terminal device 400 with the image transfer application installed connects to the in-vehicle control device 120 by short-range wireless communication, receives the captured image from the in-vehicle control device 120, and transmits the received captured image via the network to the protector terminal device 300 associated with the watched-person terminal device 400.
The method of short-range wireless communication in the present embodiment is not particularly limited, and examples thereof include Bluetooth (registered trademark), NFC (Near Field Communication), and the like.
Fig. 14 shows an example of the functional configuration of the watched-person terminal device 400. The watched-person terminal device 400 in this figure includes a communication unit 401, a control unit 402, a storage unit 403, a display unit 404, and an operation unit 405.
The communication unit 401 communicates with external devices. Specifically, the communication unit 401 communicates with the in-vehicle image providing server 200 via the network. In a state where the watched-person terminal device 400 is present in the vehicle 100, the communication unit 401 can also communicate with the in-vehicle control device 120 by short-range wireless communication.
The control unit 402 executes various kinds of control in the watched-person terminal device 400. The control unit 402 includes an image transfer unit 421. The image transfer unit 421 transfers the captured image. That is, the image transfer unit 421 acquires the captured image transmitted from the in-vehicle control device 120 via the short-range wireless communication of the communication unit 401, and transmits the acquired captured image to the protector terminal device 300 through network communication by the communication unit 401. The captured image transmitted by the image transfer unit 421 is displayed on the protector terminal device 300. The function of the image transfer unit 421 is realized by the image transfer application installed in the watched-person terminal device 400.
The storage unit 403 stores various kinds of information for the watched-person terminal device 400.
The display unit 404 performs display under the control of the control unit 402.
The operation unit 405 collectively represents the operation elements and input devices provided in the watched-person terminal device 400.
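The relay performed by the image transfer unit 421 can be sketched as follows. The two callables standing in for the short-range wireless reception and the network transmission are placeholders; the patent does not specify concrete transports or APIs.

```python
def relay_captured_images(frames_over_short_range, send_over_network) -> None:
    """Forward each frame received over short-range wireless to the protector side.

    `frames_over_short_range` is any iterable of frames arriving from the
    in-vehicle control device 120 (for example over Bluetooth), and
    `send_over_network` transmits one frame toward the protector terminal
    device 300; both are placeholders for unspecified transports.
    """
    for frame in frames_over_short_range:
        send_over_network(frame)


# Usage with dummy stand-ins:
dummy_frames = [b"frame-0001", b"frame-0002"]
relay_captured_images(dummy_frames, lambda f: print("forwarded", len(f), "bytes"))
```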
An example of processing steps executed by the in-vehicle control device 120, the in-vehicle image providing server 200, and the protector terminal device 300 according to the present embodiment in association with an occupant riding in the vehicle 100 will be described with reference to a flowchart of fig. 15.
In the present embodiment, the boarding/alighting information control unit 1223 in the in-vehicle control device 120 functions as a boarding/alighting information acquisition unit that acquires occupant boarding information and occupant alighting information as boarding/alighting information, and the image transmission control unit 1222 in the in-vehicle control device 120 functions as an image acquisition unit that acquires a captured image and an image providing unit that transmits the captured image so that the captured image is viewed by the protector terminal device 300. In the present embodiment, the image providing unit 223 in the in-vehicle image providing server 200 may be omitted.
The processing of steps S1100 to S1106 performed by the in-vehicle control device 120 may be the same as steps S100 to S106 in fig. 8.
Next, the image transmission control unit 1222 determines whether or not the occupant whose riding is detected this time is a normal passenger PS-1 based on the result of the occupant recognition in step S1106 (step S1108).
In the case of a normal fellow passenger PS-1 (yes in step S1108), the image transmission control unit 1222 establishes a short-range wireless communication connection with the watched-person terminal device 400 of the normal fellow passenger PS-1 whose riding was detected this time, and then starts transmission of the captured image to the watched-person terminal device 400 at the connection destination (step S1110). At this time, the image transmission control unit 1222 transmits the captured image at a predetermined high resolution and a predetermined high bit rate, that is, at high quality.
After the processing of step S1110, the image transmission control unit 1222 determines whether or not the occupant who has detected the riding is a normal passenger PS-1 who has first ridden the vehicle 100 (step S1112).
When the occupant is the first normal fellow passenger PS-1 to ride in the vehicle 100 (yes in step S1112), the image transmission control unit 1222 starts transmission of the captured image to the in-vehicle image providing server 200 (step S1114). At this time, the image transmission control unit 1222 transmits the captured image at a predetermined resolution lower than that of the captured image transmitted in step S1110 and at a predetermined bit rate lower than that of the captured image transmitted in step S1110.
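The selection of different transmission qualities for the two paths can be sketched as follows; the concrete resolutions and bit rates are illustrative assumptions, since the text only requires that the image sent to the server have a lower resolution and bit rate than the image sent over short-range wireless communication.

```python
from dataclasses import dataclass


@dataclass
class StreamProfile:
    resolution: tuple      # (width, height)
    bitrate_kbps: int


# Illustrative values only; the relative relationship (higher quality over the
# short-range path, lower quality to the server) is what matters here.
PROFILE_SHORT_RANGE = StreamProfile((1920, 1080), 8000)  # to watched-person terminal device 400
PROFILE_NETWORK = StreamProfile((640, 360), 500)         # to in-vehicle image providing server 200


def profile_for(destination: str) -> StreamProfile:
    """Choose the encoding profile by transmission path."""
    if destination == "watched_person_terminal":
        return PROFILE_SHORT_RANGE
    if destination == "image_providing_server":
        return PROFILE_NETWORK
    raise ValueError("unknown destination: " + destination)


print(profile_for("image_providing_server"))
```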
After the process of step S1114, or when it is determined in step S1108 that the occupant whose riding was detected this time is not a normal fellow passenger PS-1, or when it is determined in step S1112 that the occupant is not the first normal fellow passenger PS-1 to ride in the vehicle 100, the boarding/alighting information control unit 1223 executes the following process. That is, the boarding/alighting information control unit 1223 transmits the occupant riding information about the occupant whose riding was detected this time in step S1100 to the in-vehicle image providing server 200 (step S1116). The boarding/alighting information control unit 1223 may transmit the occupant riding information in step S1116 at the same bit rate as the captured image whose transmission was started in step S1114, or at a different bit rate.
Next, an example of processing steps executed by the in-vehicle image providing server 200 will be described. The processes of steps S1200 to S1208 executed by the in-vehicle image providing server 200 are the same as steps S200 to S208 of fig. 8. However, the captured image stored in the captured image storage unit 233 in step S1202 has a lower resolution and quality than the captured image transmitted to the watched-person terminal device 400.
Next, an example of processing procedures executed by the watched-person terminal device 400 will be described. In the watched-person terminal device 400, the image transfer unit 421 waits for the start of reception of the captured image whose transmission was started in step S1110 (no in step S1400).
When reception of the captured image starts (yes in step S1400), the image transfer unit 421 starts transferring the received captured image to the protector terminal device 300 designated as the transfer destination of the captured image (step S1402).
For example, information on the transfer destination of the captured image is registered (stored) in the watched-person terminal device 400 in which the image transfer application is installed. The transfer destination information may be, for example, the application ID of the watching application on the protector terminal device 300 that is the transfer destination. The protector terminal device 300 serving as the transfer destination is the one carried by the protector GD whose protected person is the general fellow passenger PS-1 carrying the watched-person terminal device 400.
The captured image transmitted from the watched-person terminal device 400 to the protector terminal device 300 has, for example, the same resolution as the captured image transmitted from the in-vehicle control device 120, and is transmitted at a predetermined high bit rate.
Next, an example of processing procedures of the protector terminal device 300 will be described. In the protector terminal device 300, the display control unit 322 waits for the start of reception of the captured image transmitted from the watched-person terminal device 400 in step S1402 (no in step S1300).
When the reception of the captured image is started (yes in step S1300), the display control unit 322 causes the display unit 304 to start the display of the received captured image (step S1302). In this case, the high-quality captured image with high resolution is transmitted to the protector terminal device 300. This enables the protector GD to view a high-quality captured image.
In the watched-person terminal device 400, the image transfer unit 421 may store the received captured image in the storage unit 403.
When starting transmission of the captured image, the image transfer unit 421 may transmit a viewable notification, indicating that the captured image can be viewed, to the protector terminal device 300 that is the transmission destination. The protector terminal device 300 can transmit a viewing request to the watched-person terminal device 400, for example, in response to an operation by the protector GD who has received the viewable notification. The watched-person terminal device 400 may then start transmission of the captured image in response to reception of the viewing request.
In addition, the protector terminal device 300 may store the received captured image in the storage unit 303.
Through the processing in this figure, the in-vehicle control device 120 delivers the high-quality captured image displayed on the protector terminal device 300 via a path that goes from the watched-person terminal device 400, connected by short-range wireless communication, over the network to the protector terminal device 300. In this case, since the captured image is transmitted to the watched-person terminal device 400 not over the network but by short-range wireless communication, the in-vehicle control device 120 does not burden network transmission even at a high bit rate. In addition, since short-range wireless communication is used, no communication fee is incurred for transmitting the captured image from the in-vehicle control device 120. Meanwhile, the captured image transmitted from the in-vehicle control device 120 to the in-vehicle image providing server 200 via the network has a lower quality than the captured image transmitted to the watched-person terminal device 400, which reduces the communication volume and communication cost.
In the present embodiment, when the captured image does not need to be stored in the in-vehicle image providing server 200, the transmission of the captured image from the in-vehicle control device 120 to the in-vehicle image providing server 200 may be omitted.
An example of processing steps executed by the in-vehicle control device 120, the in-vehicle image providing server 200, and the protector terminal device 300 according to the present embodiment in response to the passenger getting off the vehicle 100 will be described with reference to the flowchart of fig. 16.
In the in-vehicle control device 120, the boarding/alighting detection unit 1221 waits until it detects an occupant getting off (no in step S1120).
When the alighting of the occupant is detected (yes in step S1120), the following processing is performed.
First, the image transmission control unit 1222 determines whether or not, as a result of the occupant who got off being a normal fellow passenger PS-1, there is a watched-person terminal device 400 whose short-range wireless communication has been disconnected (step S1122).
When there is a watched-person terminal device 400 whose short-range wireless communication has been disconnected (yes in step S1122), the image transmission control unit 1222 ends the transmission of the captured image to that watched-person terminal device 400 (step S1124).
The boarding/alighting detection unit 1221 then identifies the occupant whose getting-off was detected (step S1126). The occupant identification here may be the same as in step S132 of fig. 9.
The boarding/alighting information control unit 1223 transmits, to the in-vehicle image providing server 200, the occupant getting-off information about the occupant whose getting-off was detected this time in step S1120 (step S1128).
The occupant getting-off information includes information on the occupant category identified in step S1126. As with the occupant getting-off information transmitted in step S134 of fig. 9, it includes the face image of the occupant whose getting-off was detected this time, extracted from the captured image before the getting-off. Regardless of the occupant category, it also includes the time at which the occupant got off (getting-off time) and the position of the vehicle 100 when the getting-off was detected (getting-off position).
The processing in steps S1130 to S1136 may be the same as in steps S136 to S142 in fig. 9.
Next, an example of processing steps executed by in-vehicle image providing server 200 will be described. The processing in steps S1220 to S1228 may be the same as the processing in steps S230 to S238 in fig. 9.
After the process of step S1228, or when it is determined that the occupant alighting information has not been received (no at step S1220), the image acquisition unit 221 determines whether or not reception of the captured image transmitted from the in-vehicle control device 120 is finished (step S1230). The reception of the captured image by in-vehicle image providing server 200 is terminated in response to in-vehicle control device 120 terminating the transmission of the captured image in step S1132. In the case where the reception of the captured image is not ended (no in step S1230), the process returns to step S1220.
When the reception of the captured image is completed (yes in step S1230), the image acquisition unit 221 ends the storage of the captured image started in step S1202 (step S1232).
Next, an example of processing procedures executed by the watched-person terminal device 400 will be described. In the watched-person terminal device 400, the image transfer unit 421 waits for the end of reception of the captured image transmitted from the in-vehicle control device 120 (no in step S1420). The reception of the captured image by the watched-person terminal device 400 ends in response to the in-vehicle control device 120 ending transmission of the captured image in step S1124.
When the reception of the captured image ends (yes in step S1420), the watched-person terminal device 400 ends the transfer of the captured image to the protector terminal device 300 (step S1422).
Next, an example of processing procedures executed by the protector terminal device 300 will be described. The processing of steps S1320 and S1322 performed by the protector terminal device 300 in this figure may be the same as steps S330 and S332 in fig. 9.
In the protector terminal device 300, the display control unit 322 waits for the end of reception of the captured image transmitted from the watched-person terminal device 400 (no in step S1324).
When the reception of the captured image is finished (yes in step S1324), the display control unit 322 finishes the display of the current captured image (step S1326).
< fifth embodiment >
Next, a fifth embodiment will be described. In each of the above embodiments, the number of the imaging units 110 for imaging the inside of the vehicle 100 is 1. In this case, one imaging unit 110 is provided so that an occupant can be imaged regardless of which seat among all the seats of the vehicle 100 is seated. In contrast, in the present embodiment, a plurality of imaging units 110 for imaging the interior of the vehicle 100 are provided.
Fig. 17 shows an example of how the imaging units 110 are installed when three imaging units 110 (110-1, 110-2, 110-3) are provided in the present embodiment. The figure shows an example layout, in plan view, of the seats 101 (101-1 to 101-6) in the vehicle 100. From the front of the vehicle 100, the seats are arranged as the seat 101-1 as the driver seat, the seat 101-2 as the front passenger seat, the right and left seats 101-3 and 101-4 in the 2nd row, and the right and left seats 101-5 and 101-6 in the 3rd row.
For this layout of the seats 101, the three imaging units 110 are provided as follows. The imaging unit 110-1 is provided to image the occupants seated in the 1st-row seats 101-1 and 101-2. The imaging unit 110-2 is provided to image the occupants seated in the 2nd-row seats 101-3 and 101-4. The imaging unit 110-3 is provided to image the occupants seated in the 3rd-row seats 101-5 and 101-6. By providing a plurality of imaging units 110 in this way, with each imaging unit 110 imaging the occupants of different seats, the occupants appear more clearly in the captured images.
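The association between seats and the corresponding imaging unit can be sketched as a simple lookup table; the seat numbering and identifiers used here are illustrative assumptions based on the layout described above.

```python
# Illustrative mapping from seat number to the imaging unit covering that row,
# following the seat layout described above (seats 101-1 to 101-6).
SEAT_TO_IMAGING_UNIT = {
    1: "110-1", 2: "110-1",   # 1st row: driver seat and front passenger seat
    3: "110-2", 4: "110-2",   # 2nd row
    5: "110-3", 6: "110-3",   # 3rd row
}


def corresponding_imaging_unit(seat_number: int) -> str:
    """Return the imaging unit that captures the occupant of the given seat."""
    return SEAT_TO_IMAGING_UNIT[seat_number]


print(corresponding_imaging_unit(4))  # 110-2
```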
With reference to the flowchart of fig. 18, an example of processing steps executed by the in-vehicle control device 120, the in-vehicle image providing server 200, and the protector terminal device 300 according to the riding of the occupant in the vehicle 100 will be described.
The process in this figure is an example in which, as in the fourth embodiment, the watched-person terminal device 400 transfers the captured image transmitted from the in-vehicle control device 120 to the protector terminal device 300.
In the in-vehicle control device 120, the boarding/alighting detection unit 1221 waits until it detects an occupant riding in the vehicle (no in step S1140).
When an occupant's riding is detected (yes in step S1140), the image transmission control unit 1222 determines whether or not imaging by the imaging unit (corresponding imaging unit) 110 that images the seat 101 on which the occupant whose riding was detected this time is seated is already being performed (step S1142). When imaging by the corresponding imaging unit 110 has not yet been started (no in step S1142), the image transmission control unit 1222 starts imaging by the corresponding imaging unit 110 (step S1144). The image transmission control unit 1222 may also start control for storing the captured image obtained by the corresponding imaging unit 110 in the captured image storage unit 1232.
After the process of step S1144, or when the image capturing by the corresponding image capturing unit 110 is being executed (yes in step S1142), the boarding/alighting detection unit 1221 recognizes the occupant whose boarding is detected this time (step S1146).
Next, the image transmission control unit 1222 determines whether or not the passenger who has detected the riding is the normal passenger PS-1 based on the result of the passenger recognition in step S1146 (step S1148).
In the case of a normal fellow passenger PS-1 (yes in step S1148), the image transmission control unit 1222 establishes a short-range wireless communication connection with the watched-person terminal device 400 of the normal fellow passenger PS-1 whose riding was detected this time, and then starts transmission of the captured image to the watched-person terminal device 400 at the connection destination (step S1150). At this time, the image transmission control unit 1222 transmits the captured image at a predetermined high resolution and a predetermined high bit rate, that is, at high quality.
After the processing of step S1150, the image transmission control unit 1222 determines whether or not the occupant who has detected the riding this time is the normal passenger PS-1 that was first captured by the corresponding imaging unit 110 (step S1152).
When the occupant is the first normal fellow passenger PS-1 to be captured by the corresponding imaging unit 110 (yes in step S1152), the image transmission control unit 1222 starts transmission of the captured image of the corresponding imaging unit 110 to the in-vehicle image providing server 200 (step S1154). At this time, the image transmission control unit 1222 transmits the captured image at a predetermined resolution lower than that of the captured image transmitted in step S1150 and at a predetermined bit rate lower than that of the captured image transmitted in step S1150.
After the process of step S1154, or when it is determined in step S1148 that the occupant whose riding was detected this time is not a normal fellow passenger PS-1, or when it is determined in step S1152 that the occupant is not the first normal fellow passenger PS-1 to be captured by the corresponding imaging unit 110, the boarding/alighting information control unit 1223 executes the following process. That is, the boarding/alighting information control unit 1223 transmits the occupant riding information about the occupant whose riding was detected this time in step S1140 to the in-vehicle image providing server 200 (step S1156). The boarding/alighting information control unit 1223 may transmit the occupant riding information in step S1156 at the same bit rate as the captured image whose transmission was started in step S1154, or at a different bit rate.
Next, an example of processing steps executed by the in-vehicle image providing server 200 will be described. The processes of steps S1240 to S1248 executed by the in-vehicle image providing server 200 according to the present embodiment are the same as steps S1200 to S1208 in fig. 15. However, in the present embodiment, transmission of the captured images of the imaging units 110-1, 110-2, and 110-3 is started at different timings. Therefore, each time transmission of the captured image of one of the imaging units 110-1, 110-2, and 110-3 starts, the image acquisition unit 221 determines in step S1240 that reception of a captured image has started and starts storing that captured image in step S1242, so that the start of storage is performed three times in total.
The processing of steps S1440 and S1442 executed by the watched-person terminal device 400 according to the present embodiment is the same as steps S1400 and S1402 in fig. 15. The processing of steps S1340 and S1342 executed by the protector terminal device 300 according to the present embodiment is the same as steps S1300 and S1302 in fig. 15.
An example of processing steps executed by the in-vehicle control device 120, the in-vehicle image providing server 200, and the protector terminal device 300 according to the present embodiment in response to the passenger getting off the vehicle 100 will be described with reference to the flowchart of fig. 19.
First, an example of the processing procedure of the in-vehicle control device 120 will be described. The processes in steps S1170 to S1178 executed by the in-vehicle control device 120 according to the present embodiment may be the same as those in steps S1120 to S1128 in fig. 16.
After the process of step S1178, the image transmission control unit 1222 determines whether or not, because of this occupant's getting off, the imaging unit (corresponding imaging unit) 110 that was imaging the occupant whose getting-off was detected this time has shifted to a state in which no normal fellow passenger PS-1 is being imaged (no normal fellow passenger PS-1 is detected from the captured image) (step S1180).
When the corresponding imaging unit has shifted to a state in which no normal fellow passenger PS-1 is being captured (yes in step S1180), the image transmission control unit 1222 ends transmission of the captured image of the corresponding imaging unit to the in-vehicle image providing server 200 (step S1182).
After the process of step S1182, or when it is determined in step S1180 that a normal fellow passenger PS-1 is still being captured because, for example, another normal fellow passenger PS-1 remains in a seat imaged by the corresponding imaging unit 110 (no in step S1180), the image transmission control unit 1222 executes the following process. That is, the image transmission control unit 1222 determines whether or not the corresponding imaging unit 110 has shifted to a state in which no occupant is being imaged (no occupant is detected from the captured image) (step S1184).
When the corresponding image capturing unit 110 is in a state of still capturing the image of the passenger (no in step S1184), the process returns to step S1170.
When the corresponding imaging unit 110 has shifted to a state in which the passenger is not imaged (yes in step S1184), the image transmission control unit 1222 terminates the imaging by the corresponding imaging unit 110 (step S1186).
After the process of step S1186, the image transmission control unit 1222 determines whether the alighting detected this time corresponds to the last passenger alighting (step S1188). If the detected alighting does not correspond to the last getting-off of the passenger (if not all the passengers get-off) (no at step S1188), the process returns to step S1170.
In the case where the alighting detected this time corresponds to the alighting of the last occupant (yes in step S1188), the processing of the figure is ended.
Next, an example of processing procedures executed by the in-vehicle image providing server 200 according to the present embodiment will be described. In this figure, the processes of steps S1270 to S1282 executed by the in-vehicle image providing server 200 according to the present embodiment are the same as steps S1220 to S1232 in fig. 16. However, in step S1282, with a plurality of captured images corresponding to the respective imaging units 110 being stored, the image acquisition unit 221 ends the storage only of the captured image whose reception was determined in step S1280 to have ended.
Next, the image acquisition unit 221 determines whether or not the storage of all the captured images has ended as a result of the most recent step S1282 (step S1284). If the storage of all the captured images has not ended (no in step S1284), the process returns to step S1270. When the storage of all the captured images has ended (yes in step S1284), the processing in this figure ends.
Next, an example of processing procedures executed by the attendee terminal device 400 will be described. The processing of steps S1470 and S1472 executed by the attendee terminal device 400 in this figure may be the same as that of steps S1420 and S1422 in fig. 16.
Next, an example of processing procedures executed by the protector terminal device 300 will be described. The processing in steps S1370 and S1372 executed by the protector terminal device 300 in this figure may be the same as steps S1320 and S1322 in fig. 16. The processing of steps S1374 and S1376 executed by the protector terminal device 300 in this figure may be the same as steps S1324 and S1326 in fig. 16.
In addition, in the configuration of the first embodiment, for example, the captured image stored (saved) in the in-vehicle image providing server 200 may be made viewable by being downloaded to the protector terminal device 300 after the protected person gets off the vehicle. Alternatively, the protector terminal device 300 may store the captured image transmitted from the in-vehicle image providing server 200, and the protector GD may view the captured image by playing back the stored captured image after the protected person gets off the vehicle. Further, in the fourth and fifth embodiments, the captured image received from the in-vehicle control device 120 may be stored in the watched-person terminal device 400, and after the protector GD's own protected person gets off the vehicle, the captured image may be transmitted from the watched-person terminal device 400 to the protector terminal device 300 so that it can be viewed on the protector terminal device 300.
In the above embodiments, in a state where, for example, at least one of the driver DR and the close fellow passenger PS-2 is riding in the vehicle but no normal fellow passenger PS-1 is riding, so that special watching is not needed (a watching-unnecessary state), the imaging unit 110 performs imaging but transmission of the captured image to the in-vehicle image providing server 200 is not started. However, even in such a watching-unnecessary state, the imaging unit 110 may start imaging and transmission of the captured image to the in-vehicle image providing server 200 at the timing when at least one of the driver DR and the close fellow passenger PS-2 gets into the vehicle. In that case, for example, in accordance with the first embodiment, the captured image may be transmitted to the in-vehicle image providing server 200 at a low bit rate with reduced image quality while in the watching-unnecessary state. Then, when the watching-unnecessary state is released because a normal fellow passenger PS-1 rides in the vehicle, the captured image can be transmitted to the in-vehicle image providing server 200 at a high bit rate with improved image quality.
The structures of the above embodiments may be appropriately combined. For example, the fourth or fifth embodiment may be combined with the configuration of deleting the captured image according to the second embodiment, the configuration of reserving the seat according to the third embodiment, and the like. For example, as in the first embodiment, a combination of a configuration in which the in-vehicle image providing server 200 transmits a captured image to the protector terminal device 300 and a configuration in which the vehicle 100 is provided with a plurality of imaging units 110 may be used.
While the present invention has been described with reference to the embodiments, the present invention is not limited to the embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.

Claims (10)

1. A mobile in-vivo image providing system, wherein,
the mobile in-vivo image providing system includes:
an image acquisition unit that acquires an image obtained by imaging the inside of the moving body by the imaging unit;
an entering/leaving information acquiring unit that acquires entering/leaving information for each passenger detected as to entering/leaving of the moving object; and
and an image providing unit that transmits, to an external terminal device associated with a certain passenger who gets in the moving body, an image corresponding to a period during which the passenger gets in the moving body, among the images acquired by the image acquiring unit, so as to be viewable, and does not transmit, to the external terminal device, an image corresponding to a period after the passenger gets off the vehicle, so as to be viewable, based on the getting-on/off information acquired by the getting-on/off information acquiring unit.
2. The mobile in-vivo image providing system according to claim 1,
the image acquisition unit causes the imaging unit to start imaging in response to at least one occupant riding the vehicle,
the image providing unit starts transmission of the image in response to a passenger riding in the vehicle, the passenger having a correspondence relationship with the external terminal device.
3. The mobile in-vivo image providing system according to claim 1 or 2,
the image providing unit ends the transmission of the image in response to the passenger getting off the vehicle in association with the external terminal device,
the image acquisition unit terminates the image capturing in response to all the occupants getting off the moving object.
4. The mobile in-vivo image providing system according to claim 1 or 2,
the mobile in-vivo image providing system includes a storage unit for storing the image,
the image providing unit deletes the image stored in the storage unit in response to receiving, from each of the external terminal devices to which that image has been transmitted, a notification indicating approval of the deletion.
5. The mobile in-vivo image providing system according to claim 1 or 2,
the image providing unit is provided in the moving body, and transmits the image to be viewed on the external terminal device to the external terminal device via a passenger terminal device carried by a passenger present in the moving body.
6. The mobile in-vivo image providing system according to claim 5,
the image providing unit makes the quality of the image transmitted for viewing on the external terminal device different from the quality of an image transmitted to a predetermined device different from the passenger terminal device.
7. The mobile in-vivo image providing system according to claim 1 or 2,
the image providing unit transmits, from among a plurality of images obtained by a plurality of imaging units provided so as to image different ranges in the moving body, an image obtained by imaging the passenger corresponding to the external terminal device.
8. A mobile in-vivo image providing method, wherein,
the moving in-vivo image providing method causes a computer to perform:
acquiring an image obtained by imaging the inside of a moving body by an imaging unit;
acquiring boarding/alighting information detected for each occupant boarding or alighting from the moving body; and
based on the acquired boarding/alighting information, transmitting, in a viewable manner, to an external terminal device associated with an occupant who has boarded the moving body, an image corresponding to a period during which the occupant is aboard the moving body from among the acquired images.
9. A storage medium storing a program, wherein,
the program causes a computer to perform the following processing:
acquiring an image obtained by imaging the inside of a moving body by an imaging unit;
acquiring boarding/alighting information detected for each occupant boarding or alighting from the moving body; and
based on the acquired boarding/alighting information, transmitting, in a viewable manner, to an external terminal device associated with an occupant who has boarded the moving body, an image corresponding to a period during which the occupant is aboard the moving body from among the acquired images.
10. A mobile in-vivo image providing server, wherein,
the mobile in-vivo image providing server is provided with:
an image acquisition unit that acquires an image obtained by imaging the inside of the moving body by the imaging unit;
a boarding/alighting information acquiring unit that acquires boarding/alighting information of an occupant in the moving body; and
an image providing unit that distributes the image to an external terminal device based on the boarding/alighting information and user registration information,
the user registration information includes information relating to the occupant and to the external terminal device, and
the image providing unit distributes, to the external terminal device, the portion of the image corresponding to the period during which the occupant is aboard the moving body.
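The selection logic common to claims 1, 8 and 10, namely that only the portion of the image corresponding to the period during which an occupant is aboard is made viewable on the associated external terminal device, can be pictured with the following minimal sketch. It is an illustration only; the data structures and function names (Frame, BoardingRecord, frames_for_occupant) are assumptions, not elements disclosed in the patent.

```python
# Hypothetical sketch of the image selection recited in claims 1, 8 and 10 (illustrative only).

from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class Frame:
    timestamp: datetime
    data: bytes


@dataclass
class BoardingRecord:
    occupant_id: str
    boarded_at: datetime
    alighted_at: Optional[datetime]  # None while the occupant is still aboard


def frames_for_occupant(frames, records, occupant_id, now):
    """Select the frames falling within the occupant's boarding periods.

    Frames captured after the occupant has alighted are excluded, matching the
    "does not transmit ... after the passenger has alighted" limitation of claim 1.
    """
    periods = [(r.boarded_at, r.alighted_at or now)
               for r in records
               if r.occupant_id == occupant_id]
    return [f for f in frames
            if any(start <= f.timestamp <= end for start, end in periods)]
```

Under this sketch, a server as in claim 10 would look up the external terminal device associated with the occupant from the user registration information and distribute only the frames returned by this selection.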
CN202011351579.8A 2019-12-25 2020-11-26 Mobile in-vivo image providing system, mobile in-vivo image providing method, server and storage medium Pending CN113034727A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-233902 2019-12-25
JP2019233902A JP7083806B2 (en) 2019-12-25 2019-12-25 Moving body image providing system, moving body image providing method, program, and moving body image providing server

Publications (1)

Publication Number Publication Date
CN113034727A (en) 2021-06-25

Family

ID=76459205

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011351579.8A Pending CN113034727A (en) 2019-12-25 2020-11-26 Mobile in-vivo image providing system, mobile in-vivo image providing method, server and storage medium

Country Status (2)

Country Link
JP (1) JP7083806B2 (en)
CN (1) CN113034727A (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004040221A (en) 2002-06-28 2004-02-05 Minolta Co Ltd Image distribution system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040003411A1 (en) * 2002-06-28 2004-01-01 Minolta Co., Ltd. Image service system
JP2004094600A (en) * 2002-08-30 2004-03-25 Ocean Medical:Kk Mobile nursery system
JP2007087279A (en) * 2005-09-26 2007-04-05 Denso Corp School bus communication system
CN102710929A (en) * 2012-05-10 2012-10-03 陕西科技大学 Monitoring system for preventing students from falling from building
CN103714592A (en) * 2013-12-24 2014-04-09 深圳市天天上网络科技有限公司 Method and system for monitoring boarding of school bus
CN206181246U (en) * 2016-09-28 2017-05-17 毅思产品开发有限公司 Developments propelling movement formula video monitor system
CN207264160U (en) * 2017-01-19 2018-04-20 福建安贝通科技有限公司 Campus Security monitoring linkage identifying system
JP2018169942A (en) * 2017-03-30 2018-11-01 株式会社日本総合研究所 Vehicle management server, in-vehicle terminal, watching method, and program in pickup-with-watching-service system
CN107634991A (en) * 2017-08-31 2018-01-26 北京豪络科技有限公司 A kind of travel system for personal safe protection

Also Published As

Publication number Publication date
JP2021103834A (en) 2021-07-15
JP7083806B2 (en) 2022-06-13

Similar Documents

Publication Publication Date Title
US20190003848A1 (en) Facility-information guidance device, server device, and facility-information guidance method
US11094184B2 (en) Forgetting-to-carry prevention assistance method, terminal device, and forgetting-to-carry prevention assistance system
JP2024045578A (en) Information processing device, terminal device, passenger control method, passenger acceptance method, passenger request method, and program
JP2018169942A (en) Vehicle management server, in-vehicle terminal, watching method, and program in pickup-with-watching-service system
US20130137476A1 (en) Terminal apparatus
WO2016183810A1 (en) Method and apparatus for facilitating automatic arrangement on user's journey
JP2017191371A (en) Automobile, and program for automobile
JP6076595B2 (en) Reporting system
US20190376803A1 (en) Information processing apparatus, information processing method, and non-transitory storage medium
JP2023539250A (en) Lost item notification methods, devices, electronic devices, storage media and computer programs
KR20150133953A (en) Brokerage system for taxi carpool
CN110689715B (en) Information processing apparatus, information processing method, and non-transitory storage medium
US20190283532A1 (en) Control device and control method
JP7449823B2 (en) Rider support device, passenger support method, and program
CN113034727A (en) Mobile in-vivo image providing system, mobile in-vivo image providing method, server and storage medium
US20220044268A1 (en) Information processing apparatus, information processing method, and non-transitory storage medium
JP2013200738A (en) Priority seat use support system, mobile communication terminal, passenger support device and priority seat use support method
JP2021157276A (en) Boarding support control device, boarding support system, boarding support control method, and program
CN212243270U (en) Vehicle control device
JP7036690B2 (en) Information processing equipment, information processing methods and information processing programs
US20200011680A1 (en) Information processing apparatus, information processing method and non-transitory storage medium
JP2021051545A (en) Information processing device, information processing method, and information processing program
US10338886B2 (en) Information output system and information output method
CN110852468A (en) Information processing apparatus, information processing method, and non-transitory storage medium
JP2007087279A (en) School bus communication system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210625